Abstract: This work aims to create a system that integrates people with motor disabilities into an intelligent room prototype, allowing them to operate different applications inside a room through a Brain-Computer Interface (BCI) based on biological signals. Electroencephalographic (EEG) signals are used as control signals. The applications in the room are: light control, bed inclination, and requesting assistance with a buzzer. The signals are captured using the Ultracortex Mark IV acquisition system from the company OpenBCI, connected to self-developed software. The gestures defined for the analysis are: left eye blink, right eye blink, eyebrow raise, and no action. The data is filtered in the alpha band (8-13 Hz) and used to train an artificial intelligence model based on one-dimensional convolutional neural networks. For each gesture, a control signal is generated and sent via Bluetooth Low Energy (BLE) to an embedded system in charge of the room prototype functions. In the first phase, the volunteers did not have any motor disabilities. The average accuracy in the training model was 80% for eye blinking and 85% for eyebrow raising. Real-time classification achieved an accuracy of 78.75% using five-second windows.
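The alpha-band filtering step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate of 250 Hz, the 4th-order Butterworth design, and the zero-phase `filtfilt` call are all assumptions (the abstract only specifies the 8-13 Hz band and five-second windows); the synthetic signal stands in for real EEG data.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def alpha_bandpass(eeg, fs=250.0, low=8.0, high=13.0, order=4):
    """Band-pass filter an EEG window to the alpha band (8-13 Hz).

    fs=250 Hz is an assumed sampling rate (common for OpenBCI boards);
    the filter design (4th-order Butterworth, zero-phase filtfilt) is
    an illustrative choice, not the one reported in the work.
    """
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

# Synthetic five-second window (matching the window length in the
# abstract): a 10 Hz alpha component plus 50 Hz mains-like noise.
fs = 250.0
t = np.arange(0, 5.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 50 * t)
filtered = alpha_bandpass(signal, fs)
```

After filtering, the 50 Hz component is strongly attenuated and the 10 Hz alpha component dominates; windows like `filtered` would then be fed to the one-dimensional convolutional classifier.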