Validating Deep Neural Networks for Online Decoding of Motor Imagery Movements from EEG Signals

Zied Tayeb et al. Sensors (Basel). 2019 Jan 8;19(1):210. doi: 10.3390/s19010210.

Abstract

Non-invasive, electroencephalography (EEG)-based brain-computer interfaces (BCIs) for motor imagery translate the subject's motor intention into control signals by classifying the EEG patterns evoked by different imagination tasks, e.g., hand movements. This type of BCI has been widely studied and used as an alternative mode of communication and environmental control for disabled patients, such as those suffering from a brainstem stroke or a spinal cord injury (SCI). Notwithstanding the success of traditional machine learning methods in classifying EEG signals, these methods still rely on hand-crafted features. The extraction of such features is a difficult task due to the high non-stationarity of EEG signals, which is a major cause of the stagnating progress in classification performance. Remarkable advances in deep learning methods allow end-to-end learning without any feature engineering, which could benefit BCI motor imagery applications. We developed three deep learning models for decoding motor imagery movements directly from raw EEG signals without any manual feature engineering: (1) a long short-term memory network (LSTM); (2) a spectrogram-based convolutional neural network (pCNN); and (3) a recurrent convolutional neural network (RCNN). Results were evaluated on our own publicly available EEG dataset collected from 20 subjects and on the existing 2b EEG dataset from "BCI Competition IV". Overall, the deep learning models achieved better classification performance than state-of-the-art machine learning techniques, which could chart a route ahead for developing new, robust techniques for EEG signal decoding. We underpin this point by demonstrating the successful real-time control of a robotic arm using our CNN-based BCI.
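
To make the end-to-end idea concrete, below is a minimal sketch of the first model type, an LSTM classifier fed with raw EEG; the input shape, hidden size, and training settings are illustrative assumptions (written in Keras as an example framework), not the authors' exact architecture or hyperparameters.

```python
# Minimal sketch (not the authors' exact model): an LSTM that classifies raw EEG
# windows of shape (timesteps, electrodes) into K motor imagery classes.
import numpy as np
from tensorflow.keras import layers, models

T, E, K = 500, 2, 2   # assumed: 500 timesteps, 2 electrodes (e.g., C3/C4), 2 classes

model = models.Sequential([
    layers.Input(shape=(T, E)),
    layers.LSTM(64),                           # assumed hidden size
    layers.Dropout(0.5),                       # regularization against overfitting
    layers.Dense(K, activation="softmax"),     # class probabilities
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# X: raw EEG of shape (n_trials, T, E); y: one-hot labels of shape (n_trials, K)
# model.fit(X, y, epochs=50, batch_size=32, validation_split=0.2)
```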

Keywords: Brain-Computer Interfaces; Deep Learning; electroencephalography (EEG); long short-term memory (LSTM); recurrent convolutional neural network (RCNN); spectrogram-based convolutional neural network model (pCNN).


Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Figures

Figure 1
Experimental setup and an example of a motor imagery-electroencephalography (MI-EEG) recording session.
Figure 2
Example of the spectrograms generated from the C3 and C4 electrodes during left-hand (class 1) and right-hand (class 2) movement imagination.
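
To illustrate how such inputs can be produced, here is a hedged sketch of spectrogram generation for a single channel; the sampling rate, window length, overlap, and the mu/beta band selection are assumptions rather than the paper's exact parameters.

```python
# Illustrative sketch: turn one EEG channel (e.g., C3) into a spectrogram,
# keeping the mu/beta band that is most informative for motor imagery.
import numpy as np
from scipy.signal import spectrogram

fs = 250.0                               # assumed sampling rate in Hz
eeg_c3 = np.random.randn(int(5 * fs))    # placeholder for one 5-second trial from C3

f, t, Sxx = spectrogram(eeg_c3, fs=fs, nperseg=128, noverlap=64)
band = Sxx[(f >= 8) & (f <= 30), :]      # assumed 8-30 Hz (mu/beta) band
log_spec = np.log1p(band)                # log scaling gives a CNN-friendly image
```
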
Figure 3
The spectrogram-based convolutional neural network (pCNN) model's architecture, where E is the number of electrodes, T is the number of timesteps, and K is the number of classes.
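
The caption's notation suggests the following compact sketch of a CNN that applies a temporal convolution followed by a spatial convolution across electrodes; the layer sizes and kernel shapes are assumptions for illustration, not the authors' exact pCNN.

```python
# Hedged sketch in the spirit of Figure 3: an E x T input mapped to K classes.
from tensorflow.keras import layers, models

E, T, K = 2, 500, 2   # assumed values for illustration

pcnn = models.Sequential([
    layers.Input(shape=(E, T, 1)),                  # input treated as one image plane
    layers.Conv2D(16, (1, 32), activation="relu"),  # temporal convolution over timesteps
    layers.Conv2D(32, (E, 1), activation="relu"),   # spatial convolution over electrodes
    layers.MaxPooling2D((1, 4)),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(K, activation="softmax"),          # class probabilities, e.g., P(L), P(R)
])
```
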
Figure 4
Training and validation loss of the pCNN model. The blue and green lines represent the average of the 5 folds for training and validation, respectively.
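
The 5-fold averaging behind this figure can be sketched as follows; build_model() is a hypothetical factory returning a compiled Keras model with an accuracy metric, and the epoch count is an assumption.

```python
# Sketch of a 5-fold evaluation loop over trials X with one-hot labels y.
import numpy as np
from sklearn.model_selection import KFold

def evaluate_5fold(X, y, build_model, epochs=50):
    accs = []
    for train_idx, val_idx in KFold(n_splits=5, shuffle=True).split(X):
        model = build_model()                                 # fresh model per fold
        model.fit(X[train_idx], y[train_idx], epochs=epochs, verbose=0)
        _, acc = model.evaluate(X[val_idx], y[val_idx], verbose=0)
        accs.append(acc)
    return float(np.mean(accs)), float(np.std(accs))          # mean and standard deviation
```
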
Figure 5
MI classification accuracies from 20 subjects using (a) traditional machine learning approaches and (b) different neural classifiers. The polar bar plot shows the accuracy range (mean ± standard deviation) achieved by the 5 models for each of the 20 subjects. The lower panel summarizes, for each algorithm, the 20 mean accuracies achieved; black bars indicate the median result.
Figure 6
MI classification accuracies from nine subjects using five different classifiers. The polar bar plot shows the accuracy range (mean ± standard deviation) achieved by the five models for each of the nine subjects. The lower panel summarizes, for each algorithm, the nine mean accuracies achieved; black bars indicate the median result.
Figure 7
A frame of a live stream. Top: filtered signal during a trial. Blue and red traces show channel 1 and channel 2, respectively; vertical lines indicate the visual (orange) and acoustic (red) cues. Bottom: spectrograms generated from the data within the grey rectangle shown above.
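
Producing such a frame online implies some form of sliding-window buffering before decoding; the sketch below is one plausible arrangement, with the sampling rate, window length, and helper names (push_sample, current_window) being assumptions rather than the authors' implementation.

```python
# Ring buffers holding the most recent EEG samples for two channels.
from collections import deque
import numpy as np

FS = 250                     # assumed sampling rate (Hz)
WINDOW = 2 * FS              # assumed 2-second decoding window
buffers = [deque(maxlen=WINDOW) for _ in range(2)]   # channel 1 and channel 2

def push_sample(sample):
    """Append one new sample per channel to the ring buffers."""
    for buf, value in zip(buffers, sample):
        buf.append(value)

def current_window():
    """Return the latest window as (WINDOW, channels), or None until buffers are full."""
    if any(len(buf) < WINDOW for buf in buffers):
        return None
    return np.stack([np.asarray(buf) for buf in buffers], axis=1)
```
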
Figure 8
Live setup for real-time EEG signal decoding and Katana robot arm control. P(L) and P(R) represent the probabilities of left- and right-hand movements, respectively.
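
The mapping from P(L) and P(R) to a robot-arm command can be sketched as a simple thresholded decision; the threshold value and the send_command() helper are hypothetical, not the authors' actual control interface.

```python
# Hedged sketch of the online decision step: turn class probabilities into a command.
THRESHOLD = 0.7   # assumed confidence threshold

def decide(p_left: float, p_right: float) -> str:
    """Return a discrete command from the two class probabilities."""
    if p_left >= THRESHOLD and p_left > p_right:
        return "MOVE_LEFT"
    if p_right >= THRESHOLD and p_right > p_left:
        return "MOVE_RIGHT"
    return "HOLD"   # stay idle when the decoder is not confident

# Inside the live loop one might do:
# probs = model.predict(prepared_input)[0]   # prepared_input shaped for the chosen network
# send_command(decide(probs[0], probs[1]))   # send_command is hypothetical
```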

