Front Syst Neurosci. 2020 Sep 2;14:43. doi: 10.3389/fnsys.2020.00043. eCollection 2020.

EEG-Based Emotion Classification Using a Deep Neural Network and Sparse Autoencoder

Junxiu Liu et al. Front Syst Neurosci.

Abstract

Emotion classification based on brain-computer interface (BCI) systems is an appealing research topic. Recently, deep learning has been employed for emotion classification in BCI systems, and improved results have been obtained compared with traditional classification methods. In this paper, a novel deep neural network is proposed for emotion classification using EEG systems, which combines a Convolutional Neural Network (CNN), Sparse Autoencoder (SAE), and Deep Neural Network (DNN). In the proposed network, the features extracted by the CNN are first sent to the SAE for encoding and decoding. Then the data with reduced redundancy are used as the input features of a DNN for the classification task. The public DEAP and SEED datasets are used for testing. Experimental results show that the proposed network is more effective than conventional CNN methods for emotion recognition. For the DEAP dataset, the highest recognition accuracies of 89.49% and 92.86% are achieved for valence and arousal, respectively. For the SEED dataset, the best recognition accuracy reaches 96.77%. By combining the CNN, SAE, and DNN and training them separately, the proposed network is shown to be an efficient method with faster convergence than the conventional CNN.

Keywords: EEG; convolutional neural network; deep neural network; emotion recognition; sparse autoencoder.
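The SAE stage described in the abstract (encoding CNN features, decoding them, and keeping the redundancy-reduced representation) can be sketched as below. This is a minimal illustration, assuming a KL-divergence sparsity penalty, which is a common SAE formulation; the abstract does not specify the paper's exact loss, layer sizes, or training procedure, so all dimensions and coefficients here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def kl_sparsity(rho, rho_hat):
    """KL divergence between a target sparsity rho and the mean
    hidden activations rho_hat; penalizes units that fire too often."""
    return np.sum(rho * np.log(rho / rho_hat)
                  + (1 - rho) * np.log((1 - rho) / (1 - rho_hat)))

# Stand-in for CNN-extracted features: 16 samples, 64-dimensional
# (illustrative shapes, not taken from the paper).
x = rng.standard_normal((16, 64))

# Encoder/decoder weights with a 32-unit hidden (bottleneck) layer.
W_enc = rng.standard_normal((64, 32)) * 0.1
W_dec = rng.standard_normal((32, 64)) * 0.1

h = sigmoid(x @ W_enc)      # encode: compressed, sparse representation
x_hat = h @ W_dec           # decode: reconstruction of the input

recon_loss = np.mean((x - x_hat) ** 2)   # reconstruction error
rho_hat = h.mean(axis=0)                  # mean activation per hidden unit
penalty = kl_sparsity(0.05, rho_hat)      # sparsity target 0.05 (assumed)

loss = recon_loss + 0.1 * penalty         # weight 0.1 is illustrative
print(float(loss))
```

After training such an autoencoder to minimize this loss, the hidden activations `h` (rather than the raw CNN features) would be passed on as the DNN classifier's input, which is the redundancy-reduction role the abstract assigns to the SAE.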


Figures

Figure 1. Emotion classification procedure in this work.
Figure 2. The autoencoder includes one input, one hidden, and one output layer.
Figure 3. The proposed network includes the CNN, SAE, and DNN; the CNN and SAE are used for feature extraction, and the DNN is used for classification.
Figure 4. Configuration of the proposed network for the DEAP dataset.
Figure 5. Change of loss when data are reconstructed in the SAE on the DEAP dataset.
Figure 6. Accuracy comparison of two networks on valence using data with a length of 8 s on the DEAP dataset, in which (A) is the result of the CNN and (B) the result of the proposed network.
Figure 7. Accuracy comparison of two networks on arousal using data with a length of 8 s on the DEAP dataset, in which (A) is the result of the CNN and (B) the result of the proposed network.
Figure 8. Accuracy comparison of two networks on valence using data with a length of 12 s on the DEAP dataset, in which (A) is the result of the CNN and (B) the result of the proposed network.
Figure 9. Accuracy comparison of two networks on arousal using data with a length of 12 s on the DEAP dataset, in which (A) is the result of the CNN and (B) the result of the proposed network.
Figure 10. Configuration of the proposed network for the SEED dataset.
Figure 11. Change of loss when data are reconstructed by the SAE on the SEED dataset.
Figure 12. Accuracy comparison of two networks using data with a length of 8 s on the SEED dataset, in which (A) is the result of the CNN and (B) the result of the proposed network.
Figure 13. Accuracy comparison of two networks using data with a length of 12 s on the SEED dataset, in which (A) is the result of the CNN and (B) the result of the proposed network.
Figure 14. Accuracy comparison of two networks using features extracted from different lengths of overlap on the SEED dataset, in which (A) is the result of the CNN and (B) the result of the proposed network.
