Distinguishing Different Emotions Evoked by Music via Electroencephalographic Signals

Comput Intell Neurosci. 2019 Mar 6:2019:3191903. doi: 10.1155/2019/3191903. eCollection 2019.

Abstract

Music can evoke a variety of emotions, which may be manifested by distinct signals on the electroencephalogram (EEG). Many previous studies have examined the associations between specific aspects of music, including the subjective emotions aroused, and EEG signal features. However, no study has comprehensively examined music-related EEG features and selected those with the strongest potential for discriminating emotions. Therefore, this paper conducted a series of experiments to identify the most influential EEG features induced by music evoking different emotions (calmness, joy, sadness, and anger). We extracted 27-dimensional features from each of 12 electrode positions and then used a correlation-based feature selection method to identify the feature set most strongly related to the original features but with the lowest redundancy. Several classifiers, including Support Vector Machine (SVM), C4.5, LDA, and BPNN, were then used to test the recognition accuracy of the original and selected feature sets. Finally, the results are analyzed in detail, and the relationships between the selected feature set and human emotions are clearly shown. Based on the classification results of 10 random examinations, we conclude that the feature set selected at electrode Pz is more effective than the other feature sets when used as the key feature set for classifying human emotional states.
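As a rough illustration of the pipeline described in the abstract (a 27-dimensional feature vector per electrode, correlation-based feature selection, then classification with several standard classifiers), the following Python sketch uses scikit-learn. The data shapes, labels, greedy merit-based selection, and hyperparameters are assumptions for illustration only and do not reproduce the authors' exact implementation.

# Minimal sketch: correlation-based feature selection (greedy forward search
# on a CFS-style merit score) followed by SVM and LDA classification.
# Data shapes, labels, and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Assumed toy data: 120 trials x 27 features (one electrode, e.g. Pz),
# with 4 emotion classes (calmness, joy, sadness, anger).
X = rng.standard_normal((120, 27))
y = rng.integers(0, 4, size=120)

def cfs_merit(X, y, subset):
    """CFS-style merit: high feature-class correlation, low inter-feature correlation."""
    k = len(subset)
    # Mean absolute correlation between each selected feature and the class labels.
    r_cf = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in subset])
    if k == 1:
        r_ff = 0.0
    else:
        corrs = [abs(np.corrcoef(X[:, a], X[:, b])[0, 1])
                 for i, a in enumerate(subset) for b in subset[i + 1:]]
        r_ff = np.mean(corrs)
    return (k * r_cf) / np.sqrt(k + k * (k - 1) * r_ff)

# Greedy forward selection of a compact, low-redundancy feature subset.
selected, remaining = [], list(range(X.shape[1]))
best_merit = -np.inf
while remaining:
    merit, j = max((cfs_merit(X, y, selected + [j]), j) for j in remaining)
    if merit <= best_merit:
        break
    best_merit = merit
    selected.append(j)
    remaining.remove(j)

print("selected feature indices:", selected)

# Compare classifiers on the original and the selected feature sets
# (SVM and LDA shown here; the paper also evaluated C4.5 and BPNN).
for name, clf in [("SVM", SVC(kernel="rbf")), ("LDA", LinearDiscriminantAnalysis())]:
    acc_all = cross_val_score(clf, X, y, cv=5).mean()
    acc_sel = cross_val_score(clf, X[:, selected], y, cv=5).mean()
    print(f"{name}: all features {acc_all:.2f}, selected {acc_sel:.2f}")

On real EEG features (rather than the random toy data above), the comparison in the final loop is what allows the selected subset from a given electrode, such as Pz, to be judged against the full feature set.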

MeSH terms

  • Algorithms
  • Arousal / physiology
  • Auditory Perception / physiology*
  • Brain / physiology
  • Electroencephalography* / methods
  • Emotions / physiology*
  • Evoked Potentials
  • Humans
  • Music*
  • Support Vector Machine