Biosignal-Based Multimodal Emotion Recognition in a Valence-Arousal Affective Framework Applied to Immersive Video Visualization

Annu Int Conf IEEE Eng Med Biol Soc. 2019 Jul:2019:3577-3583. doi: 10.1109/EMBC.2019.8857852.

Abstract

Many emotion recognition schemes have been proposed in the literature, generally differing in the emotion elicitation method, the target emotional states to recognize, the data sources or modalities, and the classification technique. In this work, several biosignals are explored for emotion assessment during immersive video visualization, with multimodal data collected from Electrocardiography (ECG), Electrodermal Activity (EDA), Blood Volume Pulse (BVP), and Respiration sensors. Participants reported their emotional state of the day (baseline) and provided a self-assessment of the emotion experienced in each video through the Self-Assessment Manikin (SAM), in the valence-arousal space. Multiple physiological and statistical features extracted from the biosignals were used as inputs to an emotion recognition workflow, targeting user-independent classification with two classes per dimension. Support Vector Machines (SVM) were used, as they are considered among the most promising classifiers in the field. The proposed approach led to accuracies of 69.13% for arousal and 67.75% for valence, which are encouraging for further research with a larger training dataset and population.
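The workflow described above (biosignal features in, binary valence/arousal labels out, evaluated user-independently with an SVM) can be sketched as follows. This is an illustrative outline only, not the authors' implementation: the synthetic feature matrix, the RBF kernel, the regularization value, and the leave-one-participant-out evaluation are all assumptions standing in for the paper's unspecified details.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Synthetic stand-in for per-trial biosignal features (ECG/EDA/BVP/Respiration
# statistics); in the actual study these come from the recorded sensors.
rng = np.random.default_rng(0)
n_participants, trials_per_participant, n_features = 10, 8, 12
X = rng.normal(size=(n_participants * trials_per_participant, n_features))
groups = np.repeat(np.arange(n_participants), trials_per_participant)

# Binary label for one affective dimension (e.g. high vs. low arousal,
# derived from SAM self-assessments); a second model handles valence.
y = (X[:, 0] + 0.5 * rng.normal(size=len(X)) > 0).astype(int)

# Feature scaling plus an SVM; kernel and C are illustrative choices.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

# Leave-one-participant-out cross-validation approximates the
# user-independent evaluation targeted in the paper.
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"mean accuracy: {scores.mean():.2f}")
```

With real features, one such pipeline would be trained per dimension, and the grouped cross-validation ensures no participant contributes to both training and test folds.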

MeSH terms

  • Arousal*
  • Electrocardiography
  • Emotions*
  • Heart Rate
  • Humans
  • Respiration
  • Support Vector Machine*