Subjective Evaluation of Basic Emotions from Audio-Visual Data

Sensors (Basel). 2022 Jun 29;22(13):4931. doi: 10.3390/s22134931.

Abstract

Understanding how humans perceive emotions or affective states is important for developing emotion-aware systems that work in realistic scenarios. In this paper, the perception of emotions in naturalistic human interaction (audio-visual data) is studied using perceptual evaluation. For this purpose, a naturalistic audio-visual emotion database collected from TV broadcasts such as soap operas and movies, called the IIIT-H Audio-Visual Emotion (IIIT-H AVE) database, is used. The database consists of audio-alone, video-alone, and audio-visual data in English. Using data from all three modes, perceptual tests are conducted for four basic emotions (angry, happy, neutral, and sad) based on category labeling, and for two dimensions, namely arousal (active or passive) and valence (positive or negative), based on dimensional labeling. The results indicated that the participants' perception of emotions differed markedly across the audio-alone, video-alone, and audio-visual data. This finding emphasizes the importance of emotion-specific features over commonly used features in the development of emotion-aware systems.

Keywords: emotion analysis; emotion recognition; emotion synthesis; feature extraction; naturalistic audio–visual emotion database.

MeSH terms

  • Arousal*
  • Emotions*
  • Humans

Grants and funding

This research was partly funded by the Academy of Finland, grant number 313390.