Emotional speech processing: disentangling the effects of prosody and semantic cues

Cogn Emot. 2011 Aug;25(5):834-53. doi: 10.1080/02699931.2010.516915. Epub 2011 May 24.

Abstract

To clarify how emotions in speech are implicitly processed and registered in memory, we compared how emotional prosody, emotional semantics, and both cues in tandem prime decisions about conjoined emotional faces. Fifty-two participants rendered facial affect decisions (Pell, 2005a), indicating whether a target face represented an emotion (happiness or sadness) or not (a facial grimace), after passively listening to happy, sad, or neutral prime utterances. Emotional information in the primes was conveyed by: (1) prosody only; (2) semantic cues only; or (3) combined prosody and semantic cues. Results indicated that prosody, semantics, and combined prosody-semantic cues each facilitated emotional decisions about target faces in an emotion-congruent manner; however, the magnitude of priming did not vary across the three cue conditions. Our findings highlight that the emotional meanings of prosody and semantic cues are systematically registered during speech processing, but with similar effects on associative knowledge about emotions, which is presumably shared by prosody, semantics, and faces.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation
  • Adult
  • Cues
  • Decision Making
  • Emotions*
  • Facial Expression
  • Female
  • Humans
  • Linguistics*
  • Male
  • Photic Stimulation
  • Psychomotor Performance
  • Reaction Time
  • Semantics*
  • Speech Perception*