The Voice of Emotion Across Species: How Do Human Listeners Recognize Animals' Affective States?

PLoS One. 2014 Mar 12;9(3):e91192. doi: 10.1371/journal.pone.0091192. eCollection 2014.

Abstract

Voice-induced cross-taxa emotional recognition is the ability to understand the emotional state of another species based on its voice. In the past, induced affective states, experience-dependent higher cognitive processes, and cross-taxa universal acoustic coding and processing mechanisms have all been proposed to underlie this ability in humans. The present study sets out to distinguish the influence of familiarity and phylogeny on voice-induced cross-taxa emotional perception in humans. For the first time, two perspectives are taken into account: the self-perspective (i.e. the emotional valence induced in the listener) versus the others-perspective (i.e. correct recognition of the emotional valence of the recording context). Twenty-eight male participants listened to 192 vocalizations of four different species (human infant, dog, chimpanzee and tree shrew). Stimuli were recorded either in an agonistic (negative emotional valence) or affiliative (positive emotional valence) context. Participants rated the emotional valence of the stimuli, adopting both self- and others-perspective, using a 5-point version of the Self-Assessment Manikin (SAM). Familiarity was assessed based on subjective rating, objective labelling of the respective stimuli, and interaction time with the respective species. Participants reliably recognized the emotional valence of human voices, whereas the results for animal voices were mixed. The correct classification of animal voices depended on the listener's familiarity with the species and the call type/recording context, whereas induced emotional states and phylogeny had less influence. Our results provide the first evidence that explicit voice-induced cross-taxa emotional recognition in humans is shaped more by experience-dependent cognitive mechanisms than by induced affective states or cross-taxa universal acoustic coding and processing mechanisms.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Affect*
  • Animals
  • Dogs
  • Humans
  • Infant
  • Male
  • Pan troglodytes
  • Perception
  • Recognition, Psychology*
  • Species Specificity
  • Tupaiidae
  • Vocalization, Animal*
  • Young Adult

Grant support

The study was financially supported by the Deutsche Forschungsgemeinschaft (FOR 499; www.dfg.de). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.