Wait, are you sad or angry? Large exposure time differences required for the categorization of facial expressions of emotion

J Vis. 2013 Mar 18;13(4):13. doi: 10.1167/13.4.13.

Abstract

Facial expressions of emotion are essential components of human behavior, yet little is known about the hierarchical organization of their cognitive analysis. We study the minimum exposure time needed to successfully classify the six classical facial expressions of emotion (happiness, surprise, sadness, anger, disgust, fear) plus neutral, shown at different image resolutions (240 × 160 to 15 × 10 pixels). Our results suggest a consistent hierarchical analysis of these facial expressions regardless of the resolution of the stimuli. Happiness and surprise can be recognized after very short exposure times (10-20 ms), even at low resolutions. Fear and anger are recognized most slowly (100-250 ms), even in high-resolution images, suggesting a later computation. Sadness and disgust fall in between (70-200 ms). The minimum exposure time required for successful classification of each facial expression correlates with the ability of human subjects to identify it correctly at low resolutions. These results suggest a fast, early computation of expressions represented mostly by low spatial frequencies or global configural cues, and a later, slower process for those categories requiring a more fine-grained analysis of the image. We also show that the expressions visible mainly in higher-resolution images are recognized less accurately. We summarize the implications for current computational models.
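
The analysis described above combines stimuli rendered at several resolutions with a per-expression correlation between minimum exposure time and low-resolution recognition accuracy. The sketch below (Python) illustrates those two ingredients only; it is not the authors' code, and the stand-in image, the expression ordering, and every value in min_exposure_ms and low_res_accuracy are hypothetical placeholders chosen to show the shape of the computation, not the paper's data.

```python
import numpy as np
from PIL import Image
from scipy.stats import pearsonr

# (1) Render a stimulus across the resolution range reported in the abstract
# (240 x 160 down to 15 x 10 pixels). A random grayscale array stands in for
# an actual face photograph.
rng = np.random.default_rng(0)
face = Image.fromarray(rng.integers(0, 256, size=(240, 160), dtype=np.uint8))
resolutions = [(160, 240), (80, 120), (40, 60), (20, 30), (10, 15)]  # (width, height)
stimuli = {wh: face.resize(wh, Image.LANCZOS) for wh in resolutions}

# (2) Correlate each expression's minimum exposure time with its recognition
# accuracy at low resolution. All numbers are hypothetical placeholders; the
# abstract only reports that such a correlation exists.
expressions = ["happiness", "surprise", "sadness", "disgust", "anger", "fear"]
min_exposure_ms = np.array([15, 20, 90, 110, 180, 220])            # hypothetical
low_res_accuracy = np.array([0.90, 0.85, 0.60, 0.55, 0.40, 0.35])  # hypothetical
r, p = pearsonr(min_exposure_ms, low_res_accuracy)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```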

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Adult
  • Analysis of Variance
  • Emotions*
  • Facial Expression*
  • Female
  • Humans
  • Male
  • Photic Stimulation / methods
  • Recognition, Psychology*
  • Sensory Thresholds / physiology
  • Time Factors
  • Young Adult