Event-related brain potentials (ERPs) were recorded to assess the time course of processing ambiguous facial expressions combining a smiling mouth with neutral, fearful, or angry eyes, in comparison with genuinely happy faces (a smile and happy eyes) and non-happy faces (neutral, fearful, or angry mouth and eyes). Participants judged whether each face looked truly happy or not. Electroencephalographic activity was recorded from 64 scalp electrodes to generate ERPs. The neural activation patterns showed early P200 sensitivity (differences between negative and positive or neutral expressions) and EPN sensitivity (differences between positive and neutral expressions) to emotional valence. In contrast, sensitivity to ambiguity (differences between genuine and ambiguous expressions) emerged only in later LPP components. Discrimination of emotional vs. neutral affect occurred between 180 and 430 ms from stimulus onset, whereas the detection and resolution of ambiguity took place between 470 and 720 ms. In addition, while blended expressions involving a smile with angry eyes could be identified as not happy in the P200 component (175-240 ms), smiles with fearful or neutral eyes produced the same ERP pattern as genuinely happy faces, thus revealing poor discrimination.
Copyright © 2012 Elsevier Inc. All rights reserved.