Facial cues, such as a person's age, provide important information for social interactions. Processing of such facial cues can be affected by observer bias. However, there is currently no consensus regarding how the brain processes facial cues related to age, or whether facial age processing changes as a function of the age of the observer (i.e., own-age bias). The primary study aim was to investigate functional networks involved in processing own-age vs. other-age faces in younger and older adults, and to determine how the emotional expression of a face modulates own-age vs. other-age face processing. The secondary study aim was to examine the relation between higher social cognitive processes (i.e., empathy) and the modulation of brain activity by facial age and emotional expression. During functional magnetic resonance imaging (fMRI), younger and older participants were asked to recognize happy, angry, and neutral expressions in own-age and other-age faces. Functional connectivity analyses with the amygdala as seed showed that, for own-age faces, both age groups recruited a network of regions, including the anterior cingulate and anterior insula, implicated in empathy and the detection of salient information. Brain-behavior analyses furthermore showed that empathic responses in younger, but not in older, participants were positively correlated with engagement of the medial prefrontal cortex during processing of angry own-age faces. These findings identify the neurobehavioral correlates of facial age processing and its modulation by emotional expression, and directly link facial cue processing to higher-order social cognitive functioning.
Keywords: Amygdala; Default mode network; Emotional expression; Empathy; Functional connectivity; Own-age bias; Salience network.
Copyright © 2019 Elsevier Ltd. All rights reserved.