Action observation triggers imitation, a powerful mechanism enabling interpersonal coordination. Coordination, however, also occurs when the partners' actions are nonimitative and physically incongruent. One influential theory postulates that this is achieved via top-down modulation of imitation exerted by prefrontal regions. Here, we argue instead that coordination depends on sharing a goal with the interacting partner: this shapes action observation, overriding involuntary imitation, through the predictive activity of the left ventral premotor cortex (lvPMc). During functional magnetic resonance imaging (fMRI), participants played music in turn with a virtual partner in interactive and noninteractive conditions, each requiring an equal proportion (50%) of imitative and nonimitative responses. In a full-factorial design, both perceptual features and low-level motor requirements were kept constant throughout the experiment. Behaviorally, the interactive context minimized the visuomotor interference caused by involuntary imitation of physically incongruent movements. This was paralleled by modulation of neural activity in the lvPMc, which was specifically recruited during the interactive task regardless of the imitative or nonimitative nature of the social exchange. This lvPMc activity reflected predictive decoding of the partner's actions, as revealed by multivariate pattern analysis (MVPA). These results demonstrate that, during interactions, we process our partners' behavior to prospectively infer their contribution to the achievement of the shared goal, generating motor predictions for cooperation beyond low-level imitation.
Keywords: Joint Action; MVPA; fMRI; motor prediction; ventral premotor cortex.
© The Author(s) 2019. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: email@example.com.