The ability to predict upcoming structured events based on long-term knowledge and contextual priors is a fundamental principle of human cognition. Tonal music triggers predictive processes based on structural properties of harmony, i.e., regularities defining the arrangement of chords into well-formed musical sequences. While the neural architecture of structure-based predictions during music perception is well described, little is known about the neural networks for analogous predictions in musical actions and how they relate to auditory perception. To fill this gap, expert pianists were presented with harmonically congruent or incongruent chord progressions, either as musical actions (photos of a hand playing chords) that they were required to watch and imitate without sound, or in an auditory format that they listened to without playing. By combining task-based functional magnetic resonance imaging (fMRI) with functional connectivity at rest, we identified distinct sub-regions in the right inferior frontal gyrus (rIFG) interconnected with parietal and temporal areas for processing action and audio sequences, respectively. We argue that the differential contribution of parietal and temporal areas is tied to motoric and auditory long-term representations of harmonic regularities that dynamically interact with computations in rIFG. Parsing of the structural dependencies in rIFG is co-determined by both stimulus and task demands. In line with contemporary models of prefrontal cortex organization and dual-stream models of visual-spatial and auditory processing, we show that the processing of musical harmony is a network capacity with dissociated dorsal and ventral motor and auditory circuits, which both provide the infrastructure for predictive mechanisms optimising action and perception performance.
Keywords: Functional connectivity; Harmony; IFG; Music; Prediction; Syntax.
Copyright © 2016 Elsevier Inc. All rights reserved.