Common brain representations of action and perception investigated with cross-modal classification of newly learned melodies

Sci Rep. 2025 May 12;15(1):16492. doi: 10.1038/s41598-025-00208-x.

Abstract

An important feature of human cognition is the ability to predict the sensory outcomes of motor actions and to infer actions from sensory information, a process enabled by action-perception coupling. Through repeated and consistent sensory feedback, bidirectional sensorimotor associations can become highly automatic with experience. In musicians, for instance, auditory cortex activity can increase spontaneously during mere observation of piano playing without auditory feedback. A key question is whether such associations rely on shared neural representations, or a "common code", between actions and their sensory outcomes. To test this, we trained non-musicians to play two melodies with different pitch sequences on the piano. The following day, they underwent an fMRI experiment with an MR-compatible piano while (a) playing the trained melodies without auditory feedback but imagining the sound, and (b) listening to the same melodies without playing but imagining the finger movements. Within-condition multivariate pattern analyses revealed that patterns of activity in auditory-motor regions represent pitch sequences. Importantly, cross-modal classification showed that these patterns generalized across conditions in the right premotor cortex, indicating the emergence of a common code across perception and action.
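The cross-modal classification logic can be illustrated with a minimal sketch: a classifier is trained to distinguish the two melodies from region-of-interest voxel patterns in one condition (playing) and tested on patterns from the other condition (listening), and vice versa; above-chance accuracy indicates melody-specific patterns shared across action and perception. This is not the authors' pipeline; it assumes scikit-learn and uses synthetic arrays in place of real, preprocessed fMRI trial patterns.

```python
# Hedged sketch of cross-modal (cross-condition) MVPA classification.
# Assumptions: trial-wise voxel patterns have already been extracted from an
# ROI (e.g., right premotor cortex); the arrays below are synthetic stand-ins.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_trials, n_voxels = 40, 200            # hypothetical trial and voxel counts
melody = rng.integers(0, 2, n_trials)   # labels for the two trained melodies

# Synthetic ROI patterns for the two conditions:
#   "play":   playing without auditory feedback while imagining the sound
#   "listen": listening without playing while imagining the finger movements
X_play = rng.normal(size=(n_trials, n_voxels)) + melody[:, None] * 0.3
X_listen = rng.normal(size=(n_trials, n_voxels)) + melody[:, None] * 0.3

clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))

# Train on one modality, test on the other, then swap directions and average.
acc_play_to_listen = clf.fit(X_play, melody).score(X_listen, melody)
acc_listen_to_play = clf.fit(X_listen, melody).score(X_play, melody)
print("cross-modal accuracy:", (acc_play_to_listen + acc_listen_to_play) / 2)
```

In practice, accuracy would be estimated with cross-validation and compared against chance (50% for two melodies) using permutation testing; the averaging over both train/test directions mirrors the symmetric treatment of action and perception implied by a common code.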

Keywords: Action-perception coupling; MVPA; Music perception; Sequence learning; fMRI.

MeSH terms

  • Acoustic Stimulation
  • Adult
  • Auditory Cortex* / physiology
  • Auditory Perception* / physiology
  • Brain Mapping
  • Brain* / physiology
  • Feedback, Sensory / physiology
  • Female
  • Humans
  • Learning* / physiology
  • Magnetic Resonance Imaging
  • Male
  • Motor Cortex / physiology
  • Music*
  • Pitch Perception / physiology
  • Young Adult