Brain activity during audiovisual speech perception: an fMRI study of the McGurk effect

Neuroreport. 2003 Jun 11;14(8):1129-33. doi: 10.1097/00001756-200306110-00006.

Abstract

fMRI was used to assess the relationship between brain activation and the degree of audiovisual integration of speech information during a phoneme categorization task. Twelve subjects heard a speaker say the syllable /aba/ paired with video of the speaker articulating either the same consonant or a different one (/ava/). To manipulate the degree of audiovisual integration, the audio track was either synchronous with the visual stimulus or offset by ±400 ms. Subjects reported whether they heard the consonant /b/ or another consonant; fewer /b/ responses when the audio and visual stimuli were mismatched indicated greater visual influence on speech perception (the McGurk effect). Brain regions active during presentation of the incongruent stimuli included the superior temporal and inferior frontal gyri, as well as extrastriate, premotor, and posterior parietal cortex. A regression analysis related the strength of the McGurk effect to levels of brain activation. Paradoxically, higher numbers of /b/ responses (i.e., weaker visual influence) were positively correlated with activation in the left occipito-temporal junction, an area often associated with processing visual motion. This activation suggests that auditory information modulates visual processing to affect perception.
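The following is a minimal sketch, not the authors' analysis code, of how the behavioral measure and the across-subject brain-behavior regression described in the abstract could be set up. The subject count matches the reported sample of twelve; the voxel count, variable names, and synthetic data are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects = 12          # matches the reported sample size
n_voxels = 1000          # arbitrary illustrative value (assumption)

# Proportion of /b/ responses on incongruent (audio /aba/ + visual /ava/) trials.
# Fewer /b/ responses -> stronger visual influence (stronger McGurk effect).
prop_b_responses = rng.uniform(0.2, 0.9, size=n_subjects)
mcgurk_strength = 1.0 - prop_b_responses

# Per-subject activation estimates (e.g., contrast values) for each voxel
# (synthetic placeholder data).
activation = rng.normal(size=(n_subjects, n_voxels))

# Simple across-subject regression: for each voxel, regress activation on the
# number of /b/ responses. The abstract reports a positive correlation with
# /b/ responses in the left occipito-temporal junction, i.e., a negative
# correlation with McGurk-effect strength.
design = np.column_stack([np.ones(n_subjects), prop_b_responses])
betas, *_ = np.linalg.lstsq(design, activation, rcond=None)
slope_per_voxel = betas[1]       # regression slope for each voxel

print("voxels with positive slope:", int((slope_per_voxel > 0).sum()))
```

In practice such a regression would be run within a standard fMRI group-level model with appropriate thresholding; the sketch only illustrates the direction of the brain-behavior relationship the abstract describes.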

Publication types

  • Comparative Study
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation
  • Adult
  • Audiovisual Aids
  • Brain Mapping*
  • Cerebral Cortex / anatomy & histology
  • Cerebral Cortex / physiology*
  • Female
  • Humans
  • Lipreading*
  • Magnetic Resonance Imaging / instrumentation
  • Magnetic Resonance Imaging / methods*
  • Male
  • Middle Aged
  • Paired-Associate Learning
  • Phonetics
  • Photic Stimulation
  • Regression Analysis
  • Speech Perception / physiology*
  • Visual Perception / physiology*