An intentional stance modulates the integration of gesture and speech during comprehension

Brain Lang. 2007 Jun;101(3):222-33. doi: 10.1016/j.bandl.2006.07.008. Epub 2006 Sep 25.

Abstract

The present study investigates whether knowledge about the intentional relationship between gesture and speech influences controlled processes when the two modalities are integrated during comprehension. Thirty-five adults watched short videos of gesture and speech that conveyed semantically congruous and incongruous information. In half of the videos, participants were told that the two modalities were intentionally coupled (i.e., produced by the same communicator); in the other half, they were told that the two modalities were not intentionally coupled (i.e., produced by different communicators). When participants knew that the same communicator produced the speech and gesture, there was a larger bilateral frontal and central N400 effect for words that were semantically incongruous versus congruous with gesture. However, when participants knew that different communicators produced the speech and gesture (that is, when gesture and speech were not intentionally meant to go together), the N400 effect was present only in right-hemisphere frontal regions. The results demonstrate that pragmatic knowledge about the intentional relationship between gesture and speech modulates controlled neural processes during the integration of the two modalities.

Publication types

  • Controlled Clinical Trial
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Analysis of Variance
  • Brain Mapping
  • Cerebral Cortex / physiology*
  • Comprehension / physiology
  • Dominance, Cerebral
  • Evoked Potentials
  • Female
  • Gestures*
  • Humans
  • Intention*
  • Male
  • Psycholinguistics
  • Reaction Time
  • Speech Perception / physiology*