Real-time lexical comprehension in young children learning American Sign Language

Dev Sci. 2018 Nov;21(6):e12672. doi: 10.1111/desc.12672. Epub 2018 Apr 16.

Abstract

When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 months; 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. Deaf and hearing ASL learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time, not by differential access to auditory information in day-to-day life. Finally, variation in children's ASL processing was positively correlated with age and vocabulary size. Thus, despite competition for attention within a single modality, the timing and accuracy of visual fixations during ASL comprehension reflect information processing skills that are important for language acquisition regardless of language modality.

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Attention
  • Child, Preschool
  • Comprehension*
  • Deafness
  • Eye Movements / physiology
  • Humans
  • Infant
  • Language Development
  • Learning*
  • Linguistics / methods
  • Sign Language*
  • United States
  • Vision, Ocular / physiology