Continuous perception and graded categorization: electrophysiological evidence for a linear relationship between the acoustic signal and perceptual encoding of speech

Psychol Sci. 2010 Oct;21(10):1532-40. doi: 10.1177/0956797610384142. Epub 2010 Oct 8.

Abstract

Speech sounds are highly variable, yet listeners readily extract information from them and transform continuous acoustic signals into meaningful categories during language comprehension. A central question is whether perceptual encoding captures acoustic detail in a one-to-one fashion or whether it is affected by phonological categories. We addressed this question in an event-related potential (ERP) experiment in which listeners categorized spoken words that varied along a continuous acoustic dimension (voice-onset time, or VOT) in an auditory oddball task. We found that VOT effects were present through a late stage of perceptual processing (N1 component, ~100 ms poststimulus) and were independent of categorization. In addition, effects of within-category differences in VOT were present at a postperceptual categorization stage (P3 component, ~450 ms poststimulus). Thus, at perceptual levels, acoustic information is encoded continuously, independently of phonological information. Further, at phonological levels, fine-grained acoustic differences are preserved along with category information.
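To make the abstract's analytic logic concrete, the sketch below simulates the kind of model comparison the study implies: does ERP component amplitude (e.g., N1) track VOT linearly, or does it merely reflect category membership? This is a minimal illustration with simulated data and assumed values (a 20 ms category boundary, a 0–40 ms VOT continuum, the helper `fit_rss`), not the authors' actual analysis pipeline.

```python
# Hypothetical sketch of the abstract's analytic contrast: continuous (linear)
# vs. categorical (step) dependence of ERP amplitude on VOT.
# All data are simulated for illustration only.

import numpy as np

rng = np.random.default_rng(0)

# VOT continuum in ms (e.g., a /b/-/p/ continuum), many trials per step.
vot_steps = np.arange(0, 45, 5)          # 0, 5, ..., 40 ms
n_trials = 40
vot = np.repeat(vot_steps, n_trials).astype(float)

# Simulated single-trial N1 amplitudes (microvolts): a linear dependence on VOT
# plus noise, mimicking the "continuous encoding" pattern reported for N1.
n1 = -6.0 + 0.05 * vot + rng.normal(0.0, 1.0, size=vot.size)

def fit_rss(X, y):
    """Ordinary least squares; return residual sum of squares and parameter count."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid), X.shape[1]

# Model 1: amplitude is a linear function of VOT (continuous encoding).
X_linear = np.column_stack([np.ones_like(vot), vot])
rss_lin, k_lin = fit_rss(X_linear, n1)

# Model 2: amplitude depends only on category (step at an assumed 20 ms boundary).
category = (vot >= 20).astype(float)
X_cat = np.column_stack([np.ones_like(vot), category])
rss_cat, k_cat = fit_rss(X_cat, n1)

# Compare models with AIC (lower is better); for data encoded continuously,
# the linear model should fit better than the categorical one.
n = n1.size
aic = lambda rss, k: n * np.log(rss / n) + 2 * k
print(f"AIC linear: {aic(rss_lin, k_lin):.1f}   AIC categorical: {aic(rss_cat, k_cat):.1f}")
```

Under this simulation the linear model wins for N1-like data; by the abstract's account, a later component such as P3 would show category effects while still preserving within-category (graded) differences in VOT.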

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Adolescent
  • Adult
  • Attention / physiology*
  • Cerebral Cortex / physiology
  • Dominance, Cerebral / physiology
  • Electroencephalography
  • Event-Related Potentials, P300 / physiology*
  • Evoked Potentials, Auditory / physiology*
  • Female
  • Humans
  • Linear Models*
  • Male
  • Phonetics*
  • Semantics
  • Speech Acoustics*
  • Speech Perception / physiology*
  • Young Adult