On the linear relation between the mean and the standard deviation of a response time distribution

Psychol Rev. 2007 Jul;114(3):830-41. doi: 10.1037/0033-295X.114.3.830.

Abstract

Although it is generally accepted that the spread of a response time (RT) distribution increases with the mean, the precise nature of this relation remains relatively unexplored. The authors show that in several descriptive RT distributions, the standard deviation increases linearly with the mean. Results from a wide range of tasks from different experimental paradigms support a linear relation between RT mean and RT standard deviation. Both R. Ratcliff's (1978) diffusion model and G. D. Logan's (1988) instance theory of automatization provide explanations for this linear relation. The authors identify and discuss 3 specific boundary conditions for the linear law to hold. The law constrains RT models and supports the use of the coefficient of variation to (a) compare variability while controlling for differences in baseline speed of processing and (b) assess whether changes in performance with practice are due to quantitative speedup or qualitative reorganization.
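As a rough illustration of the linear law and the coefficient-of-variation argument summarized above, the following is a minimal sketch using synthetic data. The intercept, slope, condition means, and sample sizes are illustrative assumptions, not values taken from the paper, and the normal RT samples are a stand-in for real (typically right-skewed) RT distributions.

    # Minimal sketch of the linear mean-SD law: SD(RT) = a + b * M(RT).
    # Synthetic data only -- the intercept/slope and condition means below
    # are illustrative assumptions, not values from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical condition means (ms) and an assumed linear mean-SD law.
    a, b = 20.0, 0.25                      # assumed intercept and slope
    cond_means = np.array([400., 500., 600., 700., 800.])
    cond_sds = a + b * cond_means          # SD grows linearly with the mean

    # Simulate RTs per condition, then recover each condition's mean and SD.
    obs_means, obs_sds = [], []
    for m, s in zip(cond_means, cond_sds):
        rts = rng.normal(m, s, size=2000)  # stand-in for an empirical RT sample
        obs_means.append(rts.mean())
        obs_sds.append(rts.std(ddof=1))

    # Fit SD as a linear function of the mean (ordinary least squares).
    slope, intercept = np.polyfit(obs_means, obs_sds, deg=1)
    print(f"fitted SD ~ {intercept:.1f} + {slope:.3f} * mean")

    # Coefficient of variation: SD / mean. Under a strictly proportional
    # relation (intercept near zero), CV is constant across conditions,
    # which is what licenses CV as a speed-controlled measure of variability.
    cv = np.array(obs_sds) / np.array(obs_means)
    print("CV per condition:", np.round(cv, 3))

Note that with a nonzero intercept, as in this sketch, the CV decreases slightly as the mean grows; it is strictly constant only when the fitted line passes through the origin, which is one way of reading the boundary conditions the authors discuss.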

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Automatism / psychology
  • Decision Making
  • Humans
  • Linear Models*
  • Memory
  • Mental Processes
  • Pattern Recognition, Visual
  • Practice, Psychological
  • Problem Solving
  • Reaction Time*
  • Statistics as Topic