Circular analysis in systems neuroscience: the dangers of double dipping

Nat Neurosci. 2009 May;12(5):535-40. doi: 10.1038/nn.2303.


A neuroscientific experiment typically generates a large amount of data, of which only a small fraction is analyzed in detail and presented in a publication. However, selection among noisy measurements can render an otherwise appropriate analysis circular and invalidate its results. Here we argue that systems neuroscience needs to adjust some widespread practices to avoid the circularity that can arise from selection. In particular, 'double dipping', the use of the same dataset for selection and selective analysis, will give distorted descriptive statistics and invalid statistical inference whenever the results statistics are not inherently independent of the selection criteria under the null hypothesis. To demonstrate the problem, we apply widely used analyses to noise data known not to contain the experimental effects in question. Spurious effects can appear in the context of both univariate activation analysis and multivariate pattern-information analysis. We suggest a policy for avoiding circularity.
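The selection bias the abstract describes is easy to reproduce in simulation. The sketch below (an illustrative assumption, not code from the paper) applies the same logic as the authors' noise-data demonstration: it generates pure noise with no true effect, selects the "voxels" with the strongest apparent response, and then estimates the effect in those same voxels. The circular estimate is inflated, while an estimate from an independent split of the data stays near zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pure noise: 1000 hypothetical "voxels" x 20 "measurements".
# By construction, the true effect in every voxel is zero.
n_voxels, n_measurements = 1000, 20
data = rng.standard_normal((n_voxels, n_measurements))

# Circular ("double dipping") analysis: select the 10 voxels with the
# highest mean response, then estimate the effect size in those same
# voxels using the same data. Selection among noisy measurements
# inflates the estimate well above the true value of zero.
selected = np.argsort(data.mean(axis=1))[-10:]
circular_effect = data[selected].mean()

# Non-circular alternative: select voxels on one half of the
# measurements, estimate the effect on the independent other half.
selection_half, test_half = data[:, :10], data[:, 10:]
selected_indep = np.argsort(selection_half.mean(axis=1))[-10:]
independent_effect = test_half[selected_indep].mean()

print(f"circular estimate:    {circular_effect:.3f}")    # biased upward
print(f"independent estimate: {independent_effect:.3f}")  # near zero
```

The independent split illustrates the remedy the paper advocates: results statistics are only trustworthy when they are independent of the selection criteria, which a held-out dataset guarantees by construction.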

Publication types

  • Research Support, N.I.H., Intramural
  • Review

MeSH terms

  • Animals
  • Artifacts
  • Data Collection / methods
  • Data Collection / standards
  • Data Interpretation, Statistical
  • Humans
  • Image Processing, Computer-Assisted / methods
  • Image Processing, Computer-Assisted / standards
  • Magnetic Resonance Imaging / methods
  • Magnetic Resonance Imaging / standards
  • Neurosciences / methods*
  • Reproducibility of Results
  • Selection Bias
  • Systems Biology / methods