Inference in the age of big data: Future perspectives on neuroscience

Neuroimage. 2017 Jul 15;155:549-564. doi: 10.1016/j.neuroimage.2017.04.061. Epub 2017 Apr 27.

Abstract

Neuroscience is undergoing faster changes than ever before. For over 100 years, our field qualitatively described and invasively manipulated single organisms, or small numbers of them, to gain anatomical, physiological, and pharmacological insights. In the last 10 years, neuroscience has spawned quantitative datasets of unprecedented breadth (e.g., microanatomy, synaptic connections, and optogenetic brain-behavior assays) and size (e.g., cognition, brain imaging, and genetics). While growing data availability and information granularity have been amply discussed, we direct attention to a less explored question: How will this unprecedented data richness shape data analysis practices? Statistical reasoning is becoming ever more important for distilling neurobiological knowledge from healthy and pathological brain measurements. We argue that large-scale data analysis will increasingly rely on statistical models that are non-parametric and generative, that mix frequentist and Bayesian aspects, and that supplement classical hypothesis testing with out-of-sample predictions.
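
The abstract's closing claim, that classical hypothesis testing will be supplemented with out-of-sample prediction, can be made concrete with a small sketch. The code below is purely illustrative and not drawn from the article: it contrasts a two-sample t-test (in-sample group inference) with cross-validated classification accuracy (out-of-sample prediction) on synthetic data, assuming NumPy, SciPy, and scikit-learn are available; the variable names, sample sizes, and effect sizes are hypothetical.

```python
# Illustrative contrast: in-sample hypothesis testing vs. out-of-sample
# prediction on synthetic "brain measurement" data (all numbers made up).
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic data: 200 subjects, 50 imaging-derived features, two groups
# (e.g., patients vs. controls) with a weak signal in the first 5 features.
n_subjects, n_features = 200, 50
y = rng.integers(0, 2, size=n_subjects)        # group labels
X = rng.normal(size=(n_subjects, n_features))
X[:, :5] += 0.3 * y[:, None]                   # small group effect

# Classical hypothesis testing: two-sample t-test on one feature,
# asking whether the group means differ within this sample.
t_stat, p_value = stats.ttest_ind(X[y == 0, 0], X[y == 1, 0])
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# Out-of-sample prediction: cross-validated accuracy of a classifier,
# asking how well group membership generalizes to unseen subjects.
clf = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(clf, X, y, cv=5).mean()
print(f"5-fold cross-validated accuracy = {accuracy:.2f}")
```

The two questions are complementary rather than interchangeable: the t-test quantifies evidence against a null hypothesis in the observed sample, whereas the cross-validated score estimates how well the fitted model would perform on new subjects.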

Keywords: Epistemology; High-dimensional statistics; Hypothesis testing; Machine learning; Sample complexity; Systems biology.

Publication types

  • Review
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Data Interpretation, Statistical*
  • Datasets as Topic / trends*
  • Humans
  • Models, Theoretical*
  • Neurosciences / trends*