
Rigor and Reproducibility for Data Analysis and Design in the Behavioral Sciences

Tom Hildebrandt et al. Behav Res Ther. 126, 103552.

Abstract

The rigor and reproducibility of scientific methods depend heavily on the appropriate use of statistical methods to answer research questions and to make meaningful, accurate inferences from data. The increasing analytic complexity of research and the growing value placed on novel statistical and methodological approaches put greater emphasis on statistical review. We outline the controversies within the statistical sciences that threaten the rigor and reproducibility of research published in the behavioral sciences and discuss ongoing approaches to generating reliable and valid inferences from data. We describe nine major areas to consider when evaluating the rigor and reproducibility of published articles and apply this framework to the 116 Behaviour Research and Therapy (BRAT) articles published in 2018. The results of our analysis highlight a pattern of missing rigor and reproducibility elements, especially pre-registration of study hypotheses, links to statistical code and output, and explicit archiving or sharing of the data used in analyses. We recommend that reviewers consider these elements in peer review and that journals publish the results of these rigor and reproducibility ratings alongside manuscripts to incentivize authors to include these elements with their submissions.

Keywords: Big data; P-hacking; Reliability; Reproducibility; Statistics.

Conflict of interest statement

The authors declare no conflict of interest.
