PLoS Biol. 13(10): e1002264

Meta-research: Evaluation and Improvement of Research Methods and Practices



John P A Ioannidis et al. PLoS Biol.

Abstract

As the scientific enterprise has grown in size and diversity, we need empirical evidence on the research process to test and apply interventions that make it more efficient and its results more reliable. Meta-research is an evolving scientific discipline that aims to evaluate and improve research practices. It includes the thematic areas of methods, reporting, reproducibility, evaluation, and incentives (how to do, report, verify, correct, and reward science). Much work has already been done in this growing field, but efforts to date are fragmented. We provide a map of ongoing efforts and discuss plans for connecting the multiple meta-research efforts across science worldwide.

Conflict of interest statement

The authors have declared that no competing interests exist.

Figures

Fig 1
Fig 1. Number of meta-research–related publications registered by the Scopus database between January 1 and May 16, 2015, by country of corresponding author and by affiliation of any coauthor.
Countries are attributed based on the corresponding or first author's address (color scale, from light yellow to red: 1–5, 5–10, 10–20, 20–50, and 50–230 publications). Blue dots indicate the 100 institutions most frequently listed among coauthors' addresses; dot size is proportional to the number of papers (range: 2–37). Papers were selected for inclusion from an initial list of 1,422 papers retrieved from the Scopus database using a combination of search terms aimed at capturing the core areas described in Table 1. Of the 851 records selected for inclusion, country or affiliation data could not be retrieved for 102 Scopus records, which are therefore not included in the map. Search terms, literature lists, and further details are available at metrics.stanford.edu. The map and plots therein were generated anew using the packages ggmap and ggplot2, implemented in the open-source statistical software R. Image Credit: Daniele Fanelli



Grant support

The authors received no specific funding for this work. METRICS is funded by a grant from the Laura and John Arnold Foundation.