Living systematic reviews: 2. Combining human and machine effort

J Clin Epidemiol. 2017 Nov;91:31-37. doi: 10.1016/j.jclinepi.2017.08.011. Epub 2017 Sep 11.

Abstract

New approaches to evidence synthesis, which use human effort and machine automation in mutually reinforcing ways, can enhance the feasibility and sustainability of living systematic reviews. Human effort is a scarce and valuable resource, required when automation is impossible or undesirable, and includes contributions from online communities ("crowds") as well as more conventional contributions from review authors and information specialists. Automation can assist with some systematic review tasks, including searching, eligibility assessment, identification and retrieval of full-text reports, extraction of data, and risk of bias assessment. Workflows can be developed in which human effort and machine automation can each enable the other to operate in more effective and efficient ways, offering substantial enhancement to the productivity of systematic reviews. This paper describes and discusses the potential, and the limitations, of new ways of undertaking specific tasks in living systematic reviews, identifying areas where these human/machine "technologies" are already in use, and where further research and development is needed. While the context is living systematic reviews, many of these enabling technologies apply equally to standard approaches to systematic reviewing.
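
To illustrate the kind of machine assistance the abstract mentions for eligibility assessment, the sketch below (not taken from the paper) shows one common approach: a classifier trained on records already screened by human reviewers is used to rank newly retrieved records, so human effort in a living review update is concentrated on the records most likely to be eligible. The records, labels, and pipeline choices (TF-IDF features with logistic regression via scikit-learn) are illustrative assumptions, not the authors' method.

    # Minimal sketch of machine-assisted title/abstract screening for a
    # living systematic review update. All records below are hypothetical.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression

    # Records the review team has already screened (1 = include, 0 = exclude).
    labelled_texts = [
        "Randomised controlled trial of drug A versus placebo in adults",
        "Cohort study of dietary exposure and cardiovascular outcomes",
        "Randomised trial comparing drug A with standard care in children",
        "Narrative review of treatment options for condition X",
    ]
    labels = [1, 0, 1, 0]

    # New records retrieved by the latest search update.
    new_texts = [
        "Double-blind randomised trial of drug A in elderly patients",
        "Editorial on evidence synthesis methods",
    ]

    # Simple bag-of-words pipeline: TF-IDF features + logistic regression.
    vectoriser = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
    model = LogisticRegression(max_iter=1000)
    model.fit(vectoriser.fit_transform(labelled_texts), labels)

    # Rank new records by predicted probability of inclusion; reviewers
    # screen from the top of the list rather than in arbitrary order.
    scores = model.predict_proba(vectoriser.transform(new_texts))[:, 1]
    for text, score in sorted(zip(new_texts, scores), key=lambda p: -p[1]):
        print(f"{score:.2f}  {text}")

In practice such a ranking is combined with human review of every record above (and often a sample below) a chosen threshold, reflecting the paper's theme that automation and human effort are mutually reinforcing rather than substitutes.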

Keywords: Automation; Citizen science; Crowdsourcing; Machine learning; Systematic review; Text mining.

MeSH terms

  • Data Mining / methods*
  • Evidence-Based Medicine
  • Humans
  • Machine Learning*
  • Review Literature as Topic*