Mapping of Crowdsourcing in Health: Systematic Review

Perrine Créquit et al. J Med Internet Res. 2018;20(5):e187.

Abstract

Background: Crowdsourcing involves obtaining ideas, needed services, or content by soliciting Web-based contributions from a crowd. The 4 types of crowdsourced tasks (problem solving, data processing, surveillance or monitoring, and surveying) can be applied in the 3 categories of health (promotion, research, and care).

Objective: This study aimed to map the different applications of crowdsourcing in health to assess the fields of health that are using crowdsourcing and the crowdsourced tasks used. We also describe the logistics of crowdsourcing and the characteristics of crowd workers.

Methods: MEDLINE, EMBASE, and ClinicalTrials.gov were searched for available reports from inception to March 30, 2016, with no restriction on language or publication status.

Results: We identified 202 relevant studies that used crowdsourcing, including 9 randomized controlled trials, of which only one had posted results at ClinicalTrials.gov. Crowdsourcing was used in health promotion (91/202, 45.0%), research (73/202, 36.1%), and care (38/202, 18.8%). The 4 most frequent areas of application were public health (67/202, 33.2%), psychiatry (32/202, 15.8%), surgery (22/202, 10.9%), and oncology (14/202, 6.9%). Half of the reports involved data processing (99/202, 49.0%), followed by surveying (70/202, 34.6%), surveillance or monitoring (21/202, 10.4%), and problem solving (12/202, 5.9%). Labor market platforms (eg, Amazon Mechanical Turk) were used in most studies (190/202, 94.1%). Crowd workers' characteristics were poorly reported, and crowdsourcing logistics were missing from two-thirds of the reports. When reported, the median size of the crowd was 424 (first and third quartiles: 167-802), and the median age of crowd workers was 34 years (32-36). Crowd workers were mainly recruited nationally, particularly in the United States. Previous experience in crowdsourcing was required in many studies (119/202, 58.9%), whereas passing a qualification test or completing training was seldom required (24/202, 11.9%). Monetary incentives were mentioned in half of the studies, mostly less than US $1 per task. The time needed to perform the task was mostly less than 10 min (119/202, 58.9%). Data quality validation was used in 26.7% of studies (54/202), mainly through attention check questions or replication of the task by several crowd workers.

Conclusions: Crowdsourcing, which gives access to a large pool of participants while saving time in data collection, lowering costs, and speeding up innovation, is increasingly used in health promotion, research, and care. However, descriptions of crowdsourcing logistics and crowd workers' characteristics are frequently missing from study reports and need to be reported precisely so that study findings can be properly interpreted and replicated.

Keywords: crowdsourcing; health; review.

Conflict of interest statement

Conflicts of Interest: None declared.

Figures

Figure 1. Publication characteristics of included studies. Two-thirds of the studies were published in one of 18 medical specialty journals covering almost all medical fields, which shows the widespread use of crowdsourcing; some appeared in journals with a very high relative impact factor.
Figure 2. Examples of crowdsourced tasks according to health category. EEG: electroencephalography.
Figure 3. Mapping of crowdsourcing applications in health. Sankey diagram representing the distribution of medical fields applying crowdsourcing for each of the 4 types of task. Width of links is proportional to the number of studies. Medical specialties: anatomopathology (n=3), cardiology (n=5), dermatology (n=5), endocrinology (n=1), gynecology (n=2), infectiology (n=6), nephrology (n=1), neurology (n=7), pediatrics (n=2), pneumology (n=3), radiology (n=2), and rheumatology (n=2).


