Objective: To answer the question, "does CME work?" by reviewing the effectiveness of continuing medical education (CME) and other related educational methods on objectively determined physician performance and/or health care outcomes. These interventions include educational materials; formal, planned CME activities or programs; outreach visits such as academic detailing; opinion leaders; patient-mediated strategies; audit and feedback; reminders; or a combination of these strategies.
Methods: MEDLINE, ERIC, NTIS, the Research and Development Resource Base in CME, and other relevant data sources, including review articles, were searched for relevant terms from 1975 to 1994. Of the articles retrieved, randomized controlled trials of educational strategies or interventions that objectively assessed physician performance and/or health care outcomes were selected for review. Data were extracted from each article on the specialty of the physicians targeted, the clinical subject of the intervention, the setting and nature of the educational method, and the presence or degree of needs assessment or attention to barriers to change.
Results: More than two-thirds of the studies (70%) displayed a change in physician performance, while almost half (48%) of the interventions produced a change in health care outcomes. Community-based strategies such as academic detailing (and, to a lesser extent, opinion leaders), practice-based methods such as reminders and patient-mediated strategies, and multiple interventions appeared to be the most effective activities. Audit and educational materials demonstrated mixed results and weaker outcomes, while formal CME conferences without enabling or practice-reinforcing strategies had relatively little impact.
Conclusion: Strategies that enable and/or reinforce change appear to "work" in altering physician performance or health care outcomes, a finding with significant implications for the delivery of CME and for the need for further research into physician learning and change.