Background: Quality improvement programs that allow physicians to document medical reasons for deviating from guidelines preserve clinicians' judgment while enabling them to strive for high performance. However, physician misconceptions or gaming may limit such programs.
Objective: To implement computerized decision support with mechanisms for documenting medical exceptions to quality measures, to peer review those exceptions, and to provide feedback when appropriate.
Design: Observational study.
Setting: Large internal medicine practice.
Participants: Patients eligible for 1 or more quality measures.
Measurements: A peer-review panel judged medical exceptions to 16 chronic disease and prevention quality measures as appropriate, inappropriate, or of uncertain appropriateness. Medical records were reviewed after feedback was given to determine whether care changed.
Results: Physicians recorded 650 standardized medical exceptions during 7 months. The reporting tool was used without any medical reason 36 times (5.5%). Of the remaining 614 exceptions, 93.6% were medically appropriate, 3.1% were inappropriate, and 3.3% were of uncertain appropriateness. Inappropriate exceptions numbered 7 (6.9%) for coronary heart disease, 0 (0%) for heart failure, 10 (10.8%) for diabetes, and 2 (0.6%) for preventive services. After receiving direct feedback about inappropriate exceptions, physicians changed management in 8 of 19 cases (42%). The peer-review process took less than 5 minutes per case, but 65 reviews were required for each resulting change in clinical care.
Limitation: The findings could differ at other sites or if financial incentives were in place.
Conclusion: Physician-recorded medical exceptions were medically appropriate most of the time. Peer review of medical exceptions can identify myths and misconceptions, but the process needs to be more efficient to be sustainable.
Primary funding source: Agency for Healthcare Research and Quality.