Context: Evidence-based practice (EBP) is the integration of the best research evidence with patients' values and clinical circumstances in clinical decision making. The teaching of EBP should itself be evaluated and guided by evidence of its effectiveness.
Objective: To appraise, summarize, and describe currently available EBP teaching evaluation instruments.
Data sources and study selection: We searched the MEDLINE, EMBASE, CINAHL, HAPI, and ERIC databases; reference lists of retrieved articles; EBP Internet sites; and 8 education journals from 1980 through April 2006. For inclusion, studies had to report an instrument evaluating EBP, contain sufficient description to permit analysis, and present quantitative results of administering the instrument.
Data extraction: Two raters independently abstracted information on the development, format, learner levels, evaluation domains, feasibility, reliability, and validity of the EBP evaluation instruments from each article. We defined 3 levels of instruments based on the type, extent, methods, and results of psychometric testing and suitability for different evaluation purposes.
Data synthesis: Of 347 articles identified, 115 were included, representing 104 unique instruments. The instruments were most commonly administered to medical students and postgraduate trainees and evaluated EBP skills. Among EBP skills, acquiring evidence and appraising evidence were the most commonly evaluated, but newer instruments evaluated asking answerable questions and applying evidence to individual patients. Most behavior instruments measured the performance of EBP steps in practice, but newer instruments documented the performance of evidence-based clinical maneuvers or patient-level outcomes. At least 1 type of validity evidence was demonstrated for 53% of instruments, but 3 or more types of validity evidence were established for only 10%. High-quality instruments were identified for evaluating the EBP competence of individual trainees, determining the effectiveness of EBP curricula, and assessing EBP behaviors with objective outcome measures.
Conclusions: Instruments with reasonable validity are available for evaluating some domains of EBP and may be targeted to different evaluation needs. Further development and testing are required to evaluate EBP attitudes, behaviors, and more recently articulated EBP skills.