Implementation of the mini-CEX to evaluate medical students' clinical skills

Acad Med. 2002 Nov;77(11):1156-7. doi: 10.1097/00001888-200211000-00021.


Objective: The majority of medicine clerkships use faculty and resident summative ratings to assess medical students' clinical skills. Still, many medical students complete training without ever being observed performing a clinical skill. The mini-CEX is a method of clinical skills assessment developed by the American Board of Internal Medicine for graduate medical education. The brief, focused encounters are feasible and produce scores with adequate reproducibility if enough observations are made.(1) The mini-CEX has been used in the medicine core clerkship, being performed once to augment feedback by faculty evaluators in the inpatient setting.(2) However, additional study is needed to address at least two feasibility issues if the mini-CEX is to be used as a measurement tool: (1) multiple settings (inpatient and outpatient) and (2) resident-completed evaluations. Our objective was to determine the feasibility of having students receive multiple mini-CEXs in both the inpatient and outpatient settings from resident and faculty evaluators.

Description: We introduced the mini-CEX into our nine-week medicine clerkship (six weeks inpatient and three weeks outpatient) in July 2001. The clerkship uses four inpatient clinical sites and 16 outpatient practices. Inpatient faculty rotate on two-week blocks and residents on four-week blocks. At our clerkship orientation, each student (n = 39) received a booklet of ten adapted mini-CEX forms. In the mini-CEX, students are observed conducting a focused history and physical examination and then receive immediate feedback. Students are rated on seven competencies (interviewing, physical examination, professionalism, clinical judgment, counseling, organization, and overall clinical competence) using a nine-point rating scale (1 = unsatisfactory and 9 = superior). Our students were instructed to collect nine evaluations: three from inpatient faculty (one every two weeks), three from residents (one every two weeks), and three from their outpatient attendings (one per week). Students and evaluators were asked to rate their satisfaction with the exercise using a nine-point scale (1 = low and 9 = high). Students were asked to turn in their booklets the day of the exam. Prior to implementation, we reviewed the mini-CEX forms and the rationale for their use with residents and inpatient faculty. Similar information was mailed to outpatient faculty preceptors.

Discussion: Booklets were received from 32 students. The mean number of evaluations completed per student was 7.3 (range 2-9), for a total of 232 evaluations. Faculty completed 58% of the evaluations, and 68% of the evaluations came from the inpatient setting. The observation and feedback took an average of 21 minutes and 8 minutes, respectively. Mean satisfaction with the exercise was 7.2 among faculty/residents and 6.8 among students. We believe these findings support the feasibility of collecting multiple mini-CEX assessments from both inpatient and outpatient sites using faculty and resident evaluators. The feasibility of collecting multiple assessments is important if the mini-CEX is to be a reproducible assessment of clinical skills. Having established feasibility, we plan to examine the reproducibility and validity of mini-CEX scores to determine whether the mini-CEX can be used as a formal means of clinical skills assessment. We also plan to evaluate its impact on the quality and specificity of end-of-clerkship summative ratings.

MeSH terms

  • Clinical Competence*
  • Education, Medical, Graduate*
  • Evaluation Studies as Topic
  • Feasibility Studies