Validation of a multi-source feedback tool for use in general practice

Educ Prim Care. 2010 May;21(3):165-79. doi: 10.1080/14739879.2010.11493902.

Abstract

Feedback from colleagues and patients is a core element of the revalidation process being developed by the General Medical Council. However, there are few feedback tools that have been specifically developed and validated for doctors in primary care. This paper presents data demonstrating the reliability and validity of one such tool. The CFEP360 tool combines feedback from the Colleague Feedback Evaluation Tool (CFET) and the Doctor's Interpersonal Skills Questionnaire (DISQ). The analysis of over 10 000 completed questionnaires presented here shows that colleague feedback is essentially two-dimensional (i.e. clinical and non-clinical skills) and that patient feedback is one-dimensional. However, items from both scales also effectively predict combined global ratings, indicating that colleagues and patients are identifying similar levels of performance as assessed by the feedback. Doctors who receive low feedback scores may require further attention, meaning the feedback potentially has diagnostic value. If CFEP360 is to be used for high-stakes performance evaluation and possible revalidation (generalisability coefficient G ≥ 0.80), this analysis indicates that reliable feedback requires 14 colleague responses and 25 patient responses, figures comparable to other MSF tools. For lower-stakes performance evaluations, such as personal development, responses from 11 colleagues and 16 patients will still return reliable results (G ≥ 0.70).
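The reported rater counts follow from a generalisability (reliability) analysis. As a rough illustration of how a target coefficient translates into a required number of responses, the sketch below applies the Spearman-Brown prophecy formula; the single-rater reliability value is an assumed input chosen purely for illustration, not a figure reported in the paper, and this is not necessarily the exact method the authors used.

    # Minimal sketch: Spearman-Brown prophecy formula relating an assumed
    # single-rater reliability to the number of responses needed to reach
    # a target generalisability coefficient G.
    import math

    def raters_needed(single_rater_reliability: float, target_g: float) -> int:
        """Smallest number of raters whose averaged score reaches target_g."""
        r, g = single_rater_reliability, target_g
        return math.ceil(g * (1 - r) / (r * (1 - g)))

    # Assumed single-rater reliability of 0.20, for illustration only.
    print(raters_needed(0.20, 0.80))  # 16 raters for a high-stakes threshold
    print(raters_needed(0.20, 0.70))  # 10 raters for a lower-stakes threshold

As the example shows, lowering the target coefficient from 0.80 to 0.70 substantially reduces the number of responses required, which is the pattern reflected in the paper's high-stakes versus lower-stakes recommendations.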

MeSH terms

  • Clinical Competence
  • Communication
  • Cooperative Behavior
  • Employee Performance Appraisal / methods*
  • Feedback*
  • Humans
  • Physicians, Family*
  • Reproducibility of Results
  • Surveys and Questionnaires*