Validation of a scenario-based assessment of critical thinking using an externally validated tool

J Vet Med Educ. 2012 Fall;39(3):276-82. doi: 10.3138/jvme.0112-009R.

Abstract

With medical education transitioning from knowledge-based to competency-based curricula, critical thinking skills have emerged as a major competency. While validated external instruments for assessing critical thinking exist, many educators have created their own custom assessments of critical thinking; however, the face validity of these assessments has not been challenged. The purpose of this study was to compare results from a custom assessment of critical thinking with results from a validated external instrument of critical thinking. Students from the College of Veterinary Medicine at Western University of Health Sciences were administered a custom assessment of critical thinking (ACT) examination and an externally validated instrument, the California Critical Thinking Skills Test (CCTST), in the spring of 2011. Total scores and sub-scores from each exam were analyzed for significant correlations using Pearson correlation coefficients. Significant correlations were found between the ACT Bloom's 2 sub-score and CCTST deductive reasoning, and between the total ACT score and deductive reasoning, with correlation coefficients of 0.24 and 0.22, respectively. No other statistically significant correlations were found. The overall lack of significant correlation between the two examinations illustrates the need in medical education to externally validate internal custom assessments. Ultimately, the development and validation of custom assessments of non-knowledge-based competencies will produce higher-quality medical professionals.
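The correlations reported above are Pearson coefficients. As a minimal illustrative sketch (not the authors' analysis code, and using hypothetical score data rather than the study's), the coefficient for paired total scores can be computed as:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    if len(xs) != len(ys) or len(xs) < 2:
        raise ValueError("need two equal-length samples of at least 2 values")
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance numerator and the two standard-deviation terms
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sy = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired exam scores for a handful of students
act_scores = [62, 70, 75, 81, 88]
cctst_scores = [15, 18, 17, 21, 24]
print(round(pearson_r(act_scores, cctst_scores), 2))
```

A value near +1 or -1 indicates a strong linear relationship; coefficients of 0.22-0.24, as reported in the study, indicate only a weak association even where statistically significant.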

Publication types

  • Validation Study

MeSH terms

  • California
  • Curriculum / standards
  • Education, Veterinary / methods*
  • Educational Measurement / methods*
  • Humans
  • Reproducibility of Results
  • Students, Health Occupations / psychology*
  • Thinking*