Written case simulations are often used to investigate physicians' decision making and clinical competence. Their use rests on the assumption that physicians' responses to written simulations closely agree with their responses to actual clinical encounters, yet this assumption of criterion validity has received little attention. To determine the ability of written case simulations to predict actual clinical behavior, we applied methodologic criteria to published articles that used written simulations. Only 11 (15%) of 74 articles included an assessment of the criterion validity of their written case simulations. Only 2 of those 11 studies were designed and executed in such a way that criterion validity could be fully interpreted. No clear consensus emerged from examination of the 11 studies regarding how well responses to written case simulations perform as proxy measures of actual behavior. More work is needed before it can be assumed that written case simulations measure actual behavior.