Direct observation has demonstrated considerable power in the reliable and valid measurement of human behavior, and a variety of direct observation methodologies have been applied to physician-patient interactions to answer different types of questions. This study describes the development and evaluation of a 20-item direct observation scale for physician-patient interactions, the Davis Observation Code (DOC). The study compared the rates of occurrence of four key physician behaviors measurable by both direct observation and chart audit: disease prevention, health education, health promotion, and compliance checking. Forty-nine videotaped physician-patient interactions were independently analyzed using the DOC, and the medical record of each videotaped encounter was also reviewed. Reliability, determined by inter-rater agreement on the presence or absence of each target behavior, was acceptable for both direct observation and chart audit. Rates of occurrence of each target behavior differed between the two methods of review; chart audit consistently yielded lower rates. Nonparametric correlation analyses yielded phi coefficients ranging from .12 to .49, suggesting low concurrent validity. Most of the discordance between videotaped observation and chart audit involved underreporting in the chart of physician behavior that was observed on videotape. Implications of the findings for health care delivery research are discussed.