Relative entropy as a measure of diagnostic information

Med Decis Making. 1999 Apr-Jun;19(2):202-6. doi: 10.1177/0272989X9901900211.

Abstract

Relative entropy is a concept within information theory that provides a measure of the distance between two probability distributions. The author proposes that the amount of information gained by performing a diagnostic test can be quantified by calculating the relative entropy between the posttest and pretest probability distributions. This statistic, in essence, quantifies the degree to which the results of a diagnostic test are likely to reduce our surprise upon ultimately learning a patient's diagnosis. A previously proposed measure of diagnostic information that is also based on information theory (pretest entropy minus posttest entropy) has been criticized as failing, in some cases, to agree with our intuitive concept of diagnostic information. The proposed formula passes the tests used to challenge this previous measure.
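To make the proposed measure concrete, the sketch below is a hypothetical illustration (not taken from the article): it computes the relative entropy D(P_post || P_pre) = Σ_d P_post(d) · log[P_post(d)/P_pre(d)] between the posttest and pretest distributions for a binary diagnosis, with the pretest probability, sensitivity, and specificity chosen arbitrarily for the example.

```python
# Hypothetical illustration of the proposed measure: relative entropy
# (Kullback-Leibler divergence) between posttest and pretest distributions
# for a binary (disease / no disease) diagnosis. The numeric values are
# assumptions for the example, not values from the paper.
import math

def relative_entropy_bits(post, pre):
    """D(post || pre) in bits; both arguments are probability distributions."""
    return sum(p * math.log2(p / q) for p, q in zip(post, pre) if p > 0)

def posttest_probability(pretest, sensitivity, specificity, positive_result=True):
    """Bayes' rule for a dichotomous test result and a binary disease state."""
    if positive_result:
        p_result_disease = sensitivity
        p_result_healthy = 1 - specificity
    else:
        p_result_disease = 1 - sensitivity
        p_result_healthy = specificity
    numerator = p_result_disease * pretest
    return numerator / (numerator + p_result_healthy * (1 - pretest))

# Assumed example values.
pretest = 0.10
sensitivity, specificity = 0.90, 0.80

post_pos = posttest_probability(pretest, sensitivity, specificity, positive_result=True)
info_gain = relative_entropy_bits([post_pos, 1 - post_pos], [pretest, 1 - pretest])
print(f"posttest probability after a positive result: {post_pos:.3f}")
print(f"relative entropy gained from the positive result: {info_gain:.3f} bits")
```

With these assumed numbers, a positive result moves the probability of disease from 0.10 to about 0.33, and the relative entropy of the posttest distribution with respect to the pretest distribution is roughly 0.29 bits, which is the quantity the author proposes as the information provided by that test result.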

MeSH terms

  • Clinical Medicine
  • Diagnosis*
  • Humans
  • Information Theory*
  • Models, Statistical*
  • Nonlinear Dynamics*
  • Probability*
  • Reproducibility of Results
  • Sensitivity and Specificity
  • Statistical Distributions*