Relative Entropy as a Measure of Diagnostic Information

William A. Benish

Medical Decision Making, 1999, vol. 19, issue 2, 202-206

Abstract: Relative entropy is a concept within information theory that provides a measure of the distance between two probability distributions. The author proposes that the amount of information gained by performing a diagnostic test can be quantified by calculating the relative entropy between the posttest and pretest probability distributions. This statistic, in essence, quantifies the degree to which the results of a diagnostic test are likely to reduce our surprise upon ultimately learning a patient's diagnosis. A previously proposed measure of diagnostic information that is also based on information theory (pretest entropy minus posttest entropy) has been criticized as failing, in some cases, to agree with our intuitive concept of diagnostic information. The proposed formula passes the tests used to challenge this previous measure.

Key words: diagnostic test; entropy; information theory; relative entropy; uncertainty. (Med Decis Making 1999;19:202-206)
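The abstract does not reproduce the formula itself, but relative entropy (the Kullback-Leibler divergence) between discrete distributions P and Q is conventionally defined as D(P||Q) = sum_i p_i log2(p_i / q_i), measured in bits when the logarithm is base 2. A minimal Python sketch of the proposed measure follows; the two-diagnosis pretest and posttest probabilities are hypothetical numbers chosen purely for illustration, not taken from the paper:

    import math

    def relative_entropy(posttest, pretest):
        # D(posttest || pretest) = sum_i p_i * log2(p_i / q_i), in bits.
        # Terms with p_i == 0 contribute nothing, by the usual convention.
        return sum(p * math.log2(p / q)
                   for p, q in zip(posttest, pretest)
                   if p > 0)

    # Hypothetical example: a test result revises a pretest distribution
    # of (0.5, 0.5) over two diagnoses to a posttest distribution of (0.9, 0.1).
    pretest = [0.5, 0.5]
    posttest = [0.9, 0.1]
    print(relative_entropy(posttest, pretest))  # ~0.531 bits gained

Relative entropy is nonnegative for any pair of distributions, whereas pretest entropy minus posttest entropy can be negative for an individual test result; this is one way the proposed measure can avoid the kind of counterintuitive behavior the abstract attributes to the earlier measure.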

Date: 1999

Downloads: https://journals.sagepub.com/doi/10.1177/0272989X9901900211 (text/html)


Persistent link: https://EconPapers.repec.org/RePEc:sae:medema:v:19:y:1999:i:2:p:202-206

DOI: 10.1177/0272989X9901900211


 
Handle: RePEc:sae:medema:v:19:y:1999:i:2:p:202-206