Markov Models in Medical Decision Making
Frank A. Sonnenberg and
J. Robert Beck
Medical Decision Making, 1993, vol. 13, issue 4, 322-338
Abstract:
Markov models are useful when a decision problem involves risk that is continuous over time, when the timing of events is important, and when important events may happen more than once. Representing such clinical settings with conventional decision trees is difficult and may require unrealistic simplifying assumptions. Markov models assume that a patient is always in one of a finite number of discrete health states, called Markov states. All events are represented as transitions from one state to another. A Markov model may be evaluated by matrix algebra, as a cohort simulation, or as a Monte Carlo simulation. A newer representation of Markov models, the Markov-cycle tree, uses a tree representation of clinical events and may be evaluated either as a cohort simulation or as a Monte Carlo simulation. The ability of the Markov model to represent repetitive events and the time dependence of both probabilities and utilities allows for more accurate representation of clinical settings that involve these issues. Key words: Markov models; Markov-cycle decision tree; decision making. (Med Decis Making 1993;13:322-338)
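The cohort-simulation evaluation mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the article's own model: the three states (Well, Sick, Dead), the transition probabilities, and the per-cycle utilities are all hypothetical values chosen for demonstration.

```python
# Cohort simulation of a simple three-state Markov model.
# All state names, probabilities, and utilities below are illustrative
# assumptions, not values from the article.

STATES = ["Well", "Sick", "Dead"]

# TRANSITION[i][j] = probability of moving from state i to state j per cycle.
# "Dead" is an absorbing state: once entered, it is never left.
TRANSITION = [
    [0.90, 0.07, 0.03],  # Well -> Well / Sick / Dead
    [0.00, 0.80, 0.20],  # Sick -> Sick / Dead (no recovery assumed)
    [0.00, 0.00, 1.00],  # Dead -> Dead
]

UTILITY = [1.0, 0.6, 0.0]  # quality weight credited per cycle in each state

def cohort_simulation(n_cycles: int, cohort: float = 1.0):
    """Redistribute the cohort across states cycle by cycle and
    accumulate quality-adjusted cycles (cumulative utility)."""
    dist = [cohort, 0.0, 0.0]  # the entire cohort starts in "Well"
    total_utility = 0.0
    for _ in range(n_cycles):
        # credit utility for the fraction of the cohort in each state
        total_utility += sum(d * u for d, u in zip(dist, UTILITY))
        # apply one cycle of transitions to the whole cohort at once
        dist = [sum(dist[i] * TRANSITION[i][j] for i in range(3))
                for j in range(3)]
    return dist, total_utility

final_dist, qale = cohort_simulation(50)
```

In a cohort simulation the model tracks the fraction of a hypothetical cohort occupying each state, so one pass of arithmetic per cycle replaces the per-patient random draws a Monte Carlo evaluation would require; the matrix-algebra evaluation the abstract also mentions amounts to raising the same transition matrix to a power.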
Date: 1993
Downloads: https://journals.sagepub.com/doi/10.1177/0272989X9301300409 (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:sae:medema:v:13:y:1993:i:4:p:322-338
DOI: 10.1177/0272989X9301300409