EconPapers

Markov Decision Processes: A Tool for Sequential Decision Making under Uncertainty

Oguzhan Alagoz, Heather Hsu, Andrew J. Schaefer and Mark S. Roberts
Additional contact information
Oguzhan Alagoz: Department of Industrial and Systems Engineering, University of Wisconsin-Madison, Madison, WI, alagoz@engr.wisc.edu
Heather Hsu: Department of Industrial Engineering, University of Pittsburgh, Pittsburgh, PA, Section of Decision Sciences and Clinical Systems Modeling, Division of General Medicine, and Department of Medicine, University of Pittsburgh School of Medicine, Pittsburgh, PA
Andrew J. Schaefer: Department of Industrial Engineering, University of Pittsburgh, Pittsburgh, PA
Mark S. Roberts: Department of Health Policy and Management, University of Pittsburgh Graduate School of Public Health, Pittsburgh, PA, Section of Decision Sciences and Clinical Systems Modeling, Division of General Medicine, and Department of Medicine, University of Pittsburgh School of Medicine, Pittsburgh, PA

Medical Decision Making, 2010, vol. 30, issue 4, 474-483

Abstract: We provide a tutorial on the construction and evaluation of Markov decision processes (MDPs), which are powerful analytical tools used for sequential decision making under uncertainty that have been widely used in many industrial and manufacturing applications but are underutilized in medical decision making (MDM). We demonstrate the use of an MDP to solve a sequential clinical treatment problem under uncertainty. Markov decision processes generalize standard Markov models in that a decision process is embedded in the model and multiple decisions are made over time. Furthermore, they have significant advantages over standard decision analysis. We compare MDPs to standard Markov-based simulation models by solving the problem of the optimal timing of living-donor liver transplantation using both methods. Both models result in the same optimal transplantation policy and the same total life expectancies for the same patient and living donor. The computation time for solving the MDP model is significantly smaller than that for solving the Markov model. We briefly describe the growing literature of MDPs applied to medical decisions.
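The abstract describes formulating a treatment decision as an MDP and solving it for an optimal policy. As a concrete illustration of that machinery, the sketch below solves a tiny hypothetical two-state "wait vs. transplant" MDP by value iteration. All numbers (states, transition probabilities, rewards, discount factor) are invented for illustration and are not taken from the article's liver-transplantation model.

```python
# Minimal value-iteration sketch for a toy "wait vs. transplant" MDP.
# Every number below is a hypothetical placeholder, NOT data from the article.

GAMMA = 0.97   # assumed per-period discount factor
TOL = 1e-9     # convergence tolerance for value iteration

STATES = ["good", "poor"]             # hypothetical patient health states
ACTIONS = ["wait", "transplant"]

# P[s][a]: list of (next_state, probability). "dead" is absorbing (value 0).
# "transplant" ends the decision process, so it has no onward transitions.
P = {
    "good": {"wait": [("good", 0.8), ("poor", 0.2)], "transplant": []},
    "poor": {"wait": [("poor", 0.6), ("dead", 0.4)], "transplant": []},
}
# R[s][a]: immediate reward, e.g. one period of life for waiting, or a
# lump-sum post-transplant life expectancy (values are made up).
R = {
    "good": {"wait": 1.0, "transplant": 10.0},
    "poor": {"wait": 0.8, "transplant": 8.0},
}

def value_iteration(states, actions, P, R, gamma=GAMMA, tol=TOL):
    """Return optimal values V and a greedy optimal policy for a finite MDP."""
    V = {s: 0.0 for s in states}
    V["dead"] = 0.0                   # absorbing state contributes no value
    while True:
        V_new = dict(V)
        for s in states:
            # Bellman update: best action over immediate + discounted future reward
            V_new[s] = max(
                R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])
                for a in actions
            )
        if max(abs(V_new[s] - V[s]) for s in states) < tol:
            V = V_new
            break
        V = V_new
    # Recover the optimal policy by a final greedy pass over the converged values.
    policy = {
        s: max(actions,
               key=lambda a: R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a]))
        for s in states
    }
    return V, policy

V, policy = value_iteration(STATES, ACTIONS, P, R)
```

With these invented numbers the solver happens to recommend waiting in the "good" state and transplanting in the "poor" state, which shows the kind of state-dependent timing policy an MDP produces; the article's actual model, data, and results are of course far richer.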

Keywords: Markov decision processes; decision analysis; Markov processes
Date: 2010
Citations: 13 (tracked in EconPapers)

Downloads: https://journals.sagepub.com/doi/10.1177/0272989X09353194 (text/html)


Persistent link: https://EconPapers.repec.org/RePEc:sae:medema:v:30:y:2010:i:4:p:474-483

DOI: 10.1177/0272989X09353194


More articles in Medical Decision Making. Bibliographic data for this series is maintained by SAGE Publications.

 
Page updated 2025-03-19
Handle: RePEc:sae:medema:v:30:y:2010:i:4:p:474-483