On Sequential Decisions and Markov Chains
Cyrus Derman
Cyrus Derman: Columbia University and Technion, Israel Institute of Technology
Management Science, 1962, vol. 9, issue 1, 16-24
Abstract:
Several problems in the optimal control of dynamic systems are considered. When observed, a system is classifiable into one of a finite number of states and is controlled by making one of a finite number of decisions. The sequence of observed states is a stochastic process dependent upon the sequence of decisions, in that the decisions determine the probability laws that operate on the system. Costs are associated with the sequence of states and decisions. It is shown that, for the problems considered, the optimal rules for controlling the system belong to a subclass of all possible rules and that, within this subclass, the optimal rules can be derived by solving linear programming problems.
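For readers who want a concrete sense of the linear-programming approach the abstract refers to, the sketch below sets up the standard state-action-frequency LP for a small finite Markov decision problem under the long-run average-cost criterion and solves it with SciPy. The two-state, two-action example, the transition probabilities P, and the costs C are hypothetical numbers chosen purely for illustration, not taken from the paper, and the LP shown is the commonly used formulation in this line of work; it may differ in detail from the one derived in the article.

```python
# Hedged sketch: state-action-frequency LP for a finite average-cost
# Markov decision problem. All numbers below are made up for illustration.
import numpy as np
from scipy.optimize import linprog

# P[i, k, j] = Pr(next state j | current state i, decision k)
P = np.array([[[0.9, 0.1],    # state 0, decision 0
               [0.4, 0.6]],   # state 0, decision 1
              [[0.2, 0.8],    # state 1, decision 0
               [0.7, 0.3]]])  # state 1, decision 1
# C[i, k] = cost incurred when decision k is made in state i
C = np.array([[2.0, 1.0],
              [3.0, 0.5]])

S, A = C.shape
n = S * A                      # one variable x[i, k] per state-action pair

# Objective: minimize the long-run average cost sum_{i,k} C[i,k] * x[i,k]
c_obj = C.reshape(n)

# Balance constraints: for every state j,
#   sum_k x[j,k] - sum_{i,k} x[i,k] * P[i,k,j] = 0
A_eq = np.zeros((S + 1, n))
for j in range(S):
    for i in range(S):
        for k in range(A):
            A_eq[j, i * A + k] = (1.0 if i == j else 0.0) - P[i, k, j]
# Normalization: the x[i,k] form a probability distribution over state-action pairs
A_eq[S, :] = 1.0
b_eq = np.zeros(S + 1)
b_eq[S] = 1.0

res = linprog(c_obj, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * n, method="highs")
x = res.x.reshape(S, A)

# Recover a stationary (possibly randomized) rule:
# in state i, choose decision k with probability x[i,k] / sum_k' x[i,k']
with np.errstate(invalid="ignore", divide="ignore"):
    policy = x / x.sum(axis=1, keepdims=True)

print("minimal average cost:", res.fun)
print("stationary decision rule:\n", policy)
```

Because the LP has at most S linearly independent equality constraints, a basic optimal solution has at most one positive x[i, k] per recurrent state, which is consistent with the abstract's point that attention can be restricted to a subclass of stationary rules.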
Date: 1962
Citations: 26 (in EconPapers)
Downloads: http://dx.doi.org/10.1287/mnsc.9.1.16 (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:inm:ormnsc:v:9:y:1962:i:1:p:16-24