Regenerative Analysis and Steady State Distributions for Markov Chains
Winfried K. Grassmann,
Michael I. Taksar and
Daniel P. Heyman
Additional contact information
Winfried K. Grassmann: University of Saskatchewan, Saskatoon, Saskatchewan
Michael I. Taksar: Florida State University, Tallahassee, Florida
Daniel P. Heyman: Bell Communications Research, Holmdel, New Jersey
Operations Research, 1985, vol. 33, issue 5, 1107-1116
Abstract:
We apply regenerative theory to derive certain relations between steady state probabilities of a Markov chain. These relations are then used to develop a numerical algorithm to find these probabilities. The algorithm is a modification of the Gauss-Jordan method, in which all elements used in numerical computations are nonnegative; as a consequence, the algorithm is numerically stable.
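The elimination scheme described in the abstract is the procedure now commonly known as the GTH (Grassmann-Taksar-Heyman) algorithm. The sketch below, in Python, illustrates one way the reduction and back-substitution phases can be organized; the function name and the use of NumPy are illustrative rather than taken from the paper, and the code assumes an irreducible, row-stochastic transition matrix P.

    import numpy as np

    def gth_steady_state(P):
        # Steady-state vector of an irreducible finite Markov chain via
        # GTH-style state reduction. Only nonnegative quantities are added,
        # multiplied, or divided (no subtractions), which is the source of
        # the numerical stability claimed in the abstract.
        A = np.array(P, dtype=float)
        n = A.shape[0]

        # Reduction phase: censor out states n-1, n-2, ..., 1 in turn.
        for k in range(n - 1, 0, -1):
            s = A[k, :k].sum()            # probability of leaving state k to a lower state
            A[:k, k] /= s                 # chance of entering k and then moving to some j < k
            A[:k, :k] += np.outer(A[:k, k], A[k, :k])

        # Back-substitution phase: unnormalized stationary weights.
        pi = np.zeros(n)
        pi[0] = 1.0
        for j in range(1, n):
            pi[j] = sum(pi[i] * A[i, j] for i in range(j))

        return pi / pi.sum()

    # Hypothetical usage: a small 3-state chain; the result pi satisfies pi @ P = pi.
    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])
    print(gth_steady_state(P))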
Keywords: 567 regenerative analysis; 692 imbedded Markov process
Date: 1985
Downloads: http://dx.doi.org/10.1287/opre.33.5.1107 (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:inm:oropre:v:33:y:1985:i:5:p:1107-1116