Using a Markov Chain to Construct a Tractable Approximation of an Intractable Probability Distribution
James P. Hobert,
Galin L. Jones and
Christian P. Robert
Scandinavian Journal of Statistics, 2006, vol. 33, issue 1, 37-51
Abstract:
Let π denote an intractable probability distribution that we would like to explore. Suppose that we have a positive recurrent, irreducible Markov chain that satisfies a minorization condition and has π as its invariant measure. We provide a method of using simulations from the Markov chain to construct a statistical estimate of π from which it is straightforward to sample. We show that this estimate is ‘strongly consistent’ in the sense that the total variation distance between the estimate and π converges to 0 almost surely as the number of simulations grows. Moreover, we use some recently developed asymptotic results to provide guidance as to how much simulation is necessary. Draws from the estimate can be used to approximate features of π or as intelligent starting values for the original Markov chain. We illustrate our methods with two examples.
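The following Python sketch is only an illustration of the general recipe the abstract describes, not the estimator constructed in the paper. It assumes a setting not stated here: an independence Metropolis–Hastings chain targeting π = Exp(1) with an Exp(1/2) proposal, whose bounded importance weights w(y) = π(y)/q(y) ≤ M = 2 yield the minorization P(x, dy) ≥ (1/M) π(dy). Regeneration times are then flagged retrospectively via the usual splitting device, and a crude empirical approximation of π is built from the complete regenerative tours.

```python
# Hedged, illustrative sketch only (assumed example, not the paper's method):
# identify regenerations of an independence MH chain via a minorization
# condition, then resample from the states in complete tours as a simple,
# easy-to-sample approximation of the target pi = Exp(1).
import numpy as np

rng = np.random.default_rng(0)

M = 2.0                                   # bound on w(y) = pi(y)/q(y)
log_w = lambda y: np.log(2.0) - 0.5 * y   # log weight for pi = Exp(1), q = Exp(1/2)

def run_split_chain(n_steps, x0=1.0):
    """Run the chain and mark regeneration times implied by the minorization."""
    x = x0
    states, regen = [x], [False]
    for _ in range(n_steps):
        y = rng.exponential(scale=2.0)                    # proposal draw from q
        accept = np.log(rng.uniform()) < log_w(y) - log_w(x)
        is_regen = False
        if accept:
            # retrospective regeneration probability max(w(x), w(y)) / M
            r = np.exp(max(log_w(x), log_w(y))) / M
            is_regen = rng.uniform() < r
            x = y
        states.append(x)
        regen.append(is_regen)
    return np.array(states), np.array(regen)

states, regen = run_split_chain(50_000)
regen_idx = np.flatnonzero(regen)

# Keep only states lying in complete tours (first to last regeneration);
# their empirical distribution is a simple approximation of pi that is
# trivial to resample from.
tour_states = states[regen_idx[0]:regen_idx[-1]]
approx_draws = rng.choice(tour_states, size=1000, replace=True)

print("number of regenerations:", regen_idx.size)
print("mean of approximate draws (target 1.0):", approx_draws.mean())
```

The paper's estimate is a smoother object built from the minorization measure and the simulated tours, and comes with the strong-consistency guarantee stated above; the sketch only conveys how a minorization condition turns Markov chain output into something one can resample from directly.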
Date: 2006
Downloads: (external link)
https://doi.org/10.1111/j.1467-9469.2006.00467.x
Persistent link: https://EconPapers.repec.org/RePEc:bla:scjsta:v:33:y:2006:i:1:p:37-51
Ordering information: This journal article can be ordered from
http://www.blackwell ... bs.asp?ref=0303-6898
Scandinavian Journal of Statistics is currently edited by Ørnulf Borgan and Bo Lindqvist
More articles in Scandinavian Journal of Statistics from Danish Society for Theoretical Statistics, Finnish Statistical Society, Norwegian Statistical Association, Swedish Statistical Association
Bibliographic data for series maintained by Wiley Content Delivery.