Bayesian Multimodel Inference by RJMCMC: A Gibbs Sampling Approach
Richard J. Barker and William A. Link
The American Statistician, 2013, vol. 67, issue 3, 150-156
Abstract:
Bayesian multimodel inference treats a set of candidate models as the sample space of a latent categorical random variable, sampled once; the data at hand are modeled as having been generated according to the sampled model. Model selection and model averaging are based on the posterior probabilities for the model set. Reversible-jump Markov chain Monte Carlo (RJMCMC) extends ordinary MCMC methods to this meta-model. We describe a version of RJMCMC that intuitively represents the process as Gibbs sampling with alternating updates of a categorical variable M (for Model) and a "palette" of parameters, from which any of the model-specific parameters can be calculated. Our representation makes plain how model-specific Monte Carlo outputs (analytical or numerical) can be post-processed to compute model weights or Bayes factors. We illustrate the procedure with several examples.
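The alternating scheme described in the abstract can be sketched for a toy two-model problem. This is an illustrative example, not code from the paper: the data, models, and prior settings below are assumptions chosen for simplicity. Two models for a normal mean (a point null versus a normal prior) share a single scalar "palette" parameter mu, which stays defined under both models; under the null it is refreshed from a pseudo-prior (here taken equal to the M = 1 prior, so its density cancels in the model update).

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative data (an assumption, not from the paper): unit-variance normals
y = rng.normal(0.3, 1.0, size=20)
n, ybar = y.size, y.mean()

# Candidate models for the mean of y (variance known, equal to 1):
#   M = 0: mu fixed at 0
#   M = 1: mu ~ Normal(0, tau2)
tau2 = 1.0
log_prior_M = np.log([0.5, 0.5])

def loglik(mu):
    # Log-likelihood up to a constant shared by both models
    return -0.5 * np.sum((y - mu) ** 2)

# Gibbs sampler alternating updates of M and the palette parameter mu.
M, mu, keep = 0, 0.0, []
for _ in range(5000):
    # 1) Update M | mu, y: a categorical draw over the model set.
    #    The mu-density terms cancel because pseudo-prior = prior.
    logw = log_prior_M + np.array([loglik(0.0), loglik(mu)])
    w = np.exp(logw - logw.max())
    M = rng.choice(2, p=w / w.sum())
    # 2) Update mu | M, y
    if M == 1:
        # Conjugate normal posterior for mu under model 1
        prec = n + 1.0 / tau2
        mu = rng.normal(n * ybar / prec, np.sqrt(1.0 / prec))
    else:
        # Pseudo-prior draw keeps the palette defined under model 0
        mu = rng.normal(0.0, np.sqrt(tau2))
    keep.append(M)

# Posterior model probability estimated by the relative frequency of M = 1
post_M1 = np.mean(keep)
print(f"Estimated P(M=1 | y) = {post_M1:.3f}")
```

The recorded frequencies of M give the model weights directly, and their ratio (adjusted for the prior model probabilities) estimates the Bayes factor; in this conjugate toy problem the answer can also be checked analytically.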
Date: 2013
Full text (restricted to subscribers): http://hdl.handle.net/10.1080/00031305.2013.791644
Persistent link: https://EconPapers.repec.org/RePEc:taf:amstat:v:67:y:2013:i:3:p:150-156
DOI: 10.1080/00031305.2013.791644
The American Statistician is currently edited by Eric Sampson