How to overcome the Jeffreys-Lindleys Paradox for Invariant Bayesian Inference in Regression Models
Frank Kleibergen
No 01-073/4, Tinbergen Institute Discussion Papers from Tinbergen Institute
Abstract:
We obtain invariant expressions for prior probabilities and priors on the parameters of nested regression models that are induced by a prior on the parameters of an encompassing linear regression model. The invariance is with respect to specifications that satisfy a necessary set of assumptions. Invariant expressions for posterior probabilities and posteriors are induced in an identical way by the respective posterior. These posterior probabilities imply a posterior odds ratio that is robust to the Jeffreys-Lindleys paradox. This robustness results because the prior odds ratio obtained from the induced prior probabilities corrects the Bayes factor for the plausibility of the competing models reflected in the prior. We illustrate the analysis, focusing on the construction of specifications that satisfy the set of assumptions, with examples of linear restrictions, i.e. a linear regression model, and non-linear restrictions, i.e. a cointegration and an ARMA(1,1) model, on the parameters of an encompassing linear regression model.
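For orientation, a minimal sketch of the standard posterior odds decomposition the abstract refers to; the notation (models M_1, M_2, priors \pi, Bayes factor B_{12}) is ours for illustration and is not taken from the paper:

\[
\frac{\Pr(M_1 \mid y)}{\Pr(M_2 \mid y)}
= \underbrace{\frac{\Pr(M_1)}{\Pr(M_2)}}_{\text{prior odds}}
\times
\underbrace{\frac{\int p(y \mid \theta_1, M_1)\,\pi(\theta_1 \mid M_1)\,d\theta_1}
                 {\int p(y \mid \theta_2, M_2)\,\pi(\theta_2 \mid M_2)\,d\theta_2}}_{\text{Bayes factor } B_{12}}.
\]

The Jeffreys-Lindleys paradox arises when the prior on the additional parameters of the encompassing model is made increasingly diffuse: the Bayes factor B_{12} then favors the nested model regardless of the data. In the paper's setup, the prior odds ratio obtained from the induced prior probabilities offsets this behaviour, so the posterior odds ratio remains informative.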
Date: 2001-08-13
Downloads: https://papers.tinbergen.nl/01073.pdf (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:tin:wpaper:20010073