Definition of a prior distribution in Bayesian analysis by minimizing Kullback–Leibler divergence under data availability
Lev Slutskin
Applied Econometrics, 2015, vol. 40, issue 4, 129-141
Abstract:
A formal rule for selecting a prior probability distribution by minimizing the Kullback–Leibler divergence, when data from previous observations are available, is suggested. Contrary to the usual requirement in empirical Bayesian analysis, parameters for different observations are not assumed to be independent. When both the observations and the parameters are normal, the procedure is equivalent to the ML–II approach. However, regression coefficients obtained by minimizing the Kullback–Leibler divergence differ from the ML–II estimates. Finally, it is shown that in the normal case the Kullback–Leibler divergence asymptotically achieves its only minimum at the true prior distribution.
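To make the normal-case equivalence concrete, the following is a minimal sketch (not the paper's code) of empirical-Bayes prior selection in the setting the abstract describes: observations x_i = theta_i + noise with known noise variance sigma2, and a normal prior N(mu, tau^2) on the theta_i. Marginally x_i ~ N(mu, sigma^2 + tau^2), so maximizing the marginal likelihood (ML–II), which asymptotically coincides with minimizing the KL divergence to the true marginal, yields the estimates below. The function name and the simulated parameter values are illustrative assumptions.

```python
import math
import random
import statistics

def fit_normal_prior(x, sigma2):
    """ML-II estimates (mu_hat, tau2_hat) of a normal prior N(mu, tau^2),
    given observations x_i ~ N(theta_i, sigma2), theta_i ~ N(mu, tau^2).
    Marginally x_i ~ N(mu, sigma2 + tau^2)."""
    mu_hat = statistics.fmean(x)
    # ML-II / method-of-moments estimate of the prior variance;
    # clip at 0 in case sampling noise makes the raw estimate negative.
    tau2_hat = max(statistics.pvariance(x) - sigma2, 0.0)
    return mu_hat, tau2_hat

# Simulate past data from a known prior to check that the procedure
# recovers it (illustrative parameter values).
random.seed(0)
sigma2, mu_true, tau2_true = 1.0, 2.0, 4.0
thetas = [random.gauss(mu_true, math.sqrt(tau2_true)) for _ in range(20000)]
x = [random.gauss(t, math.sqrt(sigma2)) for t in thetas]

mu_hat, tau2_hat = fit_normal_prior(x, sigma2)
print(mu_hat, tau2_hat)  # close to 2.0 and 4.0
```

With a large simulated sample, the recovered prior mean and variance land near the true values, illustrating the abstract's claim that the KL-minimizing (here ML–II) prior converges to the true prior.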
Keywords: prior probability distributions; Bayesian methodology; Kullback–Leibler divergence; regression analysis
JEL-codes: C11
Date: 2015
Full text: http://pe.cemi.rssi.ru/pe_2015_4_129-141.pdf (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:ris:apltrx:0281
Applied Econometrics is currently edited by Anatoly Peresetsky
More articles in Applied Econometrics from Russian Presidential Academy of National Economy and Public Administration (RANEPA)
Bibliographic data for the series maintained by Anatoly Peresetsky.