Converting information into probability measures with the Kullback–Leibler divergence
Pier Bissiri and Stephen Walker
Annals of the Institute of Statistical Mathematics, 2012, vol. 64, issue 6, 1139-1160
Abstract:
This paper takes a decision-theoretic approach to updating a probability measure that represents beliefs about an unknown parameter. A cumulative loss function is considered, the sum of two terms: one depending on the prior belief and the other on further information obtained about the parameter. That information is thereby converted into a probability measure, and the key to the process is shown to be the Kullback–Leibler divergence. The Bayesian approach emerges as a natural special case. Some illustrations are presented. Copyright The Institute of Statistical Mathematics, Tokyo 2012
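A brief numerical sketch of the kind of update the abstract describes (the squared-error loss, the weight w, and the grid discretization below are illustrative assumptions, not choices taken from the paper): with prior pi(theta) and cumulative loss l(theta, x) over the observed information, an update of the form pi(theta | x) proportional to exp(-w * l(theta, x)) * pi(theta) reduces to the usual Bayes posterior when l is the negative log-likelihood and w = 1. A minimal Python illustration on a discretized parameter space:

    import numpy as np

    def general_posterior(grid, prior, data, loss, w=1.0):
        # Cumulative loss of each candidate parameter value over the data.
        cum_loss = np.array([sum(loss(th, x) for x in data) for th in grid])
        # Log of the unnormalized updated belief: log prior minus weighted loss.
        log_post = np.log(prior) - w * cum_loss
        log_post -= log_post.max()   # subtract the max for numerical stability
        post = np.exp(log_post)
        return post / post.sum()     # normalize over the grid

    grid = np.linspace(-3.0, 3.0, 601)            # discretized parameter space
    prior = np.full(grid.size, 1.0 / grid.size)   # uniform prior on the grid
    data = [0.5, 1.2, 0.8]
    post = general_posterior(grid, prior, data,
                             loss=lambda th, x: (th - x) ** 2)
    print(grid[post.argmax()])  # mode near the sample mean, ~0.83

Swapping the squared-error loss for a model's negative log-likelihood (with w = 1) makes the same routine return the discretized Bayes posterior, in line with the special case the abstract mentions.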
Keywords: Bayesian inference; Posterior distribution; Loss function; Kullback–Leibler divergence; g-divergence
Date: 2012
Downloads: http://hdl.handle.net/10.1007/s10463-012-0350-4 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:spr:aistmt:v:64:y:2012:i:6:p:1139-1160
DOI: 10.1007/s10463-012-0350-4