Information Geometry, Bayesian Inference, Ideal Estimates and Error Decomposition
Huaiyu Zhu and Richard Rohwer
Working Papers from Santa Fe Institute
Abstract:
In statistics it is necessary to study the relations among many probability distributions. Information geometry elucidates the geometric structure of the space of all distributions. Combined with Bayesian decision theory, it leads to the new concept of the "ideal estimate." Ideal estimates exist uniquely in the space of finite measures and are generally sufficient statistics. The optimal estimate on any model is obtained by projecting the ideal estimate onto that model. An error decomposition theorem splits the error of an estimate into the sum of a statistical error and an approximation error; both can be expanded to yield higher-order asymptotics. Furthermore, the ideal estimates under certain uniform priors, invariantly defined in information geometry, correspond to various optimal non-Bayesian estimates, such as the MLE.
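The decomposition stated in the abstract can be sketched in the style of the generalized Pythagorean relation of information geometry. The notation below (a divergence $D$, true distribution $q$, ideal estimate $\hat p$, and projection $p^*$ onto a model $M$) is assumed for illustration and is not taken from the paper itself:

\[
p^{*} \;=\; \operatorname*{arg\,min}_{p \in M} D(\hat p \,\|\, p),
\qquad
D(q \,\|\, p^{*})
\;=\;
\underbrace{D(q \,\|\, \hat p)}_{\text{statistical error}}
\;+\;
\underbrace{D(\hat p \,\|\, p^{*})_{\vphantom{q}}}_{\text{approximation error}} .
\]

Under this reading, the statistical error depends only on the data (through the ideal estimate $\hat p$), while the approximation error depends only on the choice of model $M$; the exact divergence for which the equality holds is determined by the geometry adopted in the paper.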
Keywords: Bayesian inference; ideal estimate; information geometry; error decomposition; nonparametric estimation
Date: 1998-06
Persistent link: https://EconPapers.repec.org/RePEc:wop:safiwp:98-06-045