Prediction in Riemannian metrics derived from divergence functions
Henryk Gzyl
Communications in Statistics - Theory and Methods, 2022, vol. 51, issue 2, 552-568
Abstract:
Divergence functions are interesting and widely used discrepancy measures. Even though they are not true distances, they can be used to measure how far apart two points are. Curiously enough, when applied to random variables they lead to a notion of best predictor that coincides with the usual best predictor in the Euclidean distance. From a divergence function one can derive a Riemannian metric, which in turn yields a true distance between random variables, and the best predictors in this metric do not coincide with their Euclidean counterparts. The purpose of this paper is to determine explicitly the best predictors in the derived metric, to compare them with the best predictors in divergence, and to obtain sample estimators of the best predictors. Along the way, we obtain results relating approximations in divergence to approximations in the metric derived from it.
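Note: the abstract's claim that the divergence-based best predictor coincides with the Euclidean one (the mean) holds in particular for Bregman-type divergences D(x, a) = phi(x) - phi(a) - phi'(a)(x - a). The following Python sketch is illustrative only and is not the paper's code; the convex function phi(t) = t log t and the gamma-distributed sample are assumptions chosen for the example. It minimizes a Monte Carlo estimate of E[D(X, a)] over a and compares the minimizer with the sample mean.

```python
# Minimal numerical sketch (illustrative assumptions, not the paper's code):
# for a Bregman divergence D(x, a) = phi(x) - phi(a) - phi'(a)(x - a),
# the minimizer over a of E[D(X, a)] is the mean E[X].
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=1.5, size=100_000)  # positive sample (assumed example)

def phi(t):
    # Assumed convex generator; gives the generalized Kullback-Leibler divergence.
    return t * np.log(t)

def dphi(t):
    return np.log(t) + 1.0

def expected_divergence(a):
    # Monte Carlo estimate of E[D(X, a)] for the Bregman divergence of phi.
    return np.mean(phi(x) - phi(a) - dphi(a) * (x - a))

res = minimize_scalar(expected_divergence, bounds=(0.1, 10.0), method="bounded")
print(f"divergence best predictor: {res.x:.4f}")
print(f"sample mean (Euclidean):   {x.mean():.4f}")
```

Up to numerical tolerance the two printed values agree, which is the coincidence the abstract refers to; the paper's point is that this agreement breaks down once one passes to the Riemannian distance derived from the divergence.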
Date: 2022
Downloads: http://hdl.handle.net/10.1080/03610926.2020.1752384 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:taf:lstaxx:v:51:y:2022:i:2:p:552-568
Ordering information: This journal article can be ordered from
http://www.tandfonline.com/pricing/journal/lsta20
DOI: 10.1080/03610926.2020.1752384
Communications in Statistics - Theory and Methods is currently edited by Debbie Iscoe.
Bibliographic data for series maintained by Chris Longhurst.