The Mean Relative Entropy: An Invariant Measure of Estimation Error
Jin Zhang
The American Statistician, 2021, vol. 75, issue 2, 117-123
Abstract:
A fundamental issue in statistics is parameter estimation, where the first step is to select estimators under some measure of estimation error. The commonly used measure is the mean squared error, which is simple, intuitive, and highly interpretable, but it has drawbacks that often create confusion when evaluating estimators. To address these problems, we propose two invariance properties and the sufficiency principle as prerequisites for any reasonable measure. The mean relative entropy is then established as an invariant measure of estimation error.
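For orientation only, since the abstract gives no formal definitions: the mean squared error of an estimator \hat\theta of a parameter \theta is the standard quantity

\mathrm{MSE}(\hat\theta) = \mathbb{E}_\theta\bigl[(\hat\theta - \theta)^2\bigr],

while relative entropy is the Kullback-Leibler divergence between densities,

D_{\mathrm{KL}}(f_\theta \,\|\, f_{\hat\theta}) = \int f_\theta(x) \log \frac{f_\theta(x)}{f_{\hat\theta}(x)} \, dx.

A measure named "mean relative entropy" would plausibly average such a divergence over the sampling distribution of \hat\theta; this reading is an assumption here, and the paper's precise definition should be taken from the full text.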
Date: 2021
Downloads: http://hdl.handle.net/10.1080/00031305.2018.1543139 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:taf:amstat:v:75:y:2021:i:2:p:117-123
Ordering information: This journal article can be ordered from http://www.tandfonline.com/pricing/journal/UTAS20
DOI: 10.1080/00031305.2018.1543139