The Hellinger Bounds on the Kullback-Leibler Divergence and the Bernstein Norm

Tetsuya Kaji

Papers from arXiv.org

Abstract: The Kullback-Leibler divergence, the Kullback-Leibler variation, and the Bernstein "norm" quantify discrepancies among probability distributions in likelihood models such as nonparametric maximum likelihood and nonparametric Bayes. They are closely related to the Hellinger distance, which is often easier to work with, so it is of interest to know when the Hellinger distance bounds these measures from above. This article gives a necessary and sufficient condition for each of the discrepancy measures to be bounded by the Hellinger distance. The condition accommodates unbounded likelihood ratios and generalizes all previously known results. We then apply it to relax the regularity condition for the sieve maximum likelihood estimator.
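
For reference, here are standard definitions of these discrepancy measures as used in the empirical-process literature (e.g., van der Vaart and Wellner); this is a sketch of common conventions, and the paper's exact normalizations may differ. For densities p and q with respect to a dominating measure \mu, and a function f under a probability measure P:

\[
K(p, q) = \int p \log\frac{p}{q}\, d\mu, \qquad
V(p, q) = \int p \left( \log\frac{p}{q} \right)^{2} d\mu,
\]
\[
h^{2}(p, q) = \frac{1}{2} \int \left( \sqrt{p} - \sqrt{q} \right)^{2} d\mu, \qquad
\|f\|_{P,B}^{2} = 2 \int \left( e^{|f|} - 1 - |f| \right) dP,
\]

with the Bernstein "norm" applied to the log-likelihood ratio f = \log(p/q). The classical comparison (e.g., Wong and Shen, 1995) bounds K, V, and the Bernstein norm by constant multiples of h^2 when the likelihood ratio p/q is bounded; the paper characterizes when such bounds continue to hold without that boundedness assumption.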

Date: 2026-01

Downloads: http://arxiv.org/pdf/2601.17860 (latest version, application/pdf)



Persistent link: https://EconPapers.repec.org/RePEc:arx:papers:2601.17860



 
Handle: RePEc:arx:papers:2601.17860