A New Lower Bound for Kullback-Leibler Divergence Based on Hammersley-Chapman-Robbins Bound

Tomohiro Nishiyama

No wa98j, OSF Preprints from Center for Open Science

Abstract: In this paper, we derive a useful lower bound for the Kullback-Leibler divergence (KL-divergence) based on the Hammersley-Chapman-Robbins bound (HCRB). The HCRB states that the variance of an estimator is bounded from below in terms of the chi-square divergence and the expectation value of the estimator. Using the relation between the KL-divergence and the chi-square divergence, we derive a lower bound for the KL-divergence that depends only on the expectation value and the variance of a function we choose. This lower bound can also be derived from an information-geometric approach. Furthermore, we show that equality holds for Bernoulli distributions and that the inequality converges to the Cramér-Rao bound when the two distributions are very close. We also describe application examples and examples of numerical calculation.
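
The ingredients named in the abstract can be checked numerically. The following is a minimal sketch in Python (an illustration, not the paper's own code, and it does not reproduce the paper's closed-form bound), assuming Bernoulli distributions and f(x) = x as the chosen function; all function names are hypothetical. It evaluates the HCRB in the form chi2(P||Q) >= (E_P[f] - E_Q[f])^2 / Var_Q[f], for which Bernoulli distributions attain equality, and shows that 2*KL/chi2 approaches 1 as the two distributions get close, consistent with the Cramér-Rao limit mentioned in the abstract.

import numpy as np

def kl_bernoulli(p, q):
    # KL(P||Q) for P = Bernoulli(p), Q = Bernoulli(q)
    return p * np.log(p / q) + (1.0 - p) * np.log((1.0 - p) / (1.0 - q))

def chi2_bernoulli(p, q):
    # chi2(P||Q) = sum over x in {0, 1} of (p(x) - q(x))^2 / q(x)
    return (p - q) ** 2 / q + (q - p) ** 2 / (1.0 - q)

def hcrb_lower_bound(p, q):
    # HCRB with the chosen function f(x) = x:
    # chi2(P||Q) >= (E_P[f] - E_Q[f])^2 / Var_Q[f], with Var_Q[f] = q(1-q)
    return (p - q) ** 2 / (q * (1.0 - q))

p = 0.6
for q in (0.3, 0.5, 0.59, 0.599):
    kl = kl_bernoulli(p, q)
    chi2 = chi2_bernoulli(p, q)
    hcrb = hcrb_lower_bound(p, q)
    # For Bernoulli distributions chi2 equals the HCRB expression (the equality
    # case noted in the abstract), and 2*KL/chi2 tends to 1 as q -> p, matching
    # the local Cramer-Rao (Fisher information) behavior.
    print(f"q={q}: KL={kl:.6f}  chi2={chi2:.6f}  HCRB={hcrb:.6f}  2*KL/chi2={2 * kl / chi2:.4f}")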

Date: 2019-06-28
Citations: 2 (in EconPapers)

Downloads: https://osf.io/download/5d1693358a17d50018051984/



Persistent link: https://EconPapers.repec.org/RePEc:osf:osfxxx:wa98j

DOI: 10.31219/osf.io/wa98j


More papers in OSF Preprints from Center for Open Science
Bibliographic data for series maintained by OSF.

 
Handle: RePEc:osf:osfxxx:wa98j