EconPapers    
 

Monotonically Decreasing Sequence of Divergences

Tomohiro Nishiyama

No wr2s6, OSF Preprints from Center for Open Science

Abstract: Divergences are quantities that measure the discrepancy between two probability distributions and play an important role in fields such as statistics and machine learning. Divergences are non-negative and equal zero if and only if the two distributions are identical. In addition, some important divergences, such as the f-divergence, are convex; we call these "convex divergences". In this paper, we show new properties of convex divergences using integral and differential operators that we introduce. For a convex divergence, the result of applying the integral or differential operator is also a divergence. In particular, the integral operator preserves convexity. Furthermore, applying the integral operator multiple times yields a monotonically decreasing sequence of convex divergences. From these properties, we derive new sequences of convex divergences that include the Kullback-Leibler divergence or the reverse Kullback-Leibler divergence.
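The abstract's central objects can be illustrated concretely. The following is a minimal sketch (not from the paper) of the discrete f-divergence, D_f(P||Q) = sum_i q_i f(p_i/q_i), showing that the generator f(t) = t log t yields the Kullback-Leibler divergence and f(t) = -log t yields the reverse Kullback-Leibler divergence, and checking the basic divergence properties stated above: non-negativity, and zero exactly when the two distributions coincide. The function and variable names are illustrative choices, not from the paper; the paper's integral and differential operators are not reproduced here.

```python
import math

def f_divergence(p, q, f):
    """Discrete f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i).

    Assumes p and q are strictly positive and each sums to 1.
    """
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Generator f(t) = t*log(t): Kullback-Leibler divergence KL(P||Q).
kl = lambda t: t * math.log(t)
# Generator f(t) = -log(t): reverse KL divergence, i.e. KL(Q||P).
rkl = lambda t: -math.log(t)

p = [0.2, 0.3, 0.5]
q = [0.4, 0.4, 0.2]

d_kl = f_divergence(p, q, kl)
d_rkl = f_divergence(p, q, rkl)

# Both divergences are non-negative (both generators are convex with
# f(1) = 0), and each vanishes when the distributions are identical.
assert d_kl > 0 and d_rkl > 0
assert abs(f_divergence(p, p, kl)) < 1e-12
```

Convexity of the generator f is what makes these "convex divergences" in the paper's terminology; non-negativity follows from Jensen's inequality applied to f at the point sum_i q_i (p_i/q_i) = 1.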

Date: 2019-10-18
Citations: View citations in EconPapers (2)

Downloads: (external link)
https://osf.io/download/5dab0ef7f1b0a9000c6472c0/



Persistent link: https://EconPapers.repec.org/RePEc:osf:osfxxx:wr2s6

DOI: 10.31219/osf.io/wr2s6


More papers in OSF Preprints from Center for Open Science
Bibliographic data for series maintained by OSF.

 
Page updated 2025-03-19
Handle: RePEc:osf:osfxxx:wr2s6