Generalized divergences from generalized entropies
Leonardo E. Riveaud,
Diego Mateos,
Steeve Zozor and
Pedro W. Lamberti
Physica A: Statistical Mechanics and its Applications, 2018, vol. 510, issue C, 68-76
Abstract:
Several quantifiers of information, also known as entropies, have been introduced in different contexts and from different motivations. For almost every one of these entropies, a measure of the loss (or gain) of information has been introduced. In this work we introduce generalized weighted divergences associated with an arbitrary entropy. The resulting measures are closely related to Bregman divergences. We study the main formal properties of the resulting divergences, extend them to weighted probability distributions, and apply some of them to the analysis of simulated and real time series.
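The abstract's link between entropies and divergences can be illustrated with the standard Bregman construction, D_φ(p, q) = φ(p) − φ(q) − ⟨∇φ(q), p − q⟩: choosing the generator φ as the negative Shannon entropy recovers the Kullback–Leibler divergence. A minimal sketch (not the paper's generalized weighted construction; function names are illustrative):

```python
import numpy as np

def bregman_divergence(p, q, phi, grad_phi):
    """Bregman divergence D_phi(p, q) = phi(p) - phi(q) - <grad_phi(q), p - q>."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return phi(p) - phi(q) - np.dot(grad_phi(q), p - q)

# Generator: negative Shannon entropy, phi(p) = sum_i p_i log p_i
neg_shannon = lambda p: np.sum(p * np.log(p))
grad_neg_shannon = lambda p: np.log(p) + 1.0

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

d = bregman_divergence(p, q, neg_shannon, grad_neg_shannon)
kl = np.sum(p * np.log(p / q))  # Kullback-Leibler divergence
# For this generator, d coincides with kl on normalized distributions.
```

Other entropy-like generators (e.g. Tsallis or Rényi-type functionals) yield different divergences by the same recipe, which is the kind of association the paper generalizes.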
Keywords: Divergences; Information theory; Time series
Date: 2018
Downloads: http://www.sciencedirect.com/science/article/pii/S0378437118308355
Full text for ScienceDirect subscribers only. The journal offers the option of making the article openly available on ScienceDirect for a fee of $3,000.
Persistent link: https://EconPapers.repec.org/RePEc:eee:phsmap:v:510:y:2018:i:c:p:68-76
DOI: 10.1016/j.physa.2018.06.111
Physica A: Statistical Mechanics and its Applications is currently edited by K. A. Dawson, J. O. Indekeu, H.E. Stanley and C. Tsallis
Bibliographic data for this series is maintained by Catherine Liu.