Mutual Information and Nonadditive Entropies: The Case of Tsallis Entropy
Amelia Carolina Sparavigna
International Journal of Sciences, 2015, vol. 4, issue 10, 1-4
Abstract:
Mutual information of two random variables can be easily obtained from their Shannon entropies. However, when nonadditive entropies are involved, the calculation of the mutual information is more complex. Here we first review the basics of mutual information in the framework of Shannon entropy, and then analyse the case of the generalized nonadditive Tsallis entropy.
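The abstract's two ingredients can be illustrated in a short sketch: Shannon mutual information built from marginal and joint entropies, I(X;Y) = H(X) + H(Y) - H(X,Y), and a generalized version built from the Tsallis pseudo-additivity rule. This is a minimal illustration only; the function names and the specific generalized mutual information formula below are assumptions based on the commonly cited pseudo-additivity S_q(X,Y) = S_q(X) + S_q(Y) + (1-q)S_q(X)S_q(Y) for independent variables, and may differ from the exact definition adopted in the paper.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H = -sum p log p (natural log), skipping zero bins."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p^q) / (q - 1); recovers Shannon as q -> 1."""
    if np.isclose(q, 1.0):
        return shannon_entropy(p)
    p = p[p > 0]
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def mutual_information(pxy):
    """Shannon mutual information I(X;Y) = H(X) + H(Y) - H(X,Y) from a joint table."""
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    return shannon_entropy(px) + shannon_entropy(py) - shannon_entropy(pxy.ravel())

def tsallis_mutual_information(pxy, q):
    """Hypothetical generalized mutual information: the deviation of the joint
    Tsallis entropy from the pseudo-additive value expected for independence,
    S_q(X) + S_q(Y) + (1-q) S_q(X) S_q(Y) - S_q(X,Y)."""
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    sx, sy = tsallis_entropy(px, q), tsallis_entropy(py, q)
    sxy = tsallis_entropy(pxy.ravel(), q)
    return sx + sy + (1.0 - q) * sx * sy - sxy

# Example: joint probability table of two correlated binary variables.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(mutual_information(pxy))                 # Shannon mutual information
print(tsallis_mutual_information(pxy, q=0.7))  # generalized (q != 1) version
```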
Keywords: Mutual Information; Entropy; Tsallis Entropy; Generalized Additivity; Image Registration
Date: 2015
Downloads:
https://www.ijsciences.com/pub/article/845 (text/html)
https://www.ijsciences.com/pub/pdf/V4201510845.pdf (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:adm:journl:v:4:y:2015:i:10:p:1-4
Ordering information: This journal article can be ordered from
https://www.ijsciences.com/payment_guide.php
DOI: 10.18483/ijSci.845
More articles in International Journal of Sciences from Alkhaer Publications, Manchester M8 8XG, England.
Bibliographic data for this series maintained by the ijSciences staff.