Tsallis eXtropy
Yige Xue and Yong Deng
Communications in Statistics - Theory and Methods, 2023, vol. 52, issue 3, 751-762
Abstract:
The Tsallis entropy, an extension of information entropy, performs well in handling uncertain information. The extropy is a complementary dual of entropy and a current research hot spot. This paper proposes the Tsallis eXtropy and the maximum Tsallis eXtropy, which are the complementary duals of the Tsallis entropy and the maximum Tsallis entropy, respectively. When the non-extensive parameter equals 1, the Tsallis eXtropy and the maximum Tsallis eXtropy degenerate into the information extropy and the maximum information extropy, respectively. Several meaningful theorems about the Tsallis eXtropy and the maximum Tsallis eXtropy are stated and proved. Numerical examples are designed to demonstrate the effectiveness of the proposed models in evaluating uncertainty, and a comparison application shows their superiority over other models. The experimental results show that the proposed models are highly effective in measuring unknown issues.
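The abstract does not give the paper's formulas, but the relationship it describes can be sketched. Below is a minimal illustration using the standard Tsallis entropy, S_q(p) = (1 − Σ p_i^q)/(q − 1), together with a dual obtained by the extropy-style substitution p_i → (1 − p_i); this dual is an assumption by analogy, not necessarily the paper's exact definition of Tsallis eXtropy. As q → 1, each quantity recovers its classical counterpart (Shannon entropy and information extropy), mirroring the degeneration property stated in the abstract.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1) for q != 1.
    As q -> 1 it recovers the Shannon entropy -sum_i p_i ln p_i."""
    if q == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

def tsallis_extropy(p, q):
    """Hypothetical dual obtained by replacing p_i with (1 - p_i):
    J_q(p) = ((n - 1) - sum_i (1 - p_i)^q) / (q - 1) for q != 1.
    As q -> 1 it recovers the extropy -sum_i (1 - p_i) ln(1 - p_i)."""
    n = len(p)
    if q == 1:
        return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)
    return ((n - 1) - sum((1 - pi) ** q for pi in p)) / (q - 1)

p = [0.5, 0.5]
print(tsallis_entropy(p, 2))   # 0.5
print(tsallis_extropy(p, 2))   # 0.5 (entropy and extropy coincide for n = 2)
print(tsallis_entropy(p, 1))   # ln 2, the q -> 1 (Shannon) limit
```

For a two-point distribution, 1 − p_1 = p_2, so entropy and extropy coincide; the values diverge for n ≥ 3, which is where an extropy-based measure carries distinct information.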
Date: 2023
Downloads:
http://hdl.handle.net/10.1080/03610926.2021.1921804 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:taf:lstaxx:v:52:y:2023:i:3:p:751-762
Ordering information: This journal article can be ordered from
http://www.tandfonline.com/pricing/journal/lsta20
DOI: 10.1080/03610926.2021.1921804
Communications in Statistics - Theory and Methods is currently edited by Debbie Iscoe
More articles in Communications in Statistics - Theory and Methods from Taylor & Francis Journals
Bibliographic data for series maintained by Chris Longhurst ().