Rényi extropy
Jiali Liu and Fuyuan Xiao
Communications in Statistics - Theory and Methods, 2023, vol. 52, issue 16, 5836-5847
Abstract:
Rényi entropy is a generalization of Shannon entropy and plays an important role in information theory. Recently, a new concept called extropy, the dual complement of entropy, has been developed. This paper proposes Rényi extropy, maximum Rényi extropy, and conditional Rényi extropy. As the parameter q of Rényi extropy tends to 1, Rényi extropy reduces to extropy. When the probability distribution is uniform, Rényi extropy attains its maximum value, and this maximum Rényi extropy equals the maximum extropy.
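The abstract's claim that the maximum is attained at the uniform distribution can be checked numerically for classical extropy, to which Rényi extropy reduces as q tends to 1. The sketch below uses the standard discrete extropy J(p) = -Σ (1 - p_i) ln(1 - p_i) of Lad, Sanfilippo, and Agrò; the paper's own Rényi extropy formula is not reproduced in the abstract, so the `extropy` function and the example distributions here are illustrative assumptions, not the authors' definitions.

```python
import math

def extropy(p):
    """Discrete extropy J(p) = -sum_i (1 - p_i) * ln(1 - p_i).

    Terms with p_i == 1 contribute 0 (degenerate distribution).
    """
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

n = 4
uniform = [1 / n] * n          # uniform distribution on n outcomes
skewed = [0.7, 0.1, 0.1, 0.1]  # a non-uniform comparison distribution

# Closed-form maximum extropy at the uniform distribution:
# J_max = (n - 1) * ln(n / (n - 1))
max_J = (n - 1) * math.log(n / (n - 1))
```

Evaluating `extropy(uniform)` recovers `max_J`, while `extropy(skewed)` is strictly smaller, consistent with the uniform distribution being the maximizer.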
Date: 2023
Downloads:
http://hdl.handle.net/10.1080/03610926.2021.2020843 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:taf:lstaxx:v:52:y:2023:i:16:p:5836-5847
Ordering information: This journal article can be ordered from
http://www.tandfonline.com/pricing/journal/lsta20
DOI: 10.1080/03610926.2021.2020843
Communications in Statistics - Theory and Methods is currently edited by Debbie Iscoe
More articles in Communications in Statistics - Theory and Methods from Taylor & Francis Journals
Bibliographic data for series maintained by Chris Longhurst.