Jensen-information generating function and its connections to some well-known information measures
Omid Kharazmi and
Narayanaswamy Balakrishnan
Statistics & Probability Letters, 2021, vol. 170, issue C
Abstract:
In this work, we consider the information generating function measure and develop some new results associated with it. Specifically, we propose two new divergence measures and show that some well-known information divergences, such as the Jensen–Shannon, Jensen–extropy and Jensen–Taneja divergence measures, are all special cases of them. Finally, we also discuss the information generating function for residual lifetime variables.
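To make the abstract's central object concrete: for a discrete distribution P, the information generating function is commonly defined as I_P(α) = Σᵢ pᵢ^α, and its derivative at α = 1 equals the negative Shannon entropy. The sketch below (illustrative only, not the authors' code; the function names are ours) computes this function numerically and the Jensen–Shannon divergence that the paper's new measures generalize.

```python
import numpy as np

def igf(p, alpha):
    """Information generating function I_P(alpha) = sum_i p_i^alpha."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # drop zero-probability atoms (0^alpha contributes nothing for alpha > 0)
    return np.sum(p ** alpha)

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum_i p_i log p_i (natural log)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: H((P+Q)/2) - (H(P) + H(Q))/2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

# Sanity checks of the two classical identities:
# I_P(1) = 1 for any distribution, and -dI_P/dalpha at alpha = 1 equals H(P).
p = np.array([0.5, 0.3, 0.2])
eps = 1e-6
numeric_deriv = (igf(p, 1 + eps) - igf(p, 1 - eps)) / (2 * eps)
```

The Jensen–Shannon divergence is symmetric, nonnegative, and bounded above by log 2; the paper's contribution is to recover it (and the Jensen–extropy and Jensen–Taneja divergences) as special cases of divergences built from the generating function.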
Keywords: Information generating function; Shannon entropy; Jensen–Shannon entropy; Jensen–extropy; Kullback–Leibler divergence
Date: 2021
Citations: 2 (in EconPapers)
Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0167715220302984
Full text for ScienceDirect subscribers only
Persistent link: https://EconPapers.repec.org/RePEc:eee:stapro:v:170:y:2021:i:c:s0167715220302984
DOI: 10.1016/j.spl.2020.108995
Statistics & Probability Letters is currently edited by Somnath Datta and Hira L. Koul