Average Entropy: A New Uncertainty Measure with Application to Image Segmentation
Omar A. Kittaneh, Mohammad A. U. Khan, Muhammed Akbar, and Husam A. Bayoud
The American Statistician, 2016, vol. 70, issue 1, 18-24
Abstract:
Various modifications have been suggested in the past to extend Shannon entropy to continuous random variables. This article investigates these modifications and suggests a new entropy measure, called average entropy (AE). AE is more general than Shannon entropy in the sense that its definition encompasses both continuous and discrete domains. It is additive, positive, and attains zero only when the distribution is uniform. The main characteristic of the suggested measure lies in its consistency behavior. Many properties of AE, including its relationship with the Kullback--Leibler information measure, are studied. Precise theorems about the vanishing of the conditional AE for both continuous and discrete distributions are provided. Toward the end, the measure is tested for its effectiveness in image segmentation. [Received March 2014. Revised June 2015.]
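The abstract refers to Shannon entropy and the Kullback--Leibler (KL) information measure without reproducing their formulas, and the AE definition itself is only available in the full text. The short Python sketch below illustrates only these standard background quantities on a hypothetical grey-level histogram; it is not the authors' AE measure, and the histogram values are made up for illustration.

```python
# Minimal sketch of the background quantities the abstract mentions:
# discrete Shannon entropy and the KL divergence from a uniform reference.
# The paper's average entropy (AE) is not defined in the abstract and is
# therefore not reproduced or approximated here.
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (natural log), skipping zero bins."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                      # normalise to a probability vector
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def kl_to_uniform(p):
    """KL divergence D(p || u) against the uniform distribution on the same support.
    Equals log(n) - H(p), so it vanishes exactly when p is uniform."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    return np.log(p.size) - shannon_entropy(p)

if __name__ == "__main__":
    hist = [0.1, 0.2, 0.3, 0.4]          # hypothetical histogram, not from the paper
    print(shannon_entropy(hist))         # approx. 1.2799 nats
    print(kl_to_uniform([0.25] * 4))     # 0.0: zero only for the uniform case
```

The identity D(p || u) = log(n) - H(p) is one concrete sense in which a KL-type quantity "attains zero only when the distribution is uniform," which parallels the property the abstract claims for AE.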
Date: 2016
Downloads: http://hdl.handle.net/10.1080/00031305.2015.1089788 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:taf:amstat:v:70:y:2016:i:1:p:18-24
DOI: 10.1080/00031305.2015.1089788