
Cumulative Paired 𝜙-Entropy

Ingo Klein and Benedikt Mangold

No 07/2015, FAU Discussion Papers in Economics from Friedrich-Alexander University Erlangen-Nuremberg, Institute for Economics

Abstract: A new kind of entropy is introduced that generalizes both the differential entropy and the cumulative (residual) entropy. The generalization is twofold. Firstly, the entropy is defined simultaneously for cumulative distribution functions (cdf) and survivor functions (sf), instead of for densities, the cdf, or the sf alone. Secondly, we consider a general 'entropy generating function' 𝜙, as in Burbea & Rao (1982) or Liese & Vajda (1987) in the context of 𝜙-divergences. Combining the ideas of a 𝜙-entropy and a cumulative entropy yields the new 'cumulative paired 𝜙-entropy' (CPE𝜙). With some modifications or simplifications, this entropy has already been discussed in at least four scientific disciplines: in fuzzy set theory, cumulative paired 𝜙-entropies were defined for membership functions; a discrete version serves as a measure of dispersion for ordered categorical variables; and, more recently, uncertainty and reliability theory have considered some variants as measures of information. With only one exception, these discussions appear to have happened independently of each other. We consider CPE𝜙 only for continuous cdf and show that CPE𝜙 is a measure of dispersion rather than a measure of information. First, this is demonstrated by deriving an upper bound determined by the standard deviation and by solving the maximum entropy problem under the restriction that the variance is fixed. Not only can we reproduce the central role of the logistic distribution in entropy maximization; we also derive Tukey's lambda distribution as the solution of an entropy maximization problem. Secondly, it is shown explicitly that CPE𝜙 fulfills the axioms of a dispersion measure. The corresponding dispersion functional can easily be estimated by an L-estimator with all its known asymptotic properties.
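As a reading aid, the object described above can be written down compactly. The following display is a sketch inferred from the abstract's description (the entropy generating function 𝜙 applied to both the cdf F and the sf 1−F, then integrated over the support), not a quotation from the paper:

```latex
\mathrm{CPE}_{\phi}(X) \;=\; \int_{-\infty}^{\infty}
  \Big[\, \phi\big(F(x)\big) \;+\; \phi\big(1 - F(x)\big) \,\Big]\, dx
```

Under this reading, the Shannon-type choice 𝜙(u) = −u log u makes the first term a cumulative entropy and the second a cumulative residual entropy, which is consistent with the claim that CPE𝜙 generalizes both.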
CPE𝜙 is the starting point for several related concepts, such as mutual 𝜙-information, 𝜙-correlation, and 𝜙-regression, which generalize Gini correlation and Gini regression. We give a short introduction to all of these related concepts. Linear rank tests for scale can also be developed from the new entropy; we show that almost all known tests are special cases and introduce some new ones. In the literature, Shannon's differential entropy has been calculated explicitly for many distributions. We do the same for CPE𝜙 whenever the cdf is available in closed form.
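The abstract notes that the dispersion functional can be estimated by an L-estimator. A minimal plug-in sketch of that idea, assuming the integral form 𝜙(F) + 𝜙(1−F) integrated over x and the Shannon-type choice 𝜙(u) = −u log u (both illustrative assumptions, not the paper's exact estimator):

```python
import math


def phi_shannon(u):
    # Entropy generating function phi(u) = -u * ln(u), with phi(0) = 0.
    return 0.0 if u <= 0.0 else -u * math.log(u)


def cpe(sample, phi=phi_shannon):
    """Empirical cumulative paired phi-entropy of a sample.

    Integrates phi(F_n(x)) + phi(1 - F_n(x)) over x, where F_n is the
    empirical cdf, which is constant at i/n between consecutive order
    statistics. The result is a linear combination of spacings of the
    order statistics, i.e. an L-estimator.
    """
    x = sorted(sample)
    n = len(x)
    total = 0.0
    for i in range(1, n):
        u = i / n
        total += (phi(u) + phi(1.0 - u)) * (x[i] - x[i - 1])
    return total
```

Because the estimator is a weighted sum of spacings, it behaves like a dispersion measure in the sense described above: shifting the sample leaves it unchanged, and rescaling the sample by a > 0 rescales the estimate by the same factor.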

Keywords: 𝜙-entropy; differential entropy; absolute mean deviation; cumulative residual entropy; cumulative entropy; measure of dispersion; measure of polarization; generalized maximum entropy principle; Tukey's λ distribution; power logistic distribution; 𝜙-dependence; 𝜙-regression; L-estimator; linear rank test
Date: 2015


Persistent link: https://EconPapers.repec.org/RePEc:zbw:iwqwdp:072015



Handle: RePEc:zbw:iwqwdp:072015