A general definition of conditional information and its application to ergodic decomposition

Lukasz Debowski

Statistics & Probability Letters, 2009, vol. 79, issue 9, 1260-1268

Abstract: We discuss a simple definition of conditional mutual information (CMI) for fields and σ-fields. The new definition is also applicable in nonregular cases, unlike the well-known but more restricted definition of CMI by Dobrushin. Certain properties of the two notions of CMI, and their equivalence for countably generated σ-fields, are established. We also consider an application concerning the ergodic decomposition of mutual information for stationary processes. In this case, CMI is tightly linked, via the additivity of information, with entropy defined as self-information. Thus we reconsider the latter concept in some detail.
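
The additivity of information mentioned in the abstract is, in the familiar discrete case, the standard chain rule for mutual information. A minimal sketch in LaTeX, using textbook identities rather than the paper's general definition for fields and σ-fields, is:

    % Textbook identities for discrete random variables X, Y, Z
    % (a sketch only; not the paper's general definition):
    I(X;Y \mid Z) = H(X \mid Z) + H(Y \mid Z) - H(X,Y \mid Z)   % CMI via conditional entropies
    I(X;(Y,Z)) = I(X;Z) + I(X;Y \mid Z)                         % additivity (chain rule) of information
    % Dobrushin-style mutual information for sigma-fields \mathcal{A}, \mathcal{B}
    % is usually stated as a supremum over finite measurable partitions
    % \alpha \subset \mathcal{A}, \beta \subset \mathcal{B}:
    I(\mathcal{A};\mathcal{B}) = \sup_{\alpha,\beta} I(\alpha;\beta)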

Date: 2009

Downloads (external link): http://www.sciencedirect.com/science/article/pii/S0167-7152(09)00042-X (full text for ScienceDirect subscribers only)



Persistent link: https://EconPapers.repec.org/RePEc:eee:stapro:v:79:y:2009:i:9:p:1260-1268




Handle: RePEc:eee:stapro:v:79:y:2009:i:9:p:1260-1268