An extension of entropy power inequality for dependent random variables

Fatemeh Asgari and Mohammad Hossein Alamatsaz

Communications in Statistics - Theory and Methods, 2022, vol. 51, issue 13, 4358-4369

Abstract: The entropy power inequality (EPI) for the convolution of two independent random variables was first established by Shannon (1948). In practice, however, there are many situations in which the random variables involved are not independent. In this article, considering additive noise channels, it is shown that, under certain conditions, the EPI also holds when the random variables involved are dependent. As an intermediate step toward the main result, a lower bound for the Fisher information of the output signal is obtained, which is of interest in its own right. An example is provided to illustrate the result.
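
For context, Shannon's classical EPI is stated in terms of the entropy power of a random variable with a density, defined from its differential entropy h(X); for independent X and Y it reads

\[
N(X+Y) \;\ge\; N(X) + N(Y), \qquad N(X) \;=\; \frac{1}{2\pi e}\, e^{2h(X)} .
\]

The article's contribution is a set of conditions under which an analogous inequality holds without the independence assumption; the precise conditions, and the Fisher information lower bound used to establish them, are given in the full text.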

Date: 2022

Downloads: (external link)
http://hdl.handle.net/10.1080/03610926.2020.1813305 (text/html)
Access to full text is restricted to subscribers.

Persistent link: https://EconPapers.repec.org/RePEc:taf:lstaxx:v:51:y:2022:i:13:p:4358-4369

Ordering information: This journal article can be ordered from
http://www.tandfonline.com/pricing/journal/lsta20

DOI: 10.1080/03610926.2020.1813305

Communications in Statistics - Theory and Methods is currently edited by Debbie Iscoe

Bibliographic data for series maintained by Chris Longhurst.

 
Handle: RePEc:taf:lstaxx:v:51:y:2022:i:13:p:4358-4369