Entropy power inequality







From Wikipedia, the free encyclopedia
 


In information theory, the entropy power inequality (EPI) is a result that relates to the so-called "entropy power" of random variables. It shows that the entropy power of suitably well-behaved random variables is a superadditive function. The entropy power inequality was proved in 1948 by Claude Shannon in his seminal paper "A Mathematical Theory of Communication". Shannon also provided a sufficient condition for equality to hold; Stam (1959) showed that the condition is in fact necessary.

Statement of the inequality

For a random vector X : Ω → R^n with probability density function f : R^n → R, the differential entropy of X, denoted h(X), is defined to be

    h(X) = -\int_{\mathbb{R}^n} f(x) \log f(x) \, dx
and the entropy power of X, denoted N(X), is defined to be

    N(X) = \frac{1}{2\pi e} e^{\frac{2}{n} h(X)}.
In particular, N(X) = |K|^{1/n} when X is normally distributed with covariance matrix K.
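
As a sanity check on this Gaussian case, the following minimal sketch (assuming NumPy and SciPy are available; the variable names are illustrative) computes the differential entropy of a Gaussian with scipy.stats.multivariate_normal, converts it to entropy power, and compares the result with |K|^{1/n}.

    import numpy as np
    from scipy.stats import multivariate_normal

    n = 3
    rng = np.random.default_rng(0)
    A = rng.standard_normal((n, n))
    K = A @ A.T + n * np.eye(n)  # a random symmetric positive-definite covariance matrix

    # Differential entropy h(X) in nats for X ~ N(0, K).
    h = multivariate_normal(mean=np.zeros(n), cov=K).entropy()

    # Entropy power N(X) = exp(2 h(X) / n) / (2 pi e).
    N = np.exp(2.0 * h / n) / (2.0 * np.pi * np.e)

    print(N, np.linalg.det(K) ** (1.0 / n))  # the two numbers agree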

Let X and Y be independent random variables with probability density functions in the L^p space L^p(R^n) for some p > 1. Then

    N(X + Y) \ge N(X) + N(Y).
Moreover, equality holds if and only if X and Y are multivariate normal random variables with proportional covariance matrices.
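
For independent Gaussians the inequality can be checked in closed form: X + Y has covariance K_1 + K_2, so the EPI reduces to Minkowski's determinant inequality |K_1 + K_2|^{1/n} ≥ |K_1|^{1/n} + |K_2|^{1/n}. The sketch below (assuming NumPy; entropy_power_gaussian is an illustrative helper, not a library function) exercises both the strict inequality for generic covariances and the equality case of proportional covariances.

    import numpy as np

    def entropy_power_gaussian(K):
        """Entropy power of a zero-mean Gaussian with covariance K, i.e. |K|^(1/n)."""
        return np.linalg.det(K) ** (1.0 / K.shape[0])

    rng = np.random.default_rng(1)
    n = 4
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    K1 = A @ A.T + np.eye(n)
    K2 = B @ B.T + np.eye(n)

    # Superadditivity: N(X + Y) >= N(X) + N(Y), strict for generic covariances.
    print(entropy_power_gaussian(K1 + K2)
          >= entropy_power_gaussian(K1) + entropy_power_gaussian(K2))  # True

    # Equality when the covariances are proportional, e.g. K2 = 2 * K1.
    print(np.isclose(entropy_power_gaussian(K1 + 2 * K1),
                     entropy_power_gaussian(K1) + entropy_power_gaussian(2 * K1)))  # True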

Alternative form of the inequality

The entropy power inequality can be rewritten in an equivalent form that does not explicitly depend on the definition of entropy power (see the Costa and Cover reference below).

Let X and Y be independent random variables, as above. Then let X' and Y' be independently distributed random variables with Gaussian distributions such that

    h(X') = h(X)

and

    h(Y') = h(Y).

Then

    h(X + Y) \ge h(X' + Y').
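
To see why the two forms are equivalent, one convenient (though not unique) choice is to take X' and Y' to be white Gaussians; the following LaTeX sketch records the argument under that assumption.

    \documentclass{article}
    \usepackage{amsmath}
    \begin{document}
    Take $X' \sim \mathcal{N}(0, \sigma_X^2 I_n)$ and $Y' \sim \mathcal{N}(0, \sigma_Y^2 I_n)$
    with $h(X') = h(X)$ and $h(Y') = h(Y)$, so that $N(X') = N(X) = \sigma_X^2$
    and $N(Y') = N(Y) = \sigma_Y^2$. Since $X' + Y' \sim \mathcal{N}(0, (\sigma_X^2 + \sigma_Y^2) I_n)$,
    \[
      N(X' + Y') = \sigma_X^2 + \sigma_Y^2 = N(X) + N(Y),
    \]
    and because $N(\cdot) = \tfrac{1}{2\pi e}\, e^{2 h(\cdot)/n}$ is strictly increasing
    in the entropy,
    \[
      h(X + Y) \ge h(X' + Y')
      \iff
      N(X + Y) \ge N(X' + Y') = N(X) + N(Y).
    \]
    \end{document}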

References

Costa, M.; Cover, T. (1984). "On the similarity of the entropy power inequality and the Brunn–Minkowski inequality". IEEE Transactions on Information Theory. 30 (6): 837–839.
Shannon, Claude E. (1948). "A Mathematical Theory of Communication". Bell System Technical Journal. 27 (3): 379–423, 623–656.
Stam, A. J. (1959). "Some inequalities satisfied by the quantities of information of Fisher and Shannon". Information and Control. 2 (2): 101–112.

