- Source: Entropy power inequality
In information theory, the entropy power inequality (EPI) is a result that relates the so-called "entropy powers" of random variables. It shows that the entropy power of suitably well-behaved random variables is superadditive: the entropy power of a sum of independent random variables is at least the sum of their entropy powers. The entropy power inequality was proved in 1948 by Claude Shannon in his seminal paper "A Mathematical Theory of Communication". Shannon also provided a sufficient condition for equality to hold; Stam (1959) showed that the condition is in fact necessary.
Statement of the inequality
For a random vector X : Ω → R^n with probability density function f : R^n → R, the differential entropy of X, denoted h(X), is defined to be
$$ h(X) = -\int_{\mathbb{R}^{n}} f(x)\,\log f(x)\,dx $$
and the entropy power of X, denoted N(X), is defined to be
$$ N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)}. $$
In particular, N(X) = |K|^{1/n} when X is normally distributed with covariance matrix K.
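This Gaussian special case can be checked directly (a short calculation, not spelled out in the article, using the standard formula h(X) = ½ log((2πe)^n |K|) for a multivariate normal with covariance matrix K, with natural logarithms):
$$ N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)} = \frac{1}{2\pi e}\,\bigl((2\pi e)^{n}\,|K|\bigr)^{1/n} = \frac{2\pi e}{2\pi e}\,|K|^{1/n} = |K|^{1/n}. $$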
Let X and Y be independent random variables with probability density functions in the L^p space L^p(R^n) for some p > 1. Then
$$ N(X+Y) \geq N(X) + N(Y). $$
Moreover, equality holds if and only if X and Y are multivariate normal random variables with proportional covariance matrices.
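As a concrete illustration (a minimal numerical sketch in Python, not part of the original article; the helper entropy_power and the use of scipy.integrate.quad are choices made here), the following checks the inequality for two independent uniform random variables on [0, 1], whose sum has the triangular density 1 − |z − 1| on [0, 2]:

```python
import numpy as np
from scipy.integrate import quad

def entropy_power(h, n=1):
    """Entropy power N(X) = exp(2*h/n) / (2*pi*e), with h in nats."""
    return np.exp(2.0 * h / n) / (2.0 * np.pi * np.e)

# X and Y independent, each uniform on [0, 1]: h(X) = h(Y) = log(1) = 0 nats.
h_X = h_Y = 0.0

# X + Y has the triangular density f(z) = 1 - |z - 1| on [0, 2].
def f(z):
    return 1.0 - abs(z - 1.0)

def integrand(z):
    fz = f(z)
    return 0.0 if fz <= 0.0 else -fz * np.log(fz)

h_sum, _ = quad(integrand, 0.0, 2.0, points=[1.0])  # analytically 1/2 nat

N_X, N_Y, N_sum = entropy_power(h_X), entropy_power(h_Y), entropy_power(h_sum)
print(f"N(X) + N(Y) = {N_X + N_Y:.4f}")  # ~ 0.1171
print(f"N(X + Y)    = {N_sum:.4f}")      # ~ 0.1592
assert N_sum >= N_X + N_Y  # EPI holds; strictly, since X and Y are not Gaussian
```

Since X and Y are not Gaussian, the inequality is strict in this example: N(X + Y) = 1/(2π) ≈ 0.159, while N(X) + N(Y) = 2/(2πe) ≈ 0.117.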
Alternative form of the inequality
The entropy power inequality can be rewritten in an equivalent form that does not explicitly depend on the definition of entropy power (see Costa and Cover reference below).
Let X and Y be independent random variables, as above. Then let X' and Y' be independent Gaussian random variables chosen so that
$$ h(X') = h(X) $$
and
$$ h(Y') = h(Y). $$
Then,
$$ h(X+Y) \geq h(X'+Y'). $$
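One way to see that this form is equivalent to the entropy power inequality (a standard argument, sketched here rather than quoted from Costa and Cover) is to take the Gaussians isotropic, X' ~ N(0, N(X) I_n) and Y' ~ N(0, N(Y) I_n), which satisfy h(X') = h(X) and h(Y') = h(Y). Then
$$ h(X'+Y') = \tfrac{n}{2}\log\bigl(2\pi e\,(N(X)+N(Y))\bigr), \qquad h(X+Y) = \tfrac{n}{2}\log\bigl(2\pi e\,N(X+Y)\bigr), $$
so h(X + Y) ≥ h(X' + Y') holds exactly when N(X + Y) ≥ N(X) + N(Y), by monotonicity of the logarithm.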
See also
Information entropy
Information theory
Limiting density of discrete points
Self-information
Kullback–Leibler divergence
Entropy estimation
References
Dembo, Amir; Cover, Thomas M.; Thomas, Joy A. (1991). "Information-theoretic inequalities". IEEE Trans. Inf. Theory. 37 (6): 1501–1518. doi:10.1109/18.104312. MR 1134291. S2CID 845669.
Costa, Max H. M.; Cover, Thomas M. (1984). "On the similarity of the entropy-power inequality and the Brunn-Minkowski inequality". IEEE Trans. Inf. Theory. 30 (6): 837–839. doi:10.1109/TIT.1984.1056983.
Gardner, Richard J. (2002). "The Brunn–Minkowski inequality". Bull. Amer. Math. Soc. (N.S.). 39 (3): 355–405 (electronic). doi:10.1090/S0273-0979-02-00941-2.
Shannon, Claude E. (1948). "A mathematical theory of communication". Bell System Tech. J. 27 (3): 379–423, 623–656. doi:10.1002/j.1538-7305.1948.tb01338.x. hdl:10338.dmlcz/101429.
Stam, A. J. (1959). "Some inequalities satisfied by the quantities of information of Fisher and Shannon". Information and Control. 2 (2): 101–112. doi:10.1016/S0019-9958(59)90348-1.