Uncorrelatedness (probability theory)
In probability theory and statistics, two real-valued random variables, $X$, $Y$, are said to be uncorrelated if their covariance, $\operatorname{cov}[X,Y] = \operatorname{E}[XY] - \operatorname{E}[X]\operatorname{E}[Y]$, is zero. If two variables are uncorrelated, there is no linear relationship between them.
Uncorrelated random variables have a Pearson correlation coefficient of zero, when it exists; in the trivial case where either variable has zero variance (is a constant), the correlation is undefined.
In general, uncorrelatedness is not the same as orthogonality, except in the special case where at least one of the two random variables has an expected value of 0. In this case, the covariance is the expectation of the product, and $X$ and $Y$ are uncorrelated if and only if $\operatorname{E}[XY] = 0$.
If $X$ and $Y$ are independent, with finite second moments, then they are uncorrelated. However, not all uncorrelated variables are independent. (p. 155)
Definition
Definition for two real random variables
Two random variables $X, Y$ are called uncorrelated if their covariance
$\operatorname{Cov}[X,Y] = \operatorname{E}[(X - \operatorname{E}[X])(Y - \operatorname{E}[Y])]$
is zero. (p. 153, p. 121) Formally:
$X, Y \text{ uncorrelated} \quad \iff \quad \operatorname{E}[XY] = \operatorname{E}[X] \cdot \operatorname{E}[Y]$
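The definition can be checked numerically on simulated data. A minimal numpy sketch, assuming two independent standard normal samples (the sample size and random seed are illustrative choices, not part of the source):

    import numpy as np

    rng = np.random.default_rng(0)        # illustrative seed
    x = rng.normal(size=100_000)          # X ~ N(0, 1)
    y = rng.normal(size=100_000)          # Y drawn independently of X

    # Sample analogue of Cov[X, Y] = E[(X - E[X])(Y - E[Y])]
    cov_centered = np.mean((x - x.mean()) * (y - y.mean()))

    # Equivalent form Cov[X, Y] = E[XY] - E[X] E[Y]
    cov_product = np.mean(x * y) - x.mean() * y.mean()

    print(cov_centered, cov_product)      # both are close to 0 for independent X, Y

Both estimates agree up to floating-point error, reflecting the algebraic identity between the two forms of the covariance.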
Definition for two complex random variables
Two complex random variables $Z, W$ are called uncorrelated if their covariance
$\operatorname{K}_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])\overline{(W - \operatorname{E}[W])}]$
and their pseudo-covariance
$\operatorname{J}_{ZW} = \operatorname{E}[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])]$
are both zero, i.e.
$Z, W \text{ uncorrelated} \quad \iff \quad \operatorname{E}[Z\overline{W}] = \operatorname{E}[Z] \cdot \operatorname{E}[\overline{W}] \text{ and } \operatorname{E}[ZW] = \operatorname{E}[Z] \cdot \operatorname{E}[W]$
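A hedged numpy sketch of the two quantities for complex samples; the particular construction of Z and W below (independent complex Gaussian noise) is only an illustrative assumption:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    # Illustrative complex random variables with independent real and imaginary parts
    z = rng.normal(size=n) + 1j * rng.normal(size=n)
    w = rng.normal(size=n) + 1j * rng.normal(size=n)

    zc, wc = z - z.mean(), w - w.mean()
    cov  = np.mean(zc * np.conj(wc))   # K_ZW: conjugate on the second factor
    pcov = np.mean(zc * wc)            # J_ZW: pseudo-covariance, no conjugate

    print(cov, pcov)                   # both near 0, so Z and W look uncorrelated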
Definition for more than two random variables
A set of two or more random variables $X_1, \ldots, X_n$ is called uncorrelated if each pair of them is uncorrelated. This is equivalent to the requirement that the off-diagonal elements of the autocovariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{X}}$ of the random vector $\mathbf{X} = [X_1 \ldots X_n]^{\mathrm{T}}$ are all zero. The autocovariance matrix is defined as:
$\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{cov}[\mathbf{X}, \mathbf{X}] = \operatorname{E}[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{X} - \operatorname{E}[\mathbf{X}])^{\mathrm{T}}] = \operatorname{E}[\mathbf{X}\mathbf{X}^{\mathrm{T}}] - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\mathrm{T}}$
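The matrix identity above can be verified numerically. A short sketch under the illustrative assumption of a vector of three independent standard normal components:

    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.normal(size=(3, 100_000))             # rows are X_1, X_2, X_3; columns are samples

    mean = X.mean(axis=1, keepdims=True)
    K = (X - mean) @ (X - mean).T / X.shape[1]    # E[(X - E[X])(X - E[X])^T], sample version
    K_alt = X @ X.T / X.shape[1] - mean @ mean.T  # E[X X^T] - E[X] E[X]^T, sample version

    print(np.allclose(K, K_alt))                  # True: the two forms agree
    print(np.round(K, 3))                         # off-diagonal entries are near zero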
Examples of dependence without correlation
Example 1
Let $X$ be a random variable that takes the value 0 with probability 1/2, and takes the value 1 with probability 1/2.
Let $Y$ be a random variable, independent of $X$, that takes the value −1 with probability 1/2, and takes the value 1 with probability 1/2.
Let $U$ be the random variable constructed as $U = XY$.
The claim is that $U$ and $X$ have zero covariance (and thus are uncorrelated), but are not independent.
Proof:
Taking into account that
$\operatorname{E}[U] = \operatorname{E}[XY] = \operatorname{E}[X]\operatorname{E}[Y] = \operatorname{E}[X] \cdot 0 = 0,$
where the second equality holds because $X$ and $Y$ are independent, one gets
$\operatorname{cov}[U,X] = \operatorname{E}[(U - \operatorname{E}[U])(X - \operatorname{E}[X])] = \operatorname{E}[U(X - \tfrac{1}{2})] = \operatorname{E}[X^{2}Y - \tfrac{1}{2}XY] = \operatorname{E}[(X^{2} - \tfrac{1}{2}X)Y] = \operatorname{E}[X^{2} - \tfrac{1}{2}X]\operatorname{E}[Y] = 0.$
Therefore, $U$ and $X$ are uncorrelated.
Independence of $U$ and $X$ means that for all $a$ and $b$, $\Pr(U = a \mid X = b) = \Pr(U = a)$. This is not true, in particular, for $a = 1$ and $b = 0$:
$\Pr(U = 1 \mid X = 0) = \Pr(XY = 1 \mid X = 0) = 0$
$\Pr(U = 1) = \Pr(XY = 1) = 1/4$
Thus $\Pr(U = 1 \mid X = 0) \neq \Pr(U = 1)$, so $U$ and $X$ are not independent.
Q.E.D.
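The proof can also be replayed by exact enumeration of the four equally likely outcomes; a minimal Python sketch (the helper names are illustrative only):

    from itertools import product

    # The four equally likely outcomes (x, y), each with probability 1/4
    outcomes = [(x, y, 0.25) for x, y in product([0, 1], [-1, 1])]

    E = lambda f: sum(p * f(x, y) for x, y, p in outcomes)

    EU  = E(lambda x, y: x * y)                    # E[U] = E[XY]
    EX  = E(lambda x, y: x)                        # E[X]
    cov = E(lambda x, y: (x * y - EU) * (x - EX))  # cov[U, X]

    p_u1_given_x0 = sum(p for x, y, p in outcomes if x == 0 and x * y == 1) / \
                    sum(p for x, y, p in outcomes if x == 0)
    p_u1 = sum(p for x, y, p in outcomes if x * y == 1)

    print(cov)                      # 0.0  -> U and X are uncorrelated
    print(p_u1_given_x0, p_u1)      # 0.0 vs 0.25 -> U and X are not independent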
Example 2
If $X$ is a continuous random variable uniformly distributed on $[-1,1]$ and $Y = X^{2}$, then $X$ and $Y$ are uncorrelated even though $X$ determines $Y$ and a particular value of $Y$ can be produced by only one or two values of $X$:
$f_{X}(t) = \tfrac{1}{2} I_{[-1,1]}; \qquad f_{Y}(t) = \tfrac{1}{2\sqrt{t}} I_{]0,1]}$
On the other hand, $f_{X,Y}$ is 0 on the triangle defined by $0 < X < Y < 1$, although $f_{X} \times f_{Y}$ is not null on this domain. Therefore
$f_{X,Y}(X,Y) \neq f_{X}(X) \times f_{Y}(Y)$
and the variables are not independent.
$\operatorname{E}[X] = \frac{1 - 1}{4} = 0; \qquad \operatorname{E}[Y] = \frac{1^{3} - (-1)^{3}}{3 \times 2} = \frac{1}{3}$
$\operatorname{Cov}[X,Y] = \operatorname{E}\left[(X - \operatorname{E}[X])(Y - \operatorname{E}[Y])\right] = \operatorname{E}\left[X^{3} - \frac{X}{3}\right] = \frac{1^{4} - (-1)^{4}}{4 \times 2} = 0$
Therefore, the variables are uncorrelated.
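A Monte Carlo sketch of this example (sample size and seed are illustrative assumptions); the estimates approach E[X] = 0, E[Y] = 1/3, and Cov[X, Y] = 0:

    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.uniform(-1.0, 1.0, size=1_000_000)       # X ~ Uniform[-1, 1]
    y = x ** 2                                       # Y = X^2 is fully determined by X

    print(x.mean())                                  # ~ 0    (E[X])
    print(y.mean())                                  # ~ 1/3  (E[Y])
    print(np.mean((x - x.mean()) * (y - y.mean())))  # ~ 0    (Cov[X, Y])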
When uncorrelatedness implies independence
There are cases in which uncorrelatedness does imply independence. One such case is when both random variables are two-valued (so each can be linearly transformed to have a Bernoulli distribution), as the sketch below illustrates. Further, two jointly normally distributed random variables are independent if they are uncorrelated, although this does not hold for variables whose marginal distributions are normal and uncorrelated but whose joint distribution is not jointly normal (see Normally distributed and uncorrelated does not imply independent).
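For the two-valued case, the equivalence can be checked by brute force over joint distributions on {0, 1}^2; a small Python sketch (the grid resolution and tolerance are arbitrary illustrative choices):

    import itertools
    import numpy as np

    grid = np.linspace(0.0, 1.0, 21)
    for p00, p01, p10 in itertools.product(grid, repeat=3):
        p11 = 1.0 - p00 - p01 - p10
        if p11 < 0:
            continue                      # not a valid joint pmf
        pA = p10 + p11                    # P(A = 1)
        pB = p01 + p11                    # P(B = 1)
        cov = p11 - pA * pB               # Cov[A, B] = E[AB] - E[A]E[B]
        if abs(cov) < 1e-12:
            # zero covariance forces the joint pmf to factor into its marginals
            joint    = np.array([[p00, p01], [p10, p11]])
            product_ = np.outer([1 - pA, pA], [1 - pB, pB])
            assert np.allclose(joint, product_)
    print("every zero-covariance two-valued pair on the grid is independent")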
Generalizations
Uncorrelated random vectors
Two random vectors $\mathbf{X} = (X_{1}, \ldots, X_{m})^{\mathrm{T}}$ and $\mathbf{Y} = (Y_{1}, \ldots, Y_{n})^{\mathrm{T}}$ are called uncorrelated if
$\operatorname{E}[\mathbf{X}\mathbf{Y}^{\mathrm{T}}] = \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{Y}]^{\mathrm{T}}.$
They are uncorrelated if and only if their cross-covariance matrix $\operatorname{K}_{\mathbf{X}\mathbf{Y}}$ is zero. (p. 337)
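A numpy sketch of the cross-covariance check for real random vectors; the dimensions and the way X and Y are generated below are illustrative assumptions:

    import numpy as np

    rng = np.random.default_rng(4)
    n = 200_000
    X = rng.normal(size=(3, n))       # samples of a 3-dimensional vector X, one column per draw
    Y = rng.normal(size=(2, n))       # samples of a 2-dimensional vector Y, drawn independently

    mX = X.mean(axis=1, keepdims=True)
    mY = Y.mean(axis=1, keepdims=True)
    K_XY = (X - mX) @ (Y - mY).T / n  # sample estimate of E[X Y^T] - E[X] E[Y]^T

    print(np.round(K_XY, 3))          # a 3x2 matrix with all entries near zero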
Two complex random vectors $\mathbf{Z}$ and $\mathbf{W}$ are called uncorrelated if their cross-covariance matrix and their pseudo-cross-covariance matrix are both zero, i.e. if
$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{J}_{\mathbf{Z}\mathbf{W}} = 0$
where
$\operatorname{K}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\mathrm{H}}]$
and
$\operatorname{J}_{\mathbf{Z}\mathbf{W}} = \operatorname{E}[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{W} - \operatorname{E}[\mathbf{W}])^{\mathrm{T}}].$
Uncorrelated stochastic processes
Two stochastic processes $\{X_{t}\}$ and $\{Y_{t}\}$ are called uncorrelated if their cross-covariance
$\operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_{1}, t_{2}) = \operatorname{E}\left[\left(X(t_{1}) - \mu_{X}(t_{1})\right)\left(Y(t_{2}) - \mu_{Y}(t_{2})\right)\right]$
is zero for all times. (p. 142) Formally:
$\{X_{t}\}, \{Y_{t}\} \text{ uncorrelated} \quad :\iff \quad \forall t_{1}, t_{2} \colon \operatorname{K}_{\mathbf{X}\mathbf{Y}}(t_{1}, t_{2}) = 0.$
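The cross-covariance of two processes can be estimated across independent realizations; a hedged sketch in which both processes are simply i.i.d. Gaussian noise (an illustrative assumption, not part of the source):

    import numpy as np

    rng = np.random.default_rng(5)
    n_paths, n_times = 50_000, 10
    X = rng.normal(size=(n_paths, n_times))   # rows are independent realizations of {X_t}
    Y = rng.normal(size=(n_paths, n_times))   # independent realizations of {Y_t}

    Xc = X - X.mean(axis=0)                   # subtract the estimated mean mu_X(t) at each time t
    Yc = Y - Y.mean(axis=0)
    K_XY = Xc.T @ Yc / n_paths                # K_XY(t1, t2) estimated for every pair of times

    print(np.abs(K_XY).max())                 # all entries are near zero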
See also
Correlation and dependence
Binomial distribution: Covariance between two binomials
Uncorrelated Volume Element
References
Further reading
Galen R. Shorack, Probability for Statisticians, Springer (2000). ISBN 0-387-98953-6