- Source: Complex random variable
In probability theory and statistics, complex random variables are a generalization of real-valued random variables to complex numbers, i.e. the possible values a complex random variable may take are complex numbers. Complex random variables can always be considered as pairs of real random variables: their real and imaginary parts. Therefore, the distribution of one complex random variable may be interpreted as the joint distribution of two real random variables.
Some concepts of real random variables have a straightforward generalization to complex random variables—e.g., the definition of the mean of a complex random variable. Other concepts are unique to complex random variables.
Applications of complex random variables are found in digital signal processing, quadrature amplitude modulation and information theory.
Definition
A complex random variable Z on the probability space (Ω, F, P) is a function Z : Ω → ℂ such that both its real part ℜ(Z) and its imaginary part ℑ(Z) are real random variables on (Ω, F, P).
Examples
= Simple example =
Consider a random variable that may take only the three complex values 1 + i, 1 − i, and 2, with probabilities P(Z = 1 + i) = 1/4, P(Z = 1 − i) = 1/4, and P(Z = 2) = 1/2. This is a simple example of a complex random variable.
The expectation of this random variable may be simply calculated:

\operatorname{E}[Z] = \tfrac{1}{4}(1+i) + \tfrac{1}{4}(1-i) + \tfrac{1}{2}\cdot 2 = \tfrac{3}{2}.
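This calculation can be reproduced in a few lines of Python (the values and probabilities are those of the example; the variable names are ours):

```python
# Expectation of the three-point complex random variable Z from the
# example above: values and probabilities as given in the text.
values = [1 + 1j, 1 - 1j, 2]
probs = [0.25, 0.25, 0.5]

# E[Z] = sum over the support of z * p_Z(z)
expectation = sum(z * p for z, p in zip(values, probs))

print(expectation)  # (1.5+0j)
```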
= Uniform distribution =
Another example of a complex random variable is the uniform distribution over the filled unit disk, i.e. the set {z ∈ ℂ : |z| ≤ 1}. This random variable is an example of a complex random variable for which the probability density function is defined: the density equals 1/π inside the disk and 0 outside it.
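As an illustrative sketch (the sampling method is our choice, not from the text), one way to draw from this distribution is rejection sampling from the enclosing square; by symmetry the sample mean should be close to 0:

```python
import random

random.seed(0)

# Rejection sampling from the uniform distribution on the filled unit
# disk {z in C : |z| <= 1}: draw uniformly from the square
# [-1, 1] x [-1, 1] and keep only points that land inside the disk.
def sample_unit_disk():
    while True:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:
            return complex(x, y)

samples = [sample_unit_disk() for _ in range(100_000)]
m = sum(samples) / len(samples)   # by symmetry, E[Z] = 0
```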
= Complex normal distribution =
Complex Gaussian random variables are often encountered in applications. They are a straightforward generalization of real Gaussian random variables.
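A minimal sketch of such a variable (an assumption-laden construction, not from the text): build a zero-mean complex Gaussian from two independent real Gaussians as real and imaginary parts, each with variance 1/2, so the total variance E[|Z|²] is 1:

```python
import random

random.seed(1)

# A complex Gaussian random variable sketched as two independent real
# Gaussians (variance 1/2 each) for the real and imaginary parts,
# giving Var[Z] = E[|Z|^2] = 1 for this zero-mean variable.
n = 100_000
s = 0.5 ** 0.5
Z = [complex(random.gauss(0, s), random.gauss(0, s)) for _ in range(n)]

var = sum(abs(z) ** 2 for z in Z) / n    # should be close to 1
```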
Cumulative distribution function
The generalization of the cumulative distribution function from real to complex random variables is not obvious, because expressions of the form P(Z ≤ 1 + 3i) make no sense (the complex numbers have no natural ordering). However, expressions of the form P(ℜ(Z) ≤ 1, ℑ(Z) ≤ 3) do make sense. Therefore, the cumulative distribution function F_Z : ℂ → [0, 1] of a complex random variable is defined via the joint distribution of its real and imaginary parts:

F_Z(z) = P(\Re(Z) \leq \Re(z), \Im(Z) \leq \Im(z)).
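An empirical version of this definition can be sketched as follows (the unit-square distribution is our illustrative choice: for Z uniform on [0,1] × [0,1], F_Z(x + iy) = x·y for 0 ≤ x, y ≤ 1):

```python
import random

random.seed(2)

# Empirical version of F_Z(z) = P(Re(Z) <= Re(z), Im(Z) <= Im(z)),
# illustrated for Z uniform on the unit square [0,1] x [0,1].
def cdf_estimate(samples, z):
    hits = sum(1 for s in samples
               if s.real <= z.real and s.imag <= z.imag)
    return hits / len(samples)

samples = [complex(random.random(), random.random())
           for _ in range(100_000)]

est = cdf_estimate(samples, complex(0.5, 0.4))   # exact value: 0.5 * 0.4 = 0.2
```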
Probability density function
The probability density function of a complex random variable is defined as

f_Z(z) = f_{\Re(Z),\Im(Z)}(\Re(z), \Im(z)),

i.e. the value of the density function at a point z ∈ ℂ is defined to be equal to the value of the joint density of the real and imaginary parts of the random variable evaluated at the point (ℜ(z), ℑ(z)).
An equivalent definition is given by

f_Z(z) = \frac{\partial^2}{\partial x \, \partial y} P(\Re(Z) \leq x, \Im(Z) \leq y),

where x = ℜ(z) and y = ℑ(z).
As in the real case, the density function may not exist.
Expectation
The expectation of a complex random variable is defined in terms of the expectations of its real and imaginary parts:

\operatorname{E}[Z] = \operatorname{E}[\Re(Z)] + i\,\operatorname{E}[\Im(Z)].

Note that the expectation of a complex random variable does not exist if E[ℜ(Z)] or E[ℑ(Z)] does not exist.
If the complex random variable Z has a probability density function f_Z(z), then the expectation is given by

\operatorname{E}[Z] = \iint_{\mathbb{C}} z \cdot f_Z(z) \, dx \, dy,

where z = x + iy.
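As a numerical illustration of this integral (the unit-square distribution is our choice, not from the text), one can approximate the double integral with the midpoint rule; for Z uniform on [0,1] × [0,1] with density f_Z = 1 there, the exact expectation is 0.5 + 0.5i:

```python
# Midpoint-rule approximation of E[Z] = integral of z * f_Z(z) dx dy
# for Z uniform on the unit square [0,1] x [0,1] (density f_Z = 1
# there, 0 elsewhere) -- an illustrative choice of distribution.
n = 400          # grid points per axis
h = 1.0 / n
total = 0j
for i in range(n):
    for j in range(n):
        x = (i + 0.5) * h
        y = (j + 0.5) * h
        total += complex(x, y) * 1.0 * h * h   # z * f_Z(z) * dx * dy

# The exact expectation is 0.5 + 0.5i.
```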
If the complex random variable Z has a probability mass function p_Z(z), then the expectation is given by

\operatorname{E}[Z] = \sum_{z \in \operatorname{supp}(p_Z)} z \cdot p_Z(z),

where the sum runs over all values z with p_Z(z) > 0.
Properties
Whenever the expectation of a complex random variable exists, taking the expectation commutes with complex conjugation:

\overline{\operatorname{E}[Z]} = \operatorname{E}[\overline{Z}].
The expected value operator E[·] is linear in the sense that

\operatorname{E}[aZ + bW] = a\,\operatorname{E}[Z] + b\,\operatorname{E}[W]

for any complex coefficients a, b, even if Z and W are not independent.
Variance and pseudo-variance
The variance is defined in terms of absolute squares as

\operatorname{Var}[Z] = \operatorname{K}_{ZZ} = \operatorname{E}\left[|Z - \operatorname{E}[Z]|^2\right] = \operatorname{E}[|Z|^2] - |\operatorname{E}[Z]|^2.
Properties
The variance is always a nonnegative real number. It is equal to the sum of the variances of the real and imaginary parts of the complex random variable:

\operatorname{Var}[Z] = \operatorname{Var}[\Re(Z)] + \operatorname{Var}[\Im(Z)].
The variance of a linear combination of complex random variables may be calculated using the following formula:

\operatorname{Var}\left[\sum_{k=1}^{N} a_k Z_k\right] = \sum_{i=1}^{N}\sum_{j=1}^{N} a_i \overline{a_j} \operatorname{Cov}[Z_i, Z_j].
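The formula can be checked on sample data; both sides below are computed from the same empirical covariances, so the identity holds up to floating-point error (the data and coefficients are illustrative):

```python
import random

random.seed(3)

# Check of Var[sum_k a_k Z_k] = sum_i sum_j a_i conj(a_j) Cov[Z_i, Z_j]
# using empirical (sample) covariances.
def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return mean([(x - mx) * (y - my).conjugate()
                 for x, y in zip(xs, ys)])

n = 1000
Z1 = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
Z2 = [z + complex(random.gauss(0, 1), 0) for z in Z1]   # correlated with Z1
a = [1 + 2j, 0.5 - 1j]

combo = [a[0] * z1 + a[1] * z2 for z1, z2 in zip(Z1, Z2)]
lhs = cov(combo, combo).real            # variance of the combination

Zs = [Z1, Z2]
rhs = sum(a[i] * a[j].conjugate() * cov(Zs[i], Zs[j])
          for i in range(2) for j in range(2)).real
```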
= Pseudo-variance =
The pseudo-variance is a special case of the pseudo-covariance and is defined in terms of ordinary complex squares:

\operatorname{J}_{ZZ} = \operatorname{E}\left[(Z - \operatorname{E}[Z])^2\right].

Unlike the variance of Z, which is always real and nonnegative, the pseudo-variance of Z is in general complex.
Covariance matrix of real and imaginary parts
For a general complex random variable, the pair (ℜ(Z), ℑ(Z)) has a covariance matrix of the form:

\begin{bmatrix} \operatorname{Var}[\Re(Z)] & \operatorname{Cov}[\Im(Z), \Re(Z)] \\ \operatorname{Cov}[\Re(Z), \Im(Z)] & \operatorname{Var}[\Im(Z)] \end{bmatrix}
The matrix is symmetric, so

\operatorname{Cov}[\Re(Z), \Im(Z)] = \operatorname{Cov}[\Im(Z), \Re(Z)].
Its elements equal:

\begin{aligned} \operatorname{Var}[\Re(Z)] &= \tfrac{1}{2}\operatorname{Re}(\operatorname{K}_{ZZ} + \operatorname{J}_{ZZ}) \\ \operatorname{Var}[\Im(Z)] &= \tfrac{1}{2}\operatorname{Re}(\operatorname{K}_{ZZ} - \operatorname{J}_{ZZ}) \\ \operatorname{Cov}[\Re(Z), \Im(Z)] &= \tfrac{1}{2}\operatorname{Im}(\operatorname{J}_{ZZ}) \end{aligned}

where K_{ZZ} = Var[Z] denotes the variance and J_{ZZ} the pseudo-variance of Z.
Conversely:

\begin{aligned} \operatorname{K}_{ZZ} &= \operatorname{Var}[\Re(Z)] + \operatorname{Var}[\Im(Z)] \\ \operatorname{J}_{ZZ} &= \operatorname{Var}[\Re(Z)] - \operatorname{Var}[\Im(Z)] + 2i\operatorname{Cov}[\Re(Z), \Im(Z)] \end{aligned}
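These conversion relations are exact algebraic identities, so they can be verified on centered sample data (the distribution below is an arbitrary illustrative choice):

```python
import random

random.seed(4)

# Check the relations between (K_ZZ, J_ZZ) and the variances/covariance
# of the real and imaginary parts, on centered samples. K and J are
# computed from the same samples, so the identities hold up to
# floating-point error.
def mean(xs):
    return sum(xs) / len(xs)

n = 2000
Z = [complex(random.gauss(0, 2), random.gauss(0, 1)) for _ in range(n)]
m = mean(Z)
c = [z - m for z in Z]                      # centered samples

K = mean([z * z.conjugate() for z in c])    # variance K_ZZ (real)
J = mean([z * z for z in c])                # pseudo-variance J_ZZ

var_re = mean([z.real ** 2 for z in c])
var_im = mean([z.imag ** 2 for z in c])
cov_ri = mean([z.real * z.imag for z in c])

# K_ZZ = Var[Re] + Var[Im];  J_ZZ = Var[Re] - Var[Im] + 2i Cov[Re, Im]
```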
Covariance and pseudo-covariance
The covariance between two complex random variables Z, W is defined as

\operatorname{K}_{ZW} = \operatorname{Cov}[Z, W] = \operatorname{E}\left[(Z - \operatorname{E}[Z])\,\overline{(W - \operatorname{E}[W])}\right].

Notice the complex conjugation of the second factor in the definition.
In contrast to real random variables, we also define a pseudo-covariance (also called complementary covariance):

\operatorname{J}_{ZW} = \operatorname{E}\left[(Z - \operatorname{E}[Z])(W - \operatorname{E}[W])\right].

The second-order statistics are fully characterized by the covariance and the pseudo-covariance.
Properties
The covariance has the following properties:

\operatorname{Cov}[Z, W] = \overline{\operatorname{Cov}[W, Z]} (conjugate symmetry)
\operatorname{Cov}[\alpha Z, W] = \alpha \operatorname{Cov}[Z, W] (sesquilinearity)
\operatorname{Cov}[Z, \alpha W] = \overline{\alpha} \operatorname{Cov}[Z, W]
\operatorname{Cov}[Z_1 + Z_2, W] = \operatorname{Cov}[Z_1, W] + \operatorname{Cov}[Z_2, W]
\operatorname{Cov}[Z, W_1 + W_2] = \operatorname{Cov}[Z, W_1] + \operatorname{Cov}[Z, W_2]
\operatorname{Cov}[Z, Z] = \operatorname{Var}[Z]
Uncorrelatedness: two complex random variables Z and W are called uncorrelated if K_{ZW} = J_{ZW} = 0 (see also: uncorrelatedness (probability theory)).
Orthogonality: two complex random variables Z and W are called orthogonal if E[ZW̄] = 0.
Circular symmetry
Circular symmetry of complex random variables is a common assumption used in the field of wireless communication. A typical example of a circularly symmetric complex random variable is the complex Gaussian random variable with zero mean and zero pseudo-covariance.
A complex random variable Z is circularly symmetric if, for any deterministic φ ∈ [−π, π], the distribution of e^{iφ}Z equals the distribution of Z.
Properties
By definition, a circularly symmetric complex random variable has

\operatorname{E}[Z] = \operatorname{E}[e^{i\phi}Z] = e^{i\phi}\operatorname{E}[Z]

for any φ. Thus the expectation of a circularly symmetric complex random variable can only be either zero or undefined.
Additionally,

\operatorname{E}[ZZ] = \operatorname{E}[e^{i\phi}Z\,e^{i\phi}Z] = e^{2i\phi}\operatorname{E}[ZZ]

for any φ. Thus the pseudo-variance of a circularly symmetric complex random variable can only be zero.
If Z and e^{iφ}Z have the same distribution for all φ, the phase of Z must be uniformly distributed over [−π, π] and independent of the amplitude of Z.
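These properties can be illustrated numerically. Below, a circularly symmetric variable is built as Z = R·e^{iΦ} with a uniform phase independent of the amplitude (our illustrative construction); its sample mean and sample pseudo-variance should both be close to zero:

```python
import cmath
import math
import random

random.seed(5)

# A circularly symmetric variable: Z = R * exp(i*Phi) with amplitude R
# uniform on [0, 1] and phase Phi uniform on [-pi, pi], independent of R.
n = 100_000
Z = [random.random() * cmath.exp(1j * random.uniform(-math.pi, math.pi))
     for _ in range(n)]

m = sum(Z) / n                           # expectation, should be ~0
pseudo_var = sum(z * z for z in Z) / n   # pseudo-variance, should be ~0
```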
Proper complex random variables
The concept of proper random variables is unique to complex random variables and has no corresponding concept for real random variables.
A complex random variable Z is called proper if the following three conditions are all satisfied:

\operatorname{E}[Z] = 0
\operatorname{Var}[Z] < \infty
\operatorname{E}[Z^2] = 0

This definition is equivalent to the following conditions, i.e. a complex random variable is proper if, and only if:

\operatorname{E}[Z] = 0
\operatorname{E}[\Re(Z)^2] = \operatorname{E}[\Im(Z)^2] < \infty
\operatorname{E}[\Re(Z)\,\Im(Z)] = 0
For a proper complex random variable, the covariance matrix of the pair (ℜ(Z), ℑ(Z)) has the following simple form:

\begin{bmatrix} \tfrac{1}{2}\operatorname{Var}[Z] & 0 \\ 0 & \tfrac{1}{2}\operatorname{Var}[Z] \end{bmatrix}
That is:

\begin{aligned} \operatorname{Var}[\Re(Z)] &= \operatorname{Var}[\Im(Z)] = \tfrac{1}{2}\operatorname{Var}[Z] \\ \operatorname{Cov}[\Re(Z), \Im(Z)] &= 0 \end{aligned}
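A quick sanity check of the propriety conditions (the construction of Z here is our illustrative assumption): two i.i.d. real Gaussians with variance 1/2 form a proper complex variable with Var[Z] = 1, so E[Z] and the pseudo-variance should be near 0 while each part carries half the variance:

```python
import random

random.seed(6)

# A proper complex random variable sketched as two i.i.d. real
# Gaussians (variance 1/2 each) forming the real and imaginary parts,
# so that Var[Z] = 1.
n = 100_000
s = 0.5 ** 0.5
Z = [complex(random.gauss(0, s), random.gauss(0, s)) for _ in range(n)]

m = sum(Z) / n                                           # ~ 0
pseudo_var = sum(z * z for z in Z) / n - m * m           # E[Z^2]-E[Z]^2, ~0
var_re = sum(z.real ** 2 for z in Z) / n - m.real ** 2   # ~ 1/2
var_im = sum(z.imag ** 2 for z in Z) / n - m.imag ** 2   # ~ 1/2
```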
Cauchy-Schwarz inequality
The Cauchy-Schwarz inequality for complex random variables, which can be derived using the triangle inequality and Hölder's inequality, is

\left|\operatorname{E}\left[Z\overline{W}\right]\right|^2 \leq \operatorname{E}\left[\left|Z\overline{W}\right|\right]^2 \leq \operatorname{E}\left[|Z|^2\right]\operatorname{E}\left[|W|^2\right].
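Both inequalities in the chain also hold for empirical expectations, so they can be demonstrated on sample data (the distributions of Z and W below are illustrative):

```python
import random

random.seed(7)

# Demonstration of |E[Z conj(W)]|^2 <= E[|Z conj(W)|]^2
#                                   <= E[|Z|^2] * E[|W|^2]
# using empirical expectations over correlated samples.
def emean(xs):
    return sum(xs) / len(xs)

n = 5000
Z = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]
W = [0.3 * z + complex(random.gauss(0, 1), random.gauss(0, 1)) for z in Z]

lhs = abs(emean([z * w.conjugate() for z, w in zip(Z, W)])) ** 2
mid = emean([abs(z * w.conjugate()) for z, w in zip(Z, W)]) ** 2
rhs = emean([abs(z) ** 2 for z in Z]) * emean([abs(w) ** 2 for w in W])
```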
Characteristic function
The characteristic function of a complex random variable is a function ℂ → ℂ defined by

\varphi_Z(\omega) = \operatorname{E}\left[e^{i\Re(\overline{\omega}Z)}\right] = \operatorname{E}\left[e^{i(\Re(\omega)\Re(Z) + \Im(\omega)\Im(Z))}\right].
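For a discrete distribution the expectation in this definition is a finite sum, so the characteristic function can be evaluated directly; here for the three-point distribution from the simple example above:

```python
import cmath

# Characteristic function of the three-point distribution from the
# simple example (values 1+i, 1-i, 2 with probabilities 1/4, 1/4, 1/2),
# evaluated from the definition phi_Z(w) = E[exp(i * Re(conj(w) * Z))].
values = [1 + 1j, 1 - 1j, 2]
probs = [0.25, 0.25, 0.5]

def phi(w):
    return sum(p * cmath.exp(1j * (complex(w).conjugate() * z).real)
               for z, p in zip(values, probs))

# Every characteristic function satisfies phi(0) = 1 and |phi(w)| <= 1.
```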
See also
Central moment
Complex random vector