Multivariate stable distribution
The multivariate stable distribution is a multivariate probability distribution that is a multivariate generalisation of the univariate stable distribution. The multivariate stable distribution defines linear relations between stable distribution marginals. In the same way as for the univariate case, the distribution is defined in terms of its characteristic function.
The multivariate stable distribution can also be thought of as an extension of the multivariate normal distribution. It has a parameter α, defined over the range 0 < α ≤ 2, with the case α = 2 equivalent to the multivariate normal distribution. It also has an additional skew parameter that allows for non-symmetric distributions, whereas the multivariate normal distribution is symmetric.
Definition
Let $\mathbb{S}=\{u\in\mathbb{R}^{d}:|u|=1\}$ be the unit sphere in $\mathbb{R}^{d}$.
A random vector $X$ has a multivariate stable distribution, denoted $X\sim S(\alpha,\Lambda,\delta)$, if the joint characteristic function of $X$ is
$$\operatorname{E}\exp(iu^{T}X)=\exp\left\{-\int_{s\in\mathbb{S}}\left(|u^{T}s|^{\alpha}+i\,\nu(u^{T}s,\alpha)\right)\Lambda(ds)+iu^{T}\delta\right\}$$
where $0<\alpha<2$, and for $y\in\mathbb{R}$,
$$\nu(y,\alpha)=\begin{cases}-\operatorname{sign}(y)\tan(\pi\alpha/2)\,|y|^{\alpha}&\alpha\neq 1,\\(2/\pi)\,y\ln|y|&\alpha=1.\end{cases}$$
This is essentially the result of Feldheim that any stable random vector can be characterized by a spectral measure $\Lambda$ (a finite measure on $\mathbb{S}$) and a shift vector $\delta\in\mathbb{R}^{d}$.
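For concreteness, here is a minimal NumPy sketch of $\nu(y,\alpha)$; the function name and the handling of the removable singularity at $y=0$ are my own choices, not part of the source.

```python
import numpy as np

def nu(y, alpha):
    """nu(y, alpha) from the characteristic-function definition above."""
    y = np.atleast_1d(np.asarray(y, dtype=float))
    if alpha != 1:
        return -np.sign(y) * np.tan(np.pi * alpha / 2) * np.abs(y) ** alpha
    # alpha == 1: (2/pi) y ln|y|, with the removable singularity at y = 0 set to 0
    out = np.zeros_like(y)
    nz = y != 0
    out[nz] = (2 / np.pi) * y[nz] * np.log(np.abs(y[nz]))
    return out
```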
Parametrization using projections
Another way to describe a stable random vector is in terms of projections. For any vector $u$, the projection $u^{T}X$ is univariate $\alpha$-stable with some skewness $\beta(u)$, scale $\gamma(u)$ and some shift $\delta(u)$. The notation $X\sim S(\alpha,\beta(\cdot),\gamma(\cdot),\delta(\cdot))$ is used if $X$ is stable with $u^{T}X\sim s(\alpha,\beta(u),\gamma(u),\delta(u))$ for every $u\in\mathbb{R}^{d}$. This is called the projection parameterization.
The spectral measure determines the projection parameter functions by:
$$\gamma(u)=\left(\int_{s\in\mathbb{S}}|u^{T}s|^{\alpha}\,\Lambda(ds)\right)^{1/\alpha}$$
$$\beta(u)=\int_{s\in\mathbb{S}}|u^{T}s|^{\alpha}\operatorname{sign}(u^{T}s)\,\Lambda(ds)\Big/\gamma(u)^{\alpha}$$
$$\delta(u)=\begin{cases}u^{T}\delta&\alpha\neq 1\\u^{T}\delta-\int_{s\in\mathbb{S}}\tfrac{\pi}{2}\,u^{T}s\ln|u^{T}s|\,\Lambda(ds)&\alpha=1\end{cases}$$
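When the spectral measure is discrete (point masses $\lambda_{j}$ at unit vectors $s_{j}$, as in the discrete special case below), these integrals reduce to finite sums. A minimal NumPy sketch, with illustrative names:

```python
import numpy as np

def projection_params(u, alpha, s, lam, delta):
    """Projection parameters (gamma(u), beta(u), delta(u)) of u^T X for a
    discrete spectral measure with masses lam[j] at unit vectors s[j]
    (stored as rows of s)."""
    u = np.asarray(u, dtype=float)
    proj = s @ u                                    # u^T s_j for each atom
    gamma_a = np.sum(np.abs(proj) ** alpha * lam)   # gamma(u)^alpha
    gamma = gamma_a ** (1 / alpha)
    beta = np.sum(np.abs(proj) ** alpha * np.sign(proj) * lam) / gamma_a
    if alpha != 1:
        shift = u @ delta
    else:                                           # alpha == 1 correction term
        nz = proj != 0
        shift = u @ delta - np.sum(
            (np.pi / 2) * proj[nz] * np.log(np.abs(proj[nz])) * lam[nz])
    return gamma, beta, shift
```

For example, atoms placed on the coordinate axes reproduce the independent-components case described below.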
Special cases
There are special cases where the multivariate characteristic function takes a simpler form. Define the characteristic function of a stable marginal as
$$\omega(y\,|\,\alpha,\beta)=\begin{cases}|y|^{\alpha}\left[1-i\beta\tan(\tfrac{\pi\alpha}{2})\operatorname{sign}(y)\right]&\alpha\neq 1\\|y|\left[1+i\beta\,\tfrac{2}{\pi}\operatorname{sign}(y)\ln|y|\right]&\alpha=1\end{cases}$$
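A direct NumPy transcription of $\omega$, as a sketch (complex-valued; the zero-handling at $y=0$ is my choice):

```python
import numpy as np

def omega(y, alpha, beta):
    """Marginal characteristic-function exponent omega(y | alpha, beta)."""
    y = np.atleast_1d(np.asarray(y, dtype=float))
    if alpha != 1:
        return np.abs(y) ** alpha * (
            1 - 1j * beta * np.tan(np.pi * alpha / 2) * np.sign(y))
    # alpha == 1: |y| [1 + i beta (2/pi) sign(y) ln|y|], set to 0 at y = 0
    out = np.zeros_like(y, dtype=complex)
    nz = y != 0
    out[nz] = np.abs(y[nz]) * (
        1 + 1j * beta * (2 / np.pi) * np.sign(y[nz]) * np.log(np.abs(y[nz])))
    return out
```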
Isotropic multivariate stable distribution
The characteristic function is
$$\operatorname{E}\exp(iu^{T}X)=\exp\left\{-\gamma_{0}^{\alpha}|u|^{\alpha}+iu^{T}\delta\right\}$$
The spectral measure is continuous and uniform, leading to radial/isotropic symmetry.
For the multinormal case $\alpha=2$, this corresponds to independent components, but this is not the case when $\alpha<2$. Isotropy is a special case of ellipticity (see the next paragraph): just take $\Sigma$ to be a multiple of the identity matrix.
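Evaluating the isotropic characteristic function numerically is then a one-liner; a sketch with illustrative names ($\gamma_{0}$ is the scale, $\delta$ the shift):

```python
import numpy as np

def isotropic_cf(u, alpha, gamma0, delta):
    """E exp(i u^T X) for an isotropic stable vector; reduces to a
    Gaussian characteristic function at alpha = 2."""
    u = np.asarray(u, dtype=float)
    return np.exp(-gamma0 ** alpha * np.linalg.norm(u) ** alpha
                  + 1j * (u @ np.asarray(delta)))
```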
Elliptically contoured multivariate stable distribution
The elliptically contoured multivariate stable distribution is a special symmetric case of the multivariate stable distribution. If X is α-stable and elliptically contoured, then it has joint characteristic function
$$\operatorname{E}\exp(iu^{T}X)=\exp\left\{-(u^{T}\Sigma u)^{\alpha/2}+iu^{T}\delta\right\}$$
for some shift vector $\delta\in\mathbb{R}^{d}$ (equal to the mean when it exists) and some positive definite matrix $\Sigma$ (akin to a correlation matrix, although the usual definition of correlation fails to be meaningful).
Note the relation to the characteristic function of the multivariate normal distribution,
$$\operatorname{E}\exp(iu^{T}X)=\exp\left\{-u^{T}\Sigma u+iu^{T}\delta\right\}$$
obtained when α = 2.
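Samples from the elliptical case can be drawn via the well-known sub-Gaussian construction $X=\sqrt{A}\,G+\delta$, where $A$ is a totally skewed positive $(\alpha/2)$-stable variable and $G$ is a centered Gaussian with covariance $2\Sigma$ (see Samorodnitsky and Taqqu). The sketch below uses scipy's levy_stable; the scaling constant and the assumption of scipy's default (S1) parameterization are mine and should be checked against one's convention:

```python
import numpy as np
from scipy.stats import levy_stable

def sample_elliptical_stable(alpha, Sigma, delta, n, seed=0):
    """Sub-Gaussian sketch: X = sqrt(A) * G + delta, with A totally skewed
    positive (alpha/2)-stable and G ~ N(0, 2*Sigma), matching the
    characteristic function exp{-(u^T Sigma u)^(alpha/2) + i u^T delta}."""
    rng = np.random.default_rng(seed)
    d = len(delta)
    # Scale chosen so that E exp(-s A) = exp(-s^(alpha/2))
    scale_A = np.cos(np.pi * alpha / 4) ** (2 / alpha)
    A = levy_stable.rvs(alpha / 2, 1.0, loc=0, scale=scale_A,
                        size=n, random_state=rng)
    G = rng.standard_normal((n, d)) @ np.linalg.cholesky(2 * np.asarray(Sigma)).T
    return np.sqrt(A)[:, None] * G + np.asarray(delta)
```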
Independent components
If the marginals are independent with $X_{j}\sim S(\alpha,\beta_{j},\gamma_{j},\delta_{j})$, then the characteristic function is
$$\operatorname{E}\exp(iu^{T}X)=\exp\left\{-\sum_{j=1}^{m}\omega(u_{j}\,|\,\alpha,\beta_{j})\,\gamma_{j}^{\alpha}+iu^{T}\delta\right\}$$
Observe that when α = 2 this reduces again to the multivariate normal; note that the iid case and the isotropic case do not coincide when α < 2.
The independent-components case is a special case of a discrete spectral measure (see the next paragraph), with the spectral measure supported on the standard unit vectors.
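Sampling the independent-components case marginal by marginal is straightforward with scipy's levy_stable (whose default parameterization should correspond to the $\omega$-based form above for $\alpha\neq 1$, though this is worth verifying; the function name is mine):

```python
import numpy as np
from scipy.stats import levy_stable

def sample_independent(alpha, betas, gammas, deltas, n, seed=0):
    """Draw n samples of a stable vector with independent marginals
    X_j ~ S(alpha, beta_j, gamma_j, delta_j)."""
    rng = np.random.default_rng(seed)
    cols = [levy_stable.rvs(alpha, b, loc=d, scale=g, size=n, random_state=rng)
            for b, g, d in zip(betas, gammas, deltas)]
    return np.column_stack(cols)
```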
Discrete
If the spectral measure is discrete with mass $\lambda_{j}$ at $s_{j}\in\mathbb{S},\ j=1,\ldots,m,$ then
the characteristic function is
$$\operatorname{E}\exp(iu^{T}X)=\exp\left\{-\sum_{j=1}^{m}\omega(u^{T}s_{j}\,|\,\alpha,1)\,\lambda_{j}^{\alpha}+iu^{T}\delta\right\}$$
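A sketch evaluating this characteristic function for $\alpha\neq 1$, following the $\lambda_{j}^{\alpha}$ weighting in the formula above (names are illustrative):

```python
import numpy as np

def discrete_cf(u, alpha, s, lam, delta):
    """E exp(i u^T X) for a discrete spectral measure with masses lam[j]
    at unit vectors s[j] (rows of s); alpha != 1 branch of omega only."""
    u = np.asarray(u, dtype=float)
    p = s @ u                                       # u^T s_j
    w = np.abs(p) ** alpha * (1 - 1j * np.tan(np.pi * alpha / 2) * np.sign(p))
    return np.exp(-np.sum(w * np.asarray(lam) ** alpha) + 1j * (u @ delta))
```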
Linear properties
If $X\sim S(\alpha,\beta(\cdot),\gamma(\cdot),\delta(\cdot))$ is $d$-dimensional, $A$ is an $m\times d$ matrix, and $b\in\mathbb{R}^{m}$, then $AX+b$ is $m$-dimensional and $\alpha$-stable, with scale function $\gamma(A^{T}\cdot)$, skewness function $\beta(A^{T}\cdot)$, and location function $u\mapsto\delta(A^{T}u)+b^{T}u$.
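In the projection parameterization this closure property is just composition with $A^{T}$; for example, the scale function of $AX$ is $u\mapsto\gamma(A^{T}u)$. A short sketch for a discrete spectral measure (names are illustrative):

```python
import numpy as np

def gamma_of(alpha, s, lam):
    """Return the scale function gamma(.) of X for a discrete spectral
    measure with masses lam[j] at unit vectors s[j] (rows of s)."""
    def gamma(u):
        return np.sum(np.abs(s @ np.asarray(u)) ** alpha * lam) ** (1 / alpha)
    return gamma

# The scale function of AX is the composition u |-> gamma(A^T u):
# gamma_X = gamma_of(alpha, s, lam)
# gamma_AX = lambda u: gamma_X(A.T @ u)
```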
Inference in the independent component model
Recently it was shown how to compute inference in closed form in a linear model (or equivalently a factor analysis model) involving independent component models.
More specifically, let $X_{i}\sim S(\alpha,\beta_{x_{i}},\gamma_{x_{i}},\delta_{x_{i}}),\ i=1,\ldots,n,$ be a set of independent, unobserved univariate stable random variables. Given a known linear relation matrix $A$ of size $n\times n$, the observations $Y_{i}=\sum_{j=1}^{n}A_{ij}X_{j}$ are distributed as a convolution of the hidden factors $X_{i}$: $Y_{i}\sim S(\alpha,\beta_{y_{i}},\gamma_{y_{i}},\delta_{y_{i}})$. The inference task is to compute the most probable $X_{i}$, given the linear relation matrix $A$ and the observations $Y_{i}$. This task can be computed in closed form in $O(n^{3})$ operations.
An application for this construction is multiuser detection with stable, non-Gaussian noise.
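Below is a sketch of the generative (forward) side of this model only; the closed-form inference procedure itself is implemented in the package referenced under Resources and is not reproduced here. The function name and the use of scipy's levy_stable are my own:

```python
import numpy as np
from scipy.stats import levy_stable

def sample_linear_stable_model(A, alpha, betas, gammas, deltas, seed=0):
    """Forward-sample hidden stable factors X_i and observations Y = A X."""
    rng = np.random.default_rng(seed)
    X = np.array([levy_stable.rvs(alpha, b, loc=d, scale=g, random_state=rng)
                  for b, g, d in zip(betas, gammas, deltas)])
    return X, np.asarray(A) @ X
```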
See also
Multivariate Cauchy distribution
Multivariate normal distribution
Resources
Mark Veillette's stable distribution matlab package http://www.mathworks.com/matlabcentral/fileexchange/37514
The plots in this page were plotted using Danny Bickson's inference in linear-stable model Matlab package: https://www.cs.cmu.edu/~bickson/stable