Graphical lasso
In statistics, the graphical lasso is a sparse penalized maximum likelihood estimator for the concentration or precision matrix (inverse of covariance matrix) of a multivariate elliptical distribution. The original variant was formulated to solve Dempster's covariance selection problem for the multivariate Gaussian distribution when observations were limited. Subsequently, the optimization algorithms to solve this problem were improved and extended to other types of estimators and distributions.
Setting
Consider observations $X_{1},X_{2},\ldots ,X_{n}$ from a multivariate Gaussian distribution $X\sim N(0,\Sigma )$. We are interested in estimating the precision matrix $\Theta =\Sigma ^{-1}$.
The graphical lasso estimator is the $\hat{\Theta}$ such that:

$$\hat{\Theta}=\operatorname{argmin}_{\Theta \geq 0}\left(\operatorname{tr}(S\Theta )-\log \det(\Theta )+\lambda \sum _{j\neq k}|\Theta _{jk}|\right)$$

where $S$ is the sample covariance matrix, and $\lambda$ is the penalty parameter.
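To make the objective concrete, the following is a minimal NumPy sketch (the function name and arguments are illustrative, not from any library) that evaluates $\operatorname{tr}(S\Theta )-\log \det(\Theta )+\lambda \sum _{j\neq k}|\Theta _{jk}|$ for a candidate positive-definite $\Theta$:

```python
import numpy as np

def graphical_lasso_objective(theta, s, lam):
    """tr(S Theta) - log det(Theta) + lam * sum over j != k of |Theta_jk|."""
    sign, logdet = np.linalg.slogdet(theta)
    if sign <= 0:
        return np.inf                                  # Theta must be positive definite
    off_diag = theta - np.diag(np.diag(theta))         # zero out the diagonal for the L1 penalty
    return np.trace(s @ theta) - logdet + lam * np.abs(off_diag).sum()

# Example: evaluate the objective at the (unpenalized) inverse sample covariance.
rng = np.random.default_rng(0)
data = rng.standard_normal((100, 4))
s = np.cov(data, rowvar=False)
print(graphical_lasso_objective(np.linalg.inv(s), s, lam=0.1))
```

Minimizing this objective over positive-definite matrices is what the solvers listed below do; larger $\lambda$ drives more off-diagonal entries of $\hat{\Theta}$ to exactly zero.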
Application
In practice, the estimator can be computed with the R package glasso, the GraphicalLasso class in the scikit-learn Python library, or the skggm Python package (which has an interface similar to scikit-learn), as in the sketch below.
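For instance, a minimal scikit-learn sketch (random data used only as a placeholder; the penalty $\lambda$ is called alpha in that library) fits the estimator on an $n \times p$ data matrix:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))   # n = 200 observations of p = 5 variables

model = GraphicalLasso(alpha=0.1)   # alpha plays the role of the penalty lambda
model.fit(X)

theta_hat = model.precision_        # sparse estimate of the precision matrix
sigma_hat = model.covariance_       # corresponding covariance estimate
```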
See also
Graphical model
Lasso (statistics)