Transfer entropy
Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes. Transfer entropy from a process X to another process Y is the amount of uncertainty reduced in future values of Y by knowing the past values of X given past values of Y. More specifically, if
$X_{t}$ and $Y_{t}$ for $t\in \mathbb{N}$ denote two random processes and the amount of information is measured using Shannon's entropy, the transfer entropy can be written as:
$$T_{X\rightarrow Y} = H\left(Y_{t}\mid Y_{t-1:t-L}\right) - H\left(Y_{t}\mid Y_{t-1:t-L},\,X_{t-1:t-L}\right),$$
where $H(X)$ is the Shannon entropy of $X$. The above definition of transfer entropy has been extended to other types of entropy measures, such as Rényi entropy.
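For concreteness, with history length $L=1$ the definition is a difference of two conditional entropies, which (as a sketch of a standard expansion, not spelled out above) can equivalently be written as an expected log-ratio of the two predictive distributions:

$$T_{X\rightarrow Y} = H\left(Y_{t}\mid Y_{t-1}\right) - H\left(Y_{t}\mid Y_{t-1},X_{t-1}\right) = \sum_{y_{t},\,y_{t-1},\,x_{t-1}} p\left(y_{t},y_{t-1},x_{t-1}\right)\log\frac{p\left(y_{t}\mid y_{t-1},x_{t-1}\right)}{p\left(y_{t}\mid y_{t-1}\right)}.$$

The ratio compares how well $Y_{t}$ is predicted with and without the past of $X$, so the quantity is zero exactly when $X_{t-1}$ adds no predictive information beyond $Y_{t-1}$.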
Transfer entropy is conditional mutual information, with the history of the influenced variable
$Y_{t-1:t-L}$ in the condition:
$$T_{X\rightarrow Y} = I\left(Y_{t};\,X_{t-1:t-L}\mid Y_{t-1:t-L}\right).$$
Transfer entropy reduces to Granger causality for vector auto-regressive processes. Hence, it is advantageous when the model assumption of Granger causality does not hold, for example in the analysis of non-linear signals. However, it usually requires more samples for accurate estimation.
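As a sketch of that connection (a standard result for jointly Gaussian, stationary processes, stated here without derivation), the transfer entropy in nats equals half of the logarithmic Granger-causality statistic formed from the residual variances of the restricted and full linear regressions of $Y_{t}$:

$$T_{X\rightarrow Y} = \frac{1}{2}\ln\frac{\operatorname{Var}\left(Y_{t}\mid Y_{t-1:t-L}\right)}{\operatorname{Var}\left(Y_{t}\mid Y_{t-1:t-L},\,X_{t-1:t-L}\right)}.$$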
The probabilities in the entropy formula can be estimated using different approaches (binning, nearest neighbors) or, in order to reduce complexity, using a non-uniform embedding.
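As an illustration of the simplest of these approaches, the sketch below is a minimal plug-in (binning) estimator of $T_{X\rightarrow Y}$ with history length $L=1$, written in Python; the function name, bin count, and the coupling used in the usage example are illustrative assumptions, not part of the original text.

```python
import numpy as np
from collections import Counter

def transfer_entropy_binned(x, y, n_bins=4, lag=1):
    """Plug-in (histogram) estimate of T_{X -> Y} in bits, history length L = 1."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Discretise each series into n_bins equal-width symbols.
    xd = np.digitize(x, np.linspace(x.min(), x.max(), n_bins + 1)[1:-1])
    yd = np.digitize(y, np.linspace(y.min(), y.max(), n_bins + 1)[1:-1])

    # Align the triples (y_t, y_{t-1}, x_{t-1}).
    y_t, y_past, x_past = yd[lag:], yd[:-lag], xd[:-lag]
    n = len(y_t)

    def H(*cols):
        # Joint Shannon entropy (bits) of the symbol columns, plug-in estimate.
        counts = Counter(zip(*cols))
        p = np.array(list(counts.values()), dtype=float) / n
        return -np.sum(p * np.log2(p))

    # T_{X->Y} = H(Y_t | Y_past) - H(Y_t | Y_past, X_past), expanded into joint entropies.
    return H(y_t, y_past) - H(y_past) - H(y_t, y_past, x_past) + H(y_past, x_past)

# Usage sketch: Y is driven by the past of X, but not the other way around.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = np.zeros_like(x)
y[1:] = 0.6 * x[:-1] + 0.4 * rng.normal(size=len(x) - 1)
print(transfer_entropy_binned(x, y))  # clearly positive
print(transfer_entropy_binned(y, x))  # near zero (small positive estimation bias)
```

Plug-in estimates of this kind are biased upward for finite samples, which is part of why nearest-neighbor estimators and non-uniform embeddings are often preferred in practice.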
While it was originally defined for bivariate analysis, transfer entropy has been extended to multivariate forms, either conditioning on other potential source variables or considering transfer from a collection of sources, although these forms again require more samples.
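As an illustration of the conditional form (notation by direct analogy with the bivariate definition above; the extra process $Z$ is hypothetical), conditioning the transfer on a third observed process can be written as a conditional mutual information with both histories in the condition:

$$T_{X\rightarrow Y\mid Z} = I\left(Y_{t};\,X_{t-1:t-L}\mid Y_{t-1:t-L},\,Z_{t-1:t-L}\right).$$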
Transfer entropy has been used to estimate functional connectivity of neurons, social influence in social networks, and statistical causality between armed conflict events.
Transfer entropy is a finite version of the directed information, which was defined in 1990 by James Massey as
$$I(X^{n}\to Y^{n}) = \sum_{i=1}^{n} I\left(X^{i};\,Y_{i}\mid Y^{i-1}\right),$$
where $X^{n}$ denotes the vector $X_{1},X_{2},\ldots,X_{n}$ and $Y^{n}$ denotes $Y_{1},Y_{2},\ldots,Y_{n}$. Directed information plays an important role in characterizing the fundamental limits (channel capacity) of communication channels with or without feedback, and in gambling with causal side information.
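For concreteness, unrolling the sum for $n=2$ gives

$$I(X^{2}\to Y^{2}) = I\left(X_{1};Y_{1}\right) + I\left(X_{1},X_{2};\,Y_{2}\mid Y_{1}\right),$$

which highlights the contrast with transfer entropy: each term may use the source symbol $X_{i}$ from the same time step, whereas transfer entropy conditions only on strictly past values of $X$.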
See also
Directed information
Mutual information
Conditional mutual information
Causality
Causality (physics)
Structural equation modeling
Rubin causal model
References
External links
"Transfer Entropy Toolbox". Google Code., a toolbox, developed in C++ and MATLAB, for computation of transfer entropy between spike trains.
"Java Information Dynamics Toolkit (JIDT)". GitHub. 2019-01-16., a toolbox, developed in Java and usable in MATLAB, GNU Octave and Python, for computation of transfer entropy and related information-theoretic measures in both discrete and continuous-valued data.
"Multivariate Transfer Entropy (MuTE) toolbox". GitHub. 2019-01-09., a toolbox, developed in MATLAB, for computation of transfer entropy with different estimators.