• Source: Additive Markov chain
  • In probability theory, an additive Markov chain is a Markov chain with an additive conditional probability function. Here the process is a discrete-time Markov chain of order m and the transition probability to a state at the next time is a sum of functions, each depending on the next state and one of the m previous states.


    Definition


    An additive Markov chain of order m is a sequence of random variables X1, X2, X3, ..., possessing the following property: the probability that a random variable Xn has a certain value xn under the condition that the values of all previous variables are fixed depends on the values of m previous variables only (Markov chain of order m), and the influence of previous variables on a generated one is additive,




    {\displaystyle \Pr(X_{n}=x_{n}\mid X_{n-1}=x_{n-1},X_{n-2}=x_{n-2},\dots ,X_{n-m}=x_{n-m})=\sum _{r=1}^{m}f(x_{n},x_{n-r},r).}
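The additive structure can be made concrete with a small sketch. The snippet below assembles Pr(Xn = xn | past) as a sum of per-lag terms for a hypothetical binary chain of order m = 2; the values XBAR and F are illustrative assumptions, and the particular f is chosen so that the probabilities of the two candidate states sum to 1.

```python
# Sketch: transition probability of an additive Markov chain as a sum of
# per-lag contributions f(x_next, x_{n-r}, r).  All numbers are illustrative.
XBAR = 0.5              # assumed stationary probability of state 1
F = {1: 0.3, 2: 0.1}    # assumed memory function F(r), r = 1..m
M = len(F)              # order of the chain

def f(x_next, x_prev, r):
    """One additive term; summing f over both values of x_next gives 1/M,
    so the full conditional distribution is normalized."""
    if x_next == 1:
        return XBAR / M + F[r] * (x_prev - XBAR)
    return (1.0 - XBAR) / M - F[r] * (x_prev - XBAR)

def transition_prob(x_next, history):
    """Pr(X_n = x_next | history), where history[r-1] = x_{n-r}."""
    return sum(f(x_next, x_prev, r) for r, x_prev in enumerate(history, 1))

p1 = transition_prob(1, [1, 0])  # history: x_{n-1} = 1, x_{n-2} = 0
p0 = transition_prob(0, [1, 0])
# p1 + p0 == 1 by construction of f
```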



    Binary case


    A binary additive Markov chain is one in which the state space of the chain consists of two values only, Xn ∈ { x1, x2 }; for example, Xn ∈ { 0, 1 }. The conditional probability function of a binary additive Markov chain can be represented as




    {\displaystyle \Pr(X_{n}=1\mid X_{n-1}=x_{n-1},X_{n-2}=x_{n-2},\dots )={\bar {X}}+\sum _{r=1}^{m}F(r)(x_{n-r}-{\bar {X}}),}





    {\displaystyle \Pr(X_{n}=0\mid X_{n-1}=x_{n-1},X_{n-2}=x_{n-2},\dots )=1-\Pr(X_{n}=1\mid X_{n-1}=x_{n-1},X_{n-2}=x_{n-2},\dots ).}


    Here X̄ is the probability of finding Xn = 1 in the sequence, and F(r) is referred to as the memory function. The value of X̄ and the function F(r) together contain all the information about the correlation properties of the Markov chain.
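A minimal simulation illustrates the binary formulas above. The snippet generates a chain of assumed order m = 2; the values of X̄ and F(r) are illustrative assumptions, and the conditional probability is clipped to [0, 1] in case an ill-chosen memory function pushes it outside that range.

```python
import random

def simulate_binary_chain(xbar, F, n_steps, seed=0):
    """Generate a binary additive Markov chain of order m = len(F) using
    Pr(X_n = 1 | past) = xbar + sum_{r=1}^{m} F[r-1] * (x_{n-r} - xbar)."""
    rng = random.Random(seed)
    m = len(F)
    x = [1 if rng.random() < xbar else 0 for _ in range(m)]  # seed history
    for _ in range(n_steps):
        p1 = xbar + sum(F[r - 1] * (x[-r] - xbar) for r in range(1, m + 1))
        p1 = min(max(p1, 0.0), 1.0)  # guard: keep a valid probability
        x.append(1 if rng.random() < p1 else 0)
    return x

chain = simulate_binary_chain(xbar=0.5, F=[0.2, 0.1], n_steps=10_000)
empirical_mean = sum(chain) / len(chain)  # should settle near xbar
```

Because each F(r) weights the deviation of a past value from X̄, a positive memory function produces persistent (positively correlated) sequences, while the empirical mean still converges to X̄.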


    Relation between the memory function and the correlation function
    In the binary case, the correlation function between the variables Xn and Xk of the chain depends only on the distance n − k. It is defined as follows:




    {\displaystyle K(r)=\langle (X_{n}-{\bar {X}})(X_{n+r}-{\bar {X}})\rangle =\langle X_{n}X_{n+r}\rangle -{\bar {X}}^{2},}


    where the symbol ⟨⋯⟩ denotes averaging over all n. By definition,




    {\displaystyle K(-r)=K(r),K(0)={\bar {X}}(1-{\bar {X}}).}
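The correlation function can be estimated directly from a realization of the chain. The helper below is a sketch of that empirical estimator; note that for a binary sequence K(0) reduces to X̄(1 − X̄), because X_n² = X_n when X_n ∈ {0, 1}.

```python
def correlation_function(x, r):
    """Empirical K(r) = <X_n X_{n+r}> - Xbar^2, averaging over all n
    for which both factors lie inside the sequence."""
    xbar = sum(x) / len(x)
    pairs = len(x) - r
    return sum(x[i] * x[i + r] for i in range(pairs)) / pairs - xbar ** 2

seq = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # toy binary sequence, Xbar = 0.6
k0 = correlation_function(seq, 0)       # equals 0.6 * (1 - 0.6) = 0.24
```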


    There is a relation between the memory function and the correlation function of the binary additive Markov chain:




    {\displaystyle K(r)=\sum _{s=1}^{m}K(r-s)F(s),\,\,\,\,r=1,2,\dots \,.}
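For order m = 1 this relation collapses to K(r) = K(r − 1) F(1), so the correlation function decays geometrically from K(0) = X̄(1 − X̄). The sketch below iterates that special case; the values of X̄ and F(1) are illustrative assumptions.

```python
def correlations_order_one(xbar, F1, r_max):
    """Iterate K(r) = K(r-1) * F(1) for an order-1 binary additive chain,
    starting from K(0) = xbar * (1 - xbar)."""
    K = [xbar * (1.0 - xbar)]
    for _ in range(r_max):
        K.append(K[-1] * F1)
    return K

K = correlations_order_one(xbar=0.5, F1=0.4, r_max=5)
# Closed form: K[r] = 0.25 * 0.4**r, a geometric decay in r.
```

For larger m the same recurrence can be run forward once K(0), …, K(m − 1) are known, or inverted to recover the memory function F(r) from a measured correlation function.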



    See also


    Examples of Markov chains





    References


    A.A. Markov. (1906) "Rasprostranenie zakona bol'shih chisel na velichiny, zavisyaschie drug ot druga". Izvestiya Fiziko-matematicheskogo obschestva pri Kazanskom universitete, 2-ya seriya, tom 15, 135–156
    A.A. Markov. (1971) "Extension of the limit theorems of probability theory to a sum of variables connected in a chain". reprinted in Appendix B of: R. Howard. Dynamic Probabilistic Systems, volume 1: Markov Chains. John Wiley and Sons
    S. Hod; U. Keshet (2004). "Phase transition in random walks with long-range correlations". Phys. Rev. E. 70 (1 Pt 2): 015104. arXiv:cond-mat/0311483. Bibcode:2004PhRvE..70a5104H. doi:10.1103/PhysRevE.70.015104. PMID 15324113. S2CID 18169687.
    S.L. Narasimhan; J.A. Nathan; K.P.N. Murthy (2005). "Can coarse-graining introduce long-range correlations in a symbolic sequence?". Europhys. Lett. 69 (1): 22. arXiv:cond-mat/0409042. Bibcode:2005EL.....69...22N. doi:10.1209/epl/i2004-10307-2. S2CID 250845691.
    Ramakrishnan, S. (1981) "Finitely Additive Markov Chains", Transactions of the American Mathematical Society, 265 (1), 247–272 JSTOR 1998493
