
Markov chain


Definition

We begin with a probability space $(\Omega, \mathcal{F}, \mathbb{P})$. Let $I$ be a countable set, $(X_n : n \geq 0)$ a collection of random variables taking values in $I$, $\mathbf{T} = (t_{ij} : i, j \in I)$ a stochastic matrix, and $\lambda$ a probability distribution on $I$. We call $(X_n)_{n \geq 0}$ a Markov chain with initial distribution $\lambda$ and transition matrix $\mathbf{T}$ if:

  1. $X_0$ has distribution $\lambda$.

  2. For $n \geq 0$, $\mathbb{P}(X_{n+1} = i_{n+1} \mid X_0 = i_0, \ldots, X_n = i_n) = t_{i_n i_{n+1}}$.

That is, the next value of the chain depends only on the current value, not on any previous values. This is often summed up in the pithy phrase, “Markov chains have no memory.”
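As a concrete illustration (the two-state matrix below is our own example, not part of the original entry), take

```latex
I = \{1, 2\}, \qquad
\mathbf{T} = \begin{pmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{pmatrix}.
```

Here each row of $\mathbf{T}$ sums to $1$, as required of a stochastic matrix, and the Markov property says, for instance, that $\mathbb{P}(X_{n+1} = 2 \mid X_0 = i_0, \ldots, X_n = 1) = t_{12} = 0.1$ regardless of the earlier values $i_0, \ldots, i_{n-1}$.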

As a special case of (2), we have $\mathbb{P}(X_{n+1} = j \mid X_n = i) = t_{ij}$ for all $i, j \in I$. The values $t_{ij}$ are therefore called the transition probabilities of the Markov chain.
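The definition above translates directly into a simulation: sample $X_0$ from $\lambda$, then repeatedly sample the next state from the row of $\mathbf{T}$ indexed by the current state. The following sketch uses a hypothetical two-state chain (states 0 and 1); the matrix and initial distribution are illustrative choices, not from the original entry.

```python
import random

# Illustrative stochastic matrix: row i gives the distribution of the
# next state when the current state is i. Each row sums to 1.
T = [[0.9, 0.1],
     [0.5, 0.5]]
lam = [1.0, 0.0]   # initial distribution: start in state 0 with certainty

def sample(dist):
    """Draw an index according to a probability vector via inverse transform."""
    u, acc = random.random(), 0.0
    for i, p in enumerate(dist):
        acc += p
        if u < acc:
            return i
    return len(dist) - 1  # guard against floating-point rounding

def simulate(T, lam, n):
    """Simulate n steps: X_0 ~ lam, then X_{k+1} ~ T[X_k]."""
    x = sample(lam)
    path = [x]
    for _ in range(n):
        x = sample(T[x])   # next state depends only on the current state
        path.append(x)
    return path

path = simulate(T, lam, 10)  # a sample trajectory X_0, ..., X_10
```

Note that `simulate` never consults the history `path[:-1]` when drawing the next state; the memorylessness of the chain is built into the single lookup `T[x]`.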

Discussion

Markov chains are arguably the simplest examples of random processes. They come in discrete and continuous versions; the discrete version is presented above.

Title: Markov chain
Canonical name: MarkovChain
Date of creation: 2013-03-22 12:37:32
Last modified on: 2013-03-22 12:37:32
Owner: Mathprof (13753)
Last modified by: Mathprof (13753)
Numerical id: 5
Author: Mathprof (13753)
Entry type: Definition
Classification: msc 60J10
Related topic: HittingTime
Related topic: MarkovChainsClassStructure
Related topic: MemorylessRandomVariable
Related topic: LeslieModel
Defines: Markov chain
Defines: transition matrix
Defines: transition probability