Word: Markov chain
Definition

Markov chain

Consider a stochastic process X_1, X_2, X_3, … in which the state space is discrete. This is a Markov chain if the probability that X_{n+1} takes a particular value depends only on the value of X_n and not on the values of X_1, X_2, …, X_{n-1}. (This definition can be adapted so as to apply to a stochastic process with a continuous state space, or to a more general stochastic process {X(t), t ∈ T}, to give what is called a Markov process.) Examples include random walks and problems in queuing theory.
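
As a concrete illustration (an added sketch, not part of the original entry), the following Python snippet simulates a simple random walk, the classic example of a Markov chain: the next state X_{n+1} is computed from X_n alone, so the Markov property holds by construction. The function name random_walk and its parameters are illustrative, not from the entry.

import random

def random_walk(steps, p=0.5, x0=0):
    """Return the path X_0, X_1, ..., X_steps of a simple random walk."""
    path = [x0]
    for _ in range(steps):
        # Move +1 with probability p, -1 otherwise; the choice depends
        # only on the current position, never on the earlier history.
        step = 1 if random.random() < p else -1
        path.append(path[-1] + step)
    return path

print(random_walk(10))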

Most Markov chains of practical interest are homogeneous, meaning that the probability that X_{n+1} = j given that X_n = i, denoted by p_{ij}, does not depend on n. In that case, if there are N states, these values p_{ij} are called the transition probabilities and form the transition matrix [p_{ij}], an N × N row-stochastic matrix (each row sums to 1). See communicating class, recurrent, stationary distribution.
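
To make the transition-matrix machinery concrete, here is a short NumPy sketch (an added example; the 3-state matrix P is made up, not from the entry). It checks that P is row stochastic, computes n-step transition probabilities as the matrix power P^n, and reads off an approximation to the stationary distribution, which the rows of P^n approach for this chain.

import numpy as np

# Hypothetical 3-state transition matrix: entry P[i][j] is p_{ij}.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

assert np.allclose(P.sum(axis=1), 1.0)  # row stochastic: each row sums to 1

# n-step transition probabilities are the entries of the matrix power P^n.
P10 = np.linalg.matrix_power(P, 10)

# A stationary distribution pi satisfies pi P = pi; because this chain is
# irreducible and aperiodic, every row of P^n converges to pi.
print(P10[0])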
