Word | Markov chain |
Definition | A Markov chain is a sequence of random variables X_0, X_1, X_2, … in which the distribution of X_{n+1} depends only on the value of X_n (the Markov property). Most Markov chains are homogeneous, so the probability that X_{n+1} = j given that X_n = i, denoted by p_{ij}, does not depend on n. In that case, if there are N states, the values p_{ij} are called the transition probabilities and form the transition matrix [p_{ij}], an N × N row stochastic matrix. See communicating class, recurrent, stationary distribution. |
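As an illustration (not part of the original entry), here is a minimal Python sketch of a homogeneous Markov chain; the 3 states and the transition probabilities are assumptions chosen for the example. It builds a row stochastic transition matrix [p_{ij}] and approximates a stationary distribution π, i.e. a row vector with πP = π, by iterating the distribution forward.

```python
import numpy as np

# Hypothetical 3-state homogeneous Markov chain: P[i, j] is the
# transition probability p_ij, and each row sums to 1 (row stochastic).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.7, 0.2],
    [0.2, 0.3, 0.5],
])
assert np.allclose(P.sum(axis=1), 1.0)  # row stochastic check

# A stationary distribution pi satisfies pi @ P = pi. Because every
# entry of P is positive here, repeatedly applying P to any starting
# distribution converges to that stationary distribution.
pi = np.array([1.0, 0.0, 0.0])  # start in state 0 with certainty
for _ in range(1000):
    pi = pi @ P

print(pi)  # approximately the stationary distribution
```

Since πP = π up to numerical error after convergence, further steps leave π unchanged, which matches the defining property of a stationary distribution.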
The mathematics dictionary contains 4,151 entries, covering commonly used mathematical concepts and the translation and usage of English mathematical terms; it is a useful tool for learning mathematics.