English
Noun
Markov chain
- A discrete-time stochastic process with the Markov property.
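The Markov property named in the definition has a standard formal statement; as a sketch of the usual discrete-time formulation (symbols are illustrative, not part of the entry):

```latex
P(X_{n+1} = x \mid X_n = x_n, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)
```

That is, the distribution of the next state depends only on the current state, not on the earlier history.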
Translations
Czech: Markovský řetězec m
Estonian: Markovi ahel
Finnish: Markov-ketju, Markovin ketju
Polish: łańcuch Markowa m
See also
Wikipedia article on Markov chains (w:Markov chain)
Category:English eponyms