Markov Chains
Synonyms
Chain, Markov
Chains, Markov
Markov Chain
Markov Process
Markov Processes
Process, Markov
Processes, Markov
A stochastic process in which the conditional probability distribution of the state at any future instant, given the present state, is unaffected by any additional knowledge of the past history of the system.
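The memoryless property in the definition above can be illustrated with a small simulation. The two-state "weather" chain below is a hypothetical example (the states and transition probabilities are invented for illustration): each step samples the next state from a distribution that depends only on the current state, never on the path taken to reach it.

```python
import random

# Hypothetical two-state chain used purely for illustration;
# the transition probabilities are made up.
STATES = ["sunny", "rainy"]

# TRANSITION[s] gives P(next state | current state = s); each row sums to 1.
TRANSITION = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state. It depends only on the current state,
    not on how the chain arrived there -- the Markov property."""
    probs = TRANSITION[state]
    return rng.choices(STATES, weights=[probs[s] for s in STATES])[0]

def simulate(start, n_steps, seed=0):
    """Generate a trajectory of n_steps transitions from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5, seed=1))
```

Because `step` consults only `path[-1]`, conditioning on any longer history would not change the distribution of the next state, which is exactly the defining property of a Markov chain.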