or Mar·koff chain
- a Markov process restricted to discrete random events or to discontinuous time sequences.
Origin of Markov chain
First recorded in 1940–45; see origin at Markov process
Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2018
- statistics: a sequence of events in which the probability of each event depends only on the event immediately preceding it
C20: named after Andrei Markov (1856–1922), Russian mathematician
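Both definitions above describe the Markov property: the next event depends only on the current one, not on the rest of the history. A minimal illustrative sketch in Python (the two-state "weather" chain and all names here are invented for illustration, not part of the dictionary entry):

```python
import random

# Hypothetical two-state chain: the next state depends only on the current state.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Pick the next state using only the current state's transition row."""
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, steps, seed=0):
    """Generate one realization of the chain, starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain
```

Note that `next_state` receives only the current state, which is exactly the restriction the definition names: earlier states in the sequence never enter the probability calculation.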