
Markov chain

or Mar·koff chain

[mahr-kawf]
noun Statistics.
  1. a Markov process restricted to discrete random events or to discontinuous time sequences.

Origin of Markov chain

First recorded in 1940–45; see origin at Markov process
Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2018

British Dictionary definitions for Markov chain

Markov chain

noun
  1. statistics a sequence of events, the probability of each of which depends only on the event immediately preceding it

Word Origin

C20: named after Andrei Markov (1856–1922), Russian mathematician
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012