Markov chain

or Mar·koff chain

[ mahr-kawf ]

noun, Statistics.
  1. a Markov process restricted to discrete random events or to discontinuous time sequences.
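
In symbols, this "memoryless" restriction for a discrete-time chain with states X_0, X_1, X_2, … is usually written as the Markov property (a standard formulation, given here only as an illustration):

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)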


Markov chain

/ ˈmɑːkɒf /

noun

  1. statistics a sequence of events the probability for each of which is dependent only on the event immediately preceding it
“Collins English Dictionary — Complete & Unabridged” 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
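
A minimal sketch of such a sequence, assuming a made-up two-state weather model (the states, probabilities, and names below are illustrative only, not drawn from either dictionary entry):

import random

# Hypothetical two-state chain: the next state depends only on the current
# state, never on the earlier history of the sequence.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    # Sample the next state using only the current state's transition row.
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

state = "sunny"
chain = [state]
for _ in range(10):
    state = step(state)
    chain.append(state)
print(chain)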


Word History and Origins

Origin of Markov chain

First recorded in 1940–45; Markov process
C20: named after Andrei Markov (1856–1922), Russian mathematician
