Dictionary.com

Markov chain

or Mar·koff chain

[ mahr-kawf ]
/ ˈmɑr kɔf /

noun Statistics.
a Markov process restricted to discrete random events or to discontinuous time sequences.
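In symbols (a standard textbook formulation, added here for illustration rather than taken from the entry), a sequence of random variables X_1, X_2, … forms a Markov chain when the next state depends only on the current one:

P(X_{n+1} = x \mid X_n = x_n, \ldots, X_1 = x_1) = P(X_{n+1} = x \mid X_n = x_n)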
Origin of Markov chain

First recorded in 1940–45; see origin at Markov process
Dictionary.com Unabridged Based on the Random House Unabridged Dictionary, © Random House, Inc. 2022


British Dictionary definitions for Markov chain

Markov chain
/ (ˈmɑːkɒf) /

noun
statistics a sequence of events the probability for each of which is dependent only on the event immediately preceding it
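
To make this definition concrete, here is a minimal sketch in Python; the two "weather" states and their transition probabilities are illustrative assumptions, not part of the entry. Each draw depends only on the current state, as the definition requires.

```python
import random

# Illustrative two-state Markov chain: the next state is drawn using only
# the current state's row of transition probabilities (assumed values).
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state from the current state's transition probabilities."""
    probs = transitions[state]
    return random.choices(list(probs), weights=list(probs.values()))[0]

state = "sunny"
chain = [state]
for _ in range(10):          # simulate ten discrete time steps
    state = step(state)
    chain.append(state)

print(" -> ".join(chain))
```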

Word Origin for Markov chain

C20: named after Andrei Markov (1856–1922), Russian mathematician
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012