Markov chain

American  
[mahr-kawf] / ˈmɑr kɔf /
Or Markoff chain

noun

Statistics.
  1. a Markov process restricted to discrete random events or to discontinuous time sequences.


Markov chain British  
/ ˈmɑːkɒf /

noun

  1. statistics a sequence of events the probability for each of which is dependent only on the event immediately preceding it

"Collins English Dictionary — Complete & Unabridged" 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
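The definitions above can be illustrated with a short simulation. The sketch below uses a hypothetical two-state weather model (the states, probabilities, and function names are invented for illustration, not drawn from the dictionary entry); the key point is that the next state is chosen using only the current state, which is the "dependent only on the event immediately preceding it" property from the Collins definition.

```python
import random

# Hypothetical two-state weather model, purely for illustration.
# Each row gives the probabilities of the next state given ONLY
# the current state -- the defining Markov property.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def run_chain(start, n, seed=0):
    """Generate a length-(n+1) realization of the chain from `start`."""
    rng = random.Random(seed)
    state, path = start, [start]
    for _ in range(n):
        nxt = list(TRANSITIONS[state])
        weights = [TRANSITIONS[state][s] for s in nxt]
        # The choice depends only on `state`, never on earlier history.
        state = rng.choices(nxt, weights=weights)[0]
        path.append(state)
    return path

print(run_chain("sunny", 5))
```

Because each step discards all history except the current state, the whole sequence is a Markov chain in the sense of both definitions above: a discrete-time, discrete-state Markov process.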

Etymology

Origin of Markov chain

First recorded in 1940–45; see origin at Markov process

Example Sentences


These rules could be decomposed into two sets that dominate at distinct length scales -- Markov chain and random nuclei.

From Science Daily • Dec. 4, 2023

My ego would like me to believe that my writing process is a little more complicated than a Markov chain.

From The Verge • May 9, 2018

Bayesian Markov chain Monte Carlo analysis was run for 100 million steps, 10% of which were removed as burn-in and sampled every 10,000 steps.

From Nature • May 12, 2015