
Markov chain

or Markoff chain

[mahr-kawf] /ˈmɑr kɔf/
noun, Statistics.
a Markov process restricted to discrete random events or to discontinuous time sequences.
Origin of Markov chain
First recorded in 1940–45; see origin at Markov process
Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2017.
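The property described in the definition above — that the process moves through discrete states, with each step depending only on the current state — can be sketched as a short simulation. The weather states and transition probabilities below are invented for illustration; only the structure (next state chosen from the current state alone) reflects the definition.

```python
import random

# Hypothetical two-state model; each row gives the transition
# probabilities out of a state, and they depend only on that state
# (the Markov property).
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state using only the current state."""
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights)[0]

def walk(start, n):
    """Simulate an n-step chain from the given start state."""
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1]))
    return chain
```

Because no history beyond the current state is consulted, `walk("sunny", 10)` produces a sequence in which each entry is conditioned solely on its immediate predecessor.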
British Dictionary definitions for Markov chain

Markov chain

(statistics) a sequence of events the probability for each of which is dependent only on the event immediately preceding it
Word Origin
C20: named after Andrei Markov (1856–1922), Russian mathematician
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition
© William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins
Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
