
Markov chain

or Mar·koff chain

[ mahr-kawf ]
/ ˈmɑr kɔf /

noun Statistics.

a Markov process restricted to discrete random events or to discontinuous time sequences.
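In formal terms (notation assumed for illustration, not part of the dictionary text), writing X_0, X_1, X_2, … for the chain's successive states, the defining property is

\[
P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n).
\]

That is, the next state depends on the present state alone, not on the fuller history.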



Origin of Markov chain

First recorded in 1940–45; see origin at Markov process
Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2021


British Dictionary definitions for Markov chain

Markov chain
/ (ˈmɑːkɒf) /

noun

statistics a sequence of events in which the probability of each event depends only on the event immediately preceding it
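As an illustration of that dependence on only the immediately preceding event, here is a minimal Python sketch of a two-state weather chain; the states, transition probabilities, and function names are invented for the example and are not part of either dictionary definition.

    import random

    # Hypothetical two-state chain: today's weather depends only on
    # yesterday's. Transition probabilities are made up for illustration.
    TRANSITIONS = {
        "sunny": {"sunny": 0.8, "rainy": 0.2},
        "rainy": {"sunny": 0.4, "rainy": 0.6},
    }

    def next_state(current):
        """Sample the next state from the current state alone (the Markov property)."""
        states = list(TRANSITIONS[current])
        weights = [TRANSITIONS[current][s] for s in states]
        return random.choices(states, weights=weights)[0]

    def simulate(start, steps):
        """Generate a sequence of states; each step looks back exactly one event."""
        state = start
        history = [state]
        for _ in range(steps):
            state = next_state(state)
            history.append(state)
        return history

    print(simulate("sunny", 7))  # e.g. ['sunny', 'sunny', 'rainy', ...]

Note that simulate never consults anything older than the current state, which is exactly the restriction the definition describes.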

Word Origin for Markov chain

C20: named after Andrei Markov (1856–1922), Russian mathematician
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012