Markov chain

or Mar·koff chain

[ mahr-kawf ]

noun, Statistics.

  1. a Markov process restricted to discrete random events or to discontinuous time sequences.
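
In symbols, the defining property of such a discrete-time chain is commonly written as below; the notation X_n for the state at step n is assumed here for illustration:

$$ P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n) $$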


Markov chain

/ ˈmɑːkɒf /

noun

  1. statistics a sequence of events the probability for each of which is dependent only on the event immediately preceding it
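
For illustration only (the states and probabilities below are assumed for the example, not part of the entry), such a chain can be simulated in a few lines of Python, with each step drawn using only the immediately preceding state:

```python
# Illustrative sketch: a two-state Markov chain ("sunny"/"rainy" are assumed
# example states). The next state is sampled using only the current state,
# which is exactly the property described in the definition above.
import random

# P(next state | current state)
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Sample the next state from the transition probabilities for `state`."""
    nxt = transitions[state]
    return random.choices(list(nxt.keys()), weights=list(nxt.values()))[0]

state = "sunny"
chain = [state]
for _ in range(10):
    state = step(state)
    chain.append(state)
print(chain)  # e.g. ['sunny', 'sunny', 'rainy', ...]
```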



Word History and Origins

Origin of Markov chain

First recorded in 1940–45; see Markov process


C20: named after Andrei Markov (1856–1922), Russian mathematician
