Markov process

Or Markoff process

noun

Statistics.
  1. a process in which future values of a random variable are statistically determined by the present state and depend only on the event immediately preceding.




Word History and Origins

Origin of Markov process

1935–40; after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it

Example Sentences

Examples are provided to illustrate real-world usage of words in context. Any opinions expressed do not reflect the views of Dictionary.com.

“A Markov process is where you have a sequence of numbers or letters or notes, and the probability of any particular note depends only on the few notes that have come before,” said Kershenbaum.

Read more on Washington Post

A “Markov process” or “Markov chain” is a sequence of random states in which the probability of what comes at the next time step depends only on the current state and not on anything earlier.

Read more on Scientific American
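The property the quotations describe, that the next state depends only on the current state, can be sketched as a tiny simulation. The two-state "weather" chain below is purely illustrative; the state names and transition probabilities are made up for the example.

```python
import random

# Hypothetical two-state Markov chain. Each row gives the probability
# of moving to each next state, conditioned only on the current state
# (the Markov property: no earlier history is consulted).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Sample the next state using only the current state."""
    r = rng.random()
    cumulative = 0.0
    for state, p in TRANSITIONS[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # fallback for floating-point rounding

def simulate(start, steps, seed=0):
    """Run the chain for `steps` transitions from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Note that `next_state` receives only the current state, never the full history; that restriction is exactly what makes the sequence a Markov chain rather than a general random process.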

