
Markov process

or Markoff process

noun

Statistics.
  1. a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding.


Etymology

Origin of Markov process

1935–40; after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it

Example Sentences


“A Markov process is where you have a sequence of numbers or letters or notes, and the probability of any particular note depends only on the few notes that have come before,” said Kershenbaum.

From Washington Post

A “Markov process” or “Markov chain” is a sequence of random states in which the probability of what comes at the next time step depends only on the current state and not on anything earlier.

From Scientific American
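The "depends only on the current state" property in the definition and quotes above can be sketched in a few lines of Python. The weather states and transition probabilities below are made-up values for illustration, not drawn from any source:

```python
import random

# Illustrative transition probabilities (hypothetical values):
# the distribution of the next state depends only on the current state,
# not on any earlier history -- the Markov property.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Pick the next state using only the current state."""
    states = list(transitions[state])
    weights = [transitions[state][s] for s in states]
    return random.choices(states, weights=weights)[0]

def simulate(start, n):
    """Run the chain for n steps, returning the full sequence of states."""
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 10))
```

Because each call to `step` consults only the current state, the simulated sequence is a Markov chain in the sense of the definition above.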