
Markov process

or Markoff process

noun Statistics.
  1. a process in which future values of a random variable are statistically determined by the present state and depend only on the event immediately preceding it.
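
As an illustration of that defining property, here is a minimal Python sketch of a two-state Markov process; the transition probabilities and the "weather" states are invented for the example, and the point is only that the next state is sampled from the current state alone, never from earlier history.

```python
import random

# Hypothetical transition probabilities for a two-state weather model:
# each row gives P(next state | current state).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next state using only the current state (the Markov property)."""
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

def simulate(start, steps):
    """Generate a sample path; earlier states are never consulted."""
    path = [start]
    for _ in range(steps):
        path.append(next_state(path[-1]))
    return path

print(simulate("sunny", 10))
```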

Origin of Markov process

1935–40; after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it
Dictionary.com Unabridged, based on the Random House Unabridged Dictionary, © Random House, Inc. 2018