Definition of Markov chain in English:

Markov chain

(also Markov model)

Pronunciation: /ˈmärˌkôf/, /-ˌkôv/, /ˈmɑrkɑv/


  • A stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event.

    • ‘Like the previously discussed models, Markov models have serious limitations.’
    • ‘This model represents a Markov chain in which each state is interpreted as the probability that the switch complex is in the corresponding state.’
    • ‘He applied a technique involving so-called Markov chains to calculate the required probabilities over the course of a long game with many battles.’
    • ‘He also took up new topics, writing several papers on probability theory, in particular on Markov chains.’
    • ‘Ecologists have used simple diffusion, correlated random walk, and Markov chain models to describe dispersal data for various insects.’
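The defining property above (each next state depends only on the current state, not on the earlier history) can be sketched in Python. The two-state "weather" chain and its transition probabilities below are illustrative assumptions, not part of the entry:

```python
import random

# Hypothetical two-state chain: transition probabilities are assumptions
# chosen only to illustrate the definition.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps):
    """Walk the chain: the next state is drawn using only the current state."""
    state = start
    path = [state]
    for _ in range(steps):
        probs = transitions[state]
        state = random.choices(list(probs), weights=list(probs.values()))[0]
        path.append(state)
    return path

print(simulate("sunny", 5))
```

Each row of the transition table sums to 1, so every step is a valid probability draw; the chain's "memorylessness" is visible in `simulate`, which consults only `state`, never the earlier entries of `path`.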


Origin: 1930s; named after Andrei A. Markov (1856–1922), Russian mathematician.