Markov chain


In probability theory and related fields, a Markov process (or Markoff process), named after the Russian mathematician Andrey Markov, is a stochastic process that satisfies the Markov property (sometimes characterized as "memorylessness"). Loosely speaking, a process satisfies the Markov property if one can make predictions for the future of the process based solely on its present state just as well as one could knowing the process's full history; that is, conditional on the present state of the system, its future and past states are independent.

A Markov chain is a type of Markov process that has either discrete state space or discrete index set (often representing time), but the precise definition of a Markov chain varies. For example, it is common to define a Markov chain as a Markov process in either discrete or continuous time with a countable state space (thus regardless of the nature of time), but it is also common to define a Markov chain as having discrete time in either countable or continuous state space (thus regardless of the state space).
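The discrete-time, countable-state-space case can be illustrated with a short simulation. The following is a minimal sketch, not drawn from the article: the two-state "weather" chain, its transition probabilities, and the function names are all illustrative assumptions.

```python
import random

# Illustrative transition probabilities for a two-state chain
# (states and numbers are hypothetical, chosen only for the example).
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Sample the next state using only the current state (the Markov property)."""
    states, weights = zip(*transitions[state])
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, n_steps, seed=0):
    """Generate a sample path of length n_steps + 1 from the chain."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Note that `step` never looks at earlier states: the next state depends on the present state alone, which is exactly the memorylessness the definition describes.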

Andrey Markov studied Markov processes in the early 20th century, publishing his first paper on the topic in 1906, but earlier uses of Markov processes already existed. Random walks on the integers and the gambler's ruin problem are examples of Markov processes and were studied hundreds of years earlier. Two important examples of Markov processes are the Wiener process, also known as the Brownian motion process, and the Poisson process, which are considered the most important and central stochastic processes in the theory of stochastic processes, and were discovered repeatedly and independently, both before and after 1906, in various settings. These two processes are Markov processes in continuous time, while random walks on the integers and the gambler's ruin problem are examples of Markov processes in discrete time.
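The gambler's ruin problem mentioned above can be sketched as a discrete-time Markov chain on the integers. This is an assumed minimal formulation (the function name, parameters, and win probability `p` are illustrative): the gambler's fortune moves up or down by 1 at each step and the chain is absorbed at 0 (ruin) or at the target amount.

```python
import random

def gamblers_ruin(start, goal, p=0.5, seed=0):
    """Run one gambler's ruin trajectory.

    The fortune is a Markov chain on {0, 1, ..., goal}: at each step it
    increases by 1 with probability p and decreases by 1 otherwise,
    stopping when it hits the absorbing states 0 or goal.
    """
    rng = random.Random(seed)
    fortune = start
    while 0 < fortune < goal:
        fortune += 1 if rng.random() < p else -1
    return fortune  # either 0 (ruin) or goal

print(gamblers_ruin(3, 10))
```

As with any Markov chain, each step here depends only on the current fortune, not on how that fortune was reached.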


Wikipedia