What do all the Markov-type models have in common? All of them are built on the same underlying stochastic process, better known as the Markov chain. This section gives a brief review of the fundamental ideas and properties of the Markov chain.
Stochastic processes are frequently assumed to be independent and identically distributed (i.i.d.). Borrowing the dice analogy, this assumption means that no matter what the past rolls were, they do not affect the probability of the next roll. The Markov chain does not assume i.i.d.; instead, it defines a dependence structure in which the probability of the next state depends only on the current state, not on the full history that led to it.
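To make the contrast concrete, the following minimal sketch simulates a small Markov chain in Python. The two-state weather example, the state names, and the transition matrix `P` are hypothetical choices for illustration; the key point is that each next state is drawn using only the current state's row of `P`.

```python
import numpy as np

# Hypothetical two-state chain: P[i, j] is the probability of moving
# from state i to state j.
states = ["sunny", "rainy"]
P = np.array([
    [0.8, 0.2],   # sunny -> sunny, sunny -> rainy
    [0.4, 0.6],   # rainy -> sunny, rainy -> rainy
])

rng = np.random.default_rng(seed=42)

def simulate_chain(P, start_state, n_steps, rng):
    """Sample a trajectory: each next state depends only on the current one."""
    trajectory = [start_state]
    current = start_state
    for _ in range(n_steps):
        # The Markov property: the history before `current` is irrelevant.
        current = rng.choice(len(P), p=P[current])
        trajectory.append(current)
    return trajectory

path = simulate_chain(P, start_state=0, n_steps=10, rng=rng)
print([states[s] for s in path])
```

Compare this with an i.i.d. process, where every draw would use the same fixed probabilities regardless of the current state; here the row of `P` that is used changes with the state just visited.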