12.7 Discrete-Time Markov Chains

The discrete-time process $\{X_k,\ k = 0, 1, 2, \ldots\}$ is called a Markov chain if for all $i, j, k, \ldots, m$, the following is true:

$$P[X_k = j \mid X_{k-1} = i, X_{k-2} = \alpha, \ldots, X_0 = \theta] = P[X_k = j \mid X_{k-1} = i] = p_{ijk}$$

The quantity $p_{ijk}$ is called the state transition probability, which is the conditional probability that the process will be in state $j$ at time $k$ immediately after the next transition, given that it is in state $i$ at time $k - 1$. A Markov chain whose transition probabilities depend on the time index $k$ in this way is called a non-homogeneous Markov chain. In this book we will consider only homogeneous Markov chains, which are Markov chains in which $p_{ijk} = p_{ij}$ for all $k$. This means that in a homogeneous Markov chain the transition probabilities do not depend on the time index.
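As a sketch of these definitions, the snippet below simulates a homogeneous Markov chain: the next state depends only on the current state, through a fixed transition matrix whose row $i$ holds the probabilities $p_{ij}$. The two-state matrix used here is a hypothetical example, not one from the text.

```python
import random

# Hypothetical two-state chain; entry P[i][j] is the transition
# probability p_ij. Each row must sum to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, P, rng):
    """Draw the next state j with probability p_ij = P[state][j]."""
    u = rng.random()
    cumulative = 0.0
    for j, p in enumerate(P[state]):
        cumulative += p
        if u < cumulative:
            return j
    return len(P) - 1  # guard against floating-point round-off

def simulate(x0, n, P, seed=0):
    """Simulate n transitions of a homogeneous Markov chain from x0.

    The next state is drawn using only the current state, which is
    exactly the Markov property: the history before time k-1 is ignored.
    """
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1], P, rng))
    return path

path = simulate(0, 10, P)
```

Because the chain is homogeneous, the same matrix `P` is used at every step; a non-homogeneous chain would instead pass a different matrix for each time index $k$.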
