Chapter 7 Discrete-Time Markov Chains

7.1 Introduction

A Markov process is a stochastic process whose dynamic behavior is such that probability distributions for its future development depend only on the present state and not on how the process arrived in that state. If we assume that the state space, I, is discrete (finite or countably infinite), then the Markov process is known as a Markov chain. If we further assume that the parameter space, T, is also discrete, then we have a discrete-time Markov chain (DTMC). Such processes are the subject of this chapter. Since the parameter space is discrete, we will let T = {0, 1, 2, ...} without loss of generality.

We choose to observe the state of a system at a discrete set of time points. The successive observations define the random variables X0, X1, X2, ..., Xn, ..., at time steps 0, 1, 2, ..., n, ..., respectively. If Xn = j, then the state of the system at time step n is j. X0 is the initial state of the system. The Markov property can then be succinctly stated as

P(Xn+1 = j | Xn = i, Xn−1 = in−1, ..., X0 = i0) = P(Xn+1 = j | Xn = i).

Intuitively, the Markov property says that, given the present state Xn, the conditional distribution of the next state Xn+1 is independent of the past history X0, X1, ..., Xn−1.
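The Markov property can be illustrated by simulation: to generate the next state, only the current state and the one-step transition probabilities are needed, never the earlier history. The following is a minimal sketch (not from the book) using a hypothetical two-state transition matrix P, where P[i][j] is the probability of moving from state i to state j in one time step.

```python
import random

# Hypothetical one-step transition matrix for a two-state DTMC.
# Each row must sum to 1 (it is a probability distribution over next states).
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(i, rng):
    """Sample X_{n+1} given X_n = i.

    Only the current state i is consulted -- this is exactly the
    Markov property: the past trajectory plays no role here.
    """
    return 0 if rng.random() < P[i][0] else 1

def simulate(x0, n_steps, seed=0):
    """Generate the sample path X0, X1, ..., X_{n_steps}."""
    rng = random.Random(seed)
    xs = [x0]
    for _ in range(n_steps):
        xs.append(step(xs[-1], rng))
    return xs

path = simulate(x0=0, n_steps=10)
```

Here `simulate` returns a list of n_steps + 1 states beginning with the initial state X0; the matrix and state labels are illustrative assumptions, not taken from the text.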
