# Chapter 7: Discrete-Time Markov Chains

## 7.1 Introduction

A **Markov process** is a stochastic process whose dynamic behavior is such that probability distributions for its future development depend only on the present state and not on how the process arrived in that state. If we assume that the state space, *I*, is discrete (finite or countably infinite), then the Markov process is known as a **Markov chain**. If we further assume that the parameter space, *T*, is also discrete, then we have a **discrete-time Markov chain** (DTMC). Such processes are the subject of this chapter. Since the parameter space is discrete, we will let *T* = {0, 1, 2, …} without loss of generality.

We choose to observe the state of a system at a discrete set of time points. The successive observations define the random variables *X*_{0}, *X*_{1}, *X*_{2}, …, *X*_{n}, …, at time steps 0, 1, 2, …, *n*, …, respectively. If *X*_{n} = *j*, then the state of the system at time step *n* is *j*; *X*_{0} is the initial state of the system. The Markov property can then be succinctly stated as

$$
P(X_n = j \mid X_{n-1} = i_{n-1},\, X_{n-2} = i_{n-2},\, \ldots,\, X_0 = i_0) = P(X_n = j \mid X_{n-1} = i_{n-1}).
$$

Intuitively, this equation states that the probability distribution of the state at time step *n* depends only on the state at time step *n* − 1, and not on the states the chain passed through on its way to that state.
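The Markov property makes a DTMC easy to simulate: to generate *X*_{n}, we only need the current state and the one-step transition probabilities out of it. The following sketch illustrates this with an illustrative two-state chain; the matrix `P`, the state labels, and the helper name `simulate_dtmc` are assumptions for the example, not part of the text.

```python
import random

def simulate_dtmc(P, start, n_steps, rng=None):
    """Simulate n_steps transitions of a discrete-time Markov chain.

    P[i][j] is the one-step probability of moving from state i to
    state j (each row of P must sum to 1). Returns the sample path
    [X_0, X_1, ..., X_{n_steps}] as a list of state indices.
    """
    rng = rng or random.Random()
    path = [start]
    current = start
    for _ in range(n_steps):
        # Draw the next state using only the current state's row of P,
        # which is exactly the Markov property in action.
        u = rng.random()
        cumulative = 0.0
        for j, p in enumerate(P[current]):
            cumulative += p
            if u < cumulative:
                current = j
                break
        path.append(current)
    return path

# Hypothetical two-state chain (e.g., 0 = "up", 1 = "down").
P = [[0.9, 0.1],
     [0.5, 0.5]]
path = simulate_dtmc(P, start=0, n_steps=1000, rng=random.Random(42))
```

Note that the simulator never inspects `path[:-1]` when choosing the next state; the entire history is summarized by `current`, mirroring the conditional-independence statement above.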
