Discrete-Time Markov Chains

4.1 Introduction

The discrete-time process $\{X_k,\ k = 0, 1, 2, \ldots\}$ is called a Markov chain if for all states $i, j, n, \ldots, m$ and all times $k$, the following is true:

$$P[X_k = j \mid X_{k-1} = i, X_{k-2} = n, \ldots, X_0 = m] = P[X_k = j \mid X_{k-1} = i] = p_{ijk} \quad (4.1)$$

The quantity $p_{ijk}$ is called the state-transition probability, which is the conditional probability that the process will be in state $j$ at time $k$ immediately after the next transition, given that it is in state $i$ at time $k-1$.
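As an illustrative sketch of the definition in Eq. (4.1), the next state can be sampled using only the current state and a state-transition matrix. The two-state chain and its probabilities below are hypothetical examples, not taken from the text.

```python
import random

# Hypothetical 2-state chain (states 0 and 1); each row of P holds the
# transition probabilities out of one state and sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def step(state, rng):
    """Sample the next state. The distribution depends only on the
    current state -- the Markov property of Eq. (4.1)."""
    return 0 if rng.random() < P[state][0] else 1

def simulate(x0, n, seed=0):
    """Generate a sample path X_0, X_1, ..., X_n starting from x0."""
    rng = random.Random(seed)
    path = [x0]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate(0, 10))
```

Because the transition rule consults only the most recent state, the full history $X_{k-2}, \ldots, X_0$ never enters the computation, mirroring the right-hand side of Eq. (4.1).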
