2. Markov Chains
If the present is born of the past, then the present is pregnant with the future.
Voltaire (1694–1778)
In this chapter, we review Markov chains for discrete time and Markov processes for continuous time. The previous chapter was limited to series of independent events; here we observe the evolution over time of a system whose successive states are correlated and satisfy the Markov property: its state at time n + 1 depends only on its state at time n. Given its present state, the future state of the system does not depend on its past states.
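Formally, for a system observed at integer times, this property can be written as follows (a standard formulation of the Markov property, stated here in the chapter's notation for reference):

P[X(n + 1) = j | X(n) = i, X(n − 1) = i_{n−1}, …, X(0) = i_0] = P[X(n + 1) = j | X(n) = i].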
2.1. Markov chains in discrete time
2.1.1. Definitions
2.1.1.1. Random process
DEFINITION 2.1.– A random process is the representation of the evolution of a random variable over a period of time.
In other words, a random process is a family of random variables X(t) ∈ 𝒳 indexed by time t ∈ 𝒯. When the set 𝒯 is countable, we say that the process is discrete.
The state X(t) of a dynamic system that evolves randomly over time can be modeled by a random variable. Informally, such a system is described by a random process.
DEFINITION 2.2.– Informally, a random process is a dynamic system whose state changes over a period of time in a random manner.
From now on, we consider 𝒳 = {0, …, n} ⊂ ℕ and 𝒯 = ℕ; that is, the possible state of the system is an integer between 0 and n, and time t is a non-negative integer.
We denote by π_i(t) the probability that the system is in state X(t) = i at instant t.
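To make these definitions concrete, here is a minimal Python sketch (illustrative only; the three-state transition matrix P below is an assumption for the example, not taken from this book) that simulates such a discrete-time Markov chain on 𝒳 = {0, …, n} and estimates π_i(t) by Monte Carlo, comparing the result with the exact distribution obtained from powers of P:

import numpy as np

# Illustrative 3-state transition matrix; each row sums to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

def step(x, rng):
    # The next state depends only on the current state x (Markov property).
    return rng.choice(len(P), p=P[x])

def estimate_pi(t, x0=0, runs=100_000, seed=0):
    """Monte Carlo estimate of pi_i(t) = P(X(t) = i | X(0) = x0) for all i."""
    rng = np.random.default_rng(seed)
    counts = np.zeros(len(P))
    for _ in range(runs):
        x = x0
        for _ in range(t):
            x = step(x, rng)
        counts[x] += 1
    return counts / runs

# Exact distribution: pi(t) is row x0 of the t-th power of P.
t = 10
print("simulated:", estimate_pi(t))
print("exact:    ", np.linalg.matrix_power(P, t)[0])

The two vectors agree up to Monte Carlo error, illustrating that the distribution π(t) is obtained by applying the transition matrix t times to the initial distribution.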