# 2 Markov Chains

This chapter focuses on Markov chains by summarizing the key concepts and results. In 1907, A.A. Markov introduced a new type of stochastic process in which the outcome of one experiment can influence the outcome of the next. Today, Markov chains are used extensively in many areas of stochastic modeling.

In practice, the current state of a physical system can be specified by giving the values of a number of variables that describe the system. Some examples are:

1. The state of a chemical system can be specified by giving the values of pressure, volume, and temperature.
2. The state of a machine can be described by specifying whether it is active or inactive.
3. The state of a reservoir is specified in terms of its water level.

A physical system undergoes changes over a period of time. Returning to the examples above, the pressure in a chemical system may change over time; the water level in a reservoir may rise, fall, or remain the same from one day to the next. Such changes are referred to as transitions. Suppose the states that a system can occupy at any given time are known only probabilistically. Furthermore, suppose that the probability structure is such that the future state of the system depends only on the current state and not on the past. Then such a system has a very nice mathematical structure that enables one to study the system in more depth. The study of Markov processes is very worthwhile as they arise naturally in stochastic ...
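The machine example above can be sketched as a minimal simulation. This is a hedged illustration, not material from the text: the two states ("active", "inactive") come from the chapter's example, but the transition probabilities in `P` are illustrative assumptions. The key point is that `step` samples the next state using only the current state, which is exactly the Markov property described above.

```python
import random

# Hypothetical two-state Markov chain for the machine example.
# The transition probabilities below are illustrative assumptions.
P = {
    "active":   {"active": 0.9, "inactive": 0.1},
    "inactive": {"active": 0.5, "inactive": 0.5},
}

def step(state, rng):
    """Sample the next state given ONLY the current state (Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point round-off

def simulate(start, n, seed=0):
    """Generate a path of n transitions starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("active", 5, seed=1))
```

Because each call to `step` looks only at `path[-1]`, the sampled trajectory depends on the past only through the present state, mirroring the dependence structure the text describes.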
