# Chapter 25. Stochastic Modeling

Stochastic modeling refers to a collection of advanced probability tools for studying not a single random variable, but rather a *process* that unfolds randomly over time. This could be the movement of a stock price, visitors arriving at a web page, or a machine moving between internal states over the course of its operation.

You're not locked into using time, either; anything that is sequential can be studied. This includes which words follow which others in a piece of text, changes from one generation to the next in a line of animals, and how temperature varies across a landscape. The first place I ever used stochastic analysis was studying the sequence of nucleotides in DNA.

This chapter will give you an overview of several of the main probability models, starting with the most important one: the Markov chain. I will discuss how they are related to each other, what situations they describe, and what kinds of problems you can solve with them.

## 25.1 Markov Chains

By far the most important stochastic process to understand is the Markov chain. A Markov chain is a sequence of random variables *X*_{1}, *X*_{2}, … that are interpreted as the state of a system at sequential points in time. For now, assume that the *X*_{i} are discrete RVs that can take on only a finite number of values.

Each of the *X*_{i} has the same set of states that it can take on. The defining feature of a Markov chain is that the distribution of *X*_{i+1} can be influenced by *X*_{i}, but, given *X*_{i}, it is independent of all of the earlier states *X*_{1}, …, *X*_{i−1}.
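To make the idea concrete, here is a minimal sketch of a finite-state Markov chain in Python. The two-state "weather" example and its transition probabilities are my own illustrative assumptions, not taken from the text; the key point is that the next state is sampled using only the current state.

```python
import random

# Hypothetical transition probabilities for a two-state chain.
# Each row lists (next_state, probability) pairs summing to 1.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Sample X_{i+1} given X_i = state; earlier states are never consulted."""
    r = random.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

def simulate(start, n_steps):
    """Generate the sequence X_1, ..., X_{n_steps+1} starting from `start`."""
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1]))
    return states

chain = simulate("sunny", 10)
```

Notice that `step` takes only the current state as input; that function signature is the Markov property in miniature.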