Chapter 25. Stochastic Modeling
Stochastic modeling refers to a collection of advanced probability tools for studying not a single random variable, but a process that unfolds randomly over time. This could be the movement of a stock price, visitors arriving at a web page, or a machine moving between internal states over the course of its operation.
You're not locked into using time either; anything that is sequential can be studied. This includes which words follow which others in a piece of text, changes from one generation to the next in a line of animals, and how temperature varies across a landscape. The first place I ever used stochastic analysis was studying the sequence of nucleotides in DNA.
This chapter will give you an overview of several of the main probability models, starting with the most important one: the Markov chain. I will discuss how they are related to each other, what situations they describe, and what kinds of problems you can solve with them.
25.1 Markov Chains
By far the most important stochastic process to understand is the Markov chain. A Markov chain is a sequence of random variables X1, X2, … that are interpreted as the state of a system at sequential points in time. For now, assume that the Xi are discrete random variables that can take on only a finite number of values.
Each of the Xi has the same set of states that it can take on. The definitive feature of a Markov chain is that the distribution of Xi+1 can be influenced by Xi, but it is conditionally independent of all earlier states X1, …, Xi−1 once Xi is known.
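To make this concrete, here is a minimal sketch of simulating a Markov chain in Python. The two-state "weather" model and its transition probabilities are illustrative assumptions of mine, not an example from the text; the key point is that each step samples the next state using only the current state.

```python
import random

# Hypothetical transition probabilities (assumed for illustration):
# each row gives P(next state | current state).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state given ONLY the current state (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for next_state, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return next_state
    return next_state  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    """Generate a realization X1, X2, ..., X(n_steps+1) starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Note that `simulate` never looks further back than `chain[-1]`; that locality is exactly the Markov property described above.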