10 Stochastic Systems

In this chapter we turn to a discussion of stochastic systems, which are systems that are governed by random, rather than deterministic, dynamics. The word “stochastic” comes from the Greek, meaning “able to guess or conjecture.” A deterministic system is repeatable in the sense that repeated trials with the same input and initial conditions will produce the same output. A stochastic system will generally produce a different output on each trial, one that can be predicted only in a probabilistic sense.

We first give an introductory treatment of discrete Markov chains. We discuss regular, ergodic, and absorbing Markov chains and give some examples and applications.

We then discuss Monte Carlo methods, which use repeated sampling from a given probability distribution to solve problems that may be deterministic. These methods are useful for systems for which deterministic methods are impractical due to the size or complexity of the problem. We will discuss Monte Carlo methods for three applications, namely, simulation, numerical integration, and optimization. Before continuing in this chapter, the reader should review Appendix C, which contains the necessary background on probability theory that is used in the remainder of the chapter.
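As a preview of the Monte Carlo idea, the following sketch (not from the text) uses repeated random sampling to estimate the deterministic quantity π: points are drawn uniformly in the unit square, and the fraction landing inside the quarter disk approximates π/4. The function name and sample count are illustrative choices.

```python
import random

def monte_carlo_pi(n_samples, seed=0):
    """Estimate pi by sampling points uniformly in the unit square
    and counting the fraction that fall inside the quarter disk
    x^2 + y^2 <= 1."""
    rng = random.Random(seed)  # fixed seed for a repeatable demonstration
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # Area of quarter disk is pi/4, so scale the fraction by 4.
    return 4.0 * inside / n_samples

estimate = monte_carlo_pi(100_000)
```

By the law of large numbers the estimate converges to π as the number of samples grows, with error shrinking on the order of 1/√n; this is the basic mechanism behind the simulation and integration applications discussed later in the chapter.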

10.1 Markov Chains

Recall the definition of a random variable from Appendix C. By a (discrete‐time) random process we mean a sequence of random variables X_0, X_1, X_2, …, indexed by the nonnegative integers. In this section we introduce the concept of a Markov chain.
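To make the notion of a random process concrete, the following sketch (an illustration, not taken from the text) simulates a sample path X_0, X_1, …, X_n of a simple two-state Markov chain. The two-state "weather" interpretation and the particular transition probabilities are hypothetical choices.

```python
import random

# Hypothetical two-state chain: state 0 = "sunny", state 1 = "rainy".
# P[i][j] is the probability of moving from state i to state j;
# each row sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]

def simulate_chain(P, start, n_steps, seed=0):
    """Generate a sample path X_0, X_1, ..., X_{n_steps} of the chain."""
    rng = random.Random(seed)  # fixed seed for repeatability
    state = start
    path = [state]
    for _ in range(n_steps):
        u = rng.random()
        # Walk the cumulative distribution of the current row
        # to select the next state.
        cumulative = 0.0
        for j, p in enumerate(P[state]):
            cumulative += p
            if u < cumulative:
                state = j
                break
        path.append(state)
    return path

path = simulate_chain(P, start=0, n_steps=10)
```

Each step of the simulation consults only the current state's row of the transition matrix, which is precisely the memoryless property that distinguishes a Markov chain from a general random process.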
