A parallel computer is used most effectively and with least difficulty to produce large numbers of statistically independent samples of a stochastic process. This application of parallel computing is vitally important for drawing meaningful conclusions from models that include random numbers, and is therefore very common in practice.
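The idea can be sketched in a few lines of C++. This is a minimal illustration, not code from the book: the function names `estimate_pi` and `mean_estimate` and the Monte Carlo example are assumptions chosen for concreteness. Each replication is driven by its own seed, so the threads share no state and run fully in parallel; the only serial work is averaging the results.

```cpp
#include <random>
#include <thread>
#include <vector>

// One independent replication: a Monte Carlo estimate of pi driven
// by its own seed, so replications share no state with one another.
// (Illustrative example; not taken from the book's code.)
double estimate_pi(unsigned seed, int draws) {
    std::mt19937 gen(seed);
    std::uniform_real_distribution<double> u(0.0, 1.0);
    int inside = 0;
    for (int i = 0; i < draws; ++i) {
        const double x = u(gen), y = u(gen);
        if (x * x + y * y <= 1.0) ++inside;
    }
    return 4.0 * inside / draws;
}

// Run the replications in parallel, one thread each. The only
// serial step is averaging the per-replication estimates.
double mean_estimate(int replications, int draws) {
    std::vector<double> result(replications);
    std::vector<std::thread> workers;
    for (int r = 0; r < replications; ++r)
        workers.emplace_back([&result, r, draws] {
            result[r] = estimate_pi(1234u + static_cast<unsigned>(r), draws);
        });
    for (auto& w : workers) w.join();
    double mean = 0.0;
    for (double v : result) mean += v;
    return mean / replications;
}
```

Because the replications never communicate, adding more processors (up to the number of replications) speeds the job up almost linearly, which is exactly why this use of parallel computers is so common.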

Amdahl’s law explains why this use of parallel computers is so attractive, and conversely why other uses pose such devilish difficulties. Suppose that we must add a large column of numbers. This is a simple but tedious task for a single person. If there are two people, each sums half the numbers and then one of them combines the two sums to get the final result: this last step cannot be done in parallel. If the column is large, there is probably enough work for two people to stay very busy, and the time needed to combine the result is relatively insignificant. If we add a third person, then each has less to do and relatively more time is spent combining their results. Adding more people further reduces the individual workload while increasing the effort to coordinate their labor.
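The column-summing story maps directly onto a two-worker program. The sketch below (an illustration under assumed names; `column_sum` is not from the book) splits the column between the calling thread and one helper thread; the final addition of the two partial sums is the step that cannot be done in parallel.

```cpp
#include <numeric>
#include <thread>
#include <vector>

// Two workers each sum half of the column. Combining the two
// partial sums at the end is the unavoidable serial step.
// (Illustrative example; not taken from the book's code.)
double column_sum(const std::vector<double>& column) {
    const std::size_t mid = column.size() / 2;
    double first = 0.0;
    double second = 0.0;
    // Helper thread sums the first half...
    std::thread worker([&] {
        first = std::accumulate(column.begin(),
                                column.begin() + static_cast<long>(mid), 0.0);
    });
    // ...while the calling thread sums the second half.
    second = std::accumulate(column.begin() + static_cast<long>(mid),
                             column.end(), 0.0);
    worker.join();
    return first + second; // the serial combine
}
```

With more workers the combine step grows into a tree of partial additions, so each additional worker contributes less while the coordination cost rises, just as with the people adding numbers.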

Every job requires some work that cannot be split up, and the number of computers, human or otherwise, that can be usefully employed is therefore limited. Let Ts be the time needed to finish a job with a single computer, Tp be the time to solve it with N computers, and α be the fraction of the job that can be done in parallel. These quantities are related by Tp = (1 − α)Ts + αTs/N, so the speedup Ts/Tp = 1/((1 − α) + α/N) approaches 1/(1 − α) as N grows: the serial fraction puts a ceiling on what any number of computers can achieve.
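Amdahl's law is easy to evaluate directly. The following small function (an illustrative sketch, not code from the book) computes the speedup Ts/Tp for a given parallel fraction α and number of computers N.

```cpp
// Amdahl's law: speedup of N computers over one, when a fraction
// alpha of the work can be done in parallel. Derived from
// Tp = (1 - alpha)*Ts + alpha*Ts/N by taking Ts/Tp.
// (Illustrative example; not taken from the book's code.)
double speedup(double alpha, int N) {
    return 1.0 / ((1.0 - alpha) + alpha / N);
}
```

For example, a job that is 50% parallel (α = 0.5) gains a speedup of only 1/(0.5 + 0.25) ≈ 1.33 from two computers, and can never exceed a speedup of 2 no matter how many computers are used.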
