Chapter 6. Questions of Concurrency
I am going to start this chapter with a sentence that sounds a bit mystical: with no mutation of state, we can ignore time.
Let us unpack this. First of all, what does time have to do with mutation of state, or with programming for that matter? Time enters programming when we have a set of processes that need to be executed in a particular order.
Consider the following example.1 We have two functions: the first takes x to x × x, and the second takes x to x + 1. Let’s set x to 10 and imagine running the two functions in parallel. What would the answer be? Would you be surprised to learn that there are five possible answers? Let’s break this down. Let P1 = x → x × x and P2 = x → x + 1:
- If P1 sets x to 100 and then P2 sets 100 to 101, we get 101.
- If P2 sets x to 11 and then P1 sets 11 to 121, we get 121.
- If P2 changes 10 to 11 between the two times that P1 accesses the value of x during the evaluation of x × x, we get 110.
- If P2 accesses x, then P1 sets x to 100, and then P2 sets x, we get 11: P2 writes 10 + 1, overwriting P1’s result.
- If P1 accesses x twice, then P2 sets x to 11, and then P1 sets x, we get 100: P1 writes 10 × 10, overwriting P2’s result.
Which is the correct answer: 101, 121, 110, 11, or 100? Well, that depends on what the programmer intended. The point is that if we just let P1 and P2 run in parallel, we can’t be sure which answer will be returned, because the computer may interleave the two functions’ reads and writes in any of the ways listed above.
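To make the race concrete, here is a minimal sketch in Scala (not taken from the book; it assumes Scala 2.12 or later, and the names RaceDemo, p1, and p2 are mine). Two JVM threads share a mutable variable x: P1 squares it, P2 increments it, and nothing constrains how their reads and writes interleave.

```scala
object RaceDemo extends App {
  @volatile var x: Int = 10

  // P1: x -> x * x. Its two reads of x may happen at different moments.
  val p1 = new Thread(() => {
    val first  = x      // first read of x
    val second = x      // second read of x (P2 may have run in between)
    x = first * second  // write the product back
  })

  // P2: x -> x + 1.
  val p2 = new Thread(() => {
    val current = x     // read of x (P1 may write in between)
    x = current + 1     // write the incremented value back
  })

  p1.start(); p2.start()
  p1.join(); p2.join()

  // Depending on how the reads and writes interleave, this can print
  // 101, 121, 110, 11, or 100.
  println(x)
}
```

In practice a run will usually print 101 or 121, because each thread’s body is so short, but nothing in the program rules out the other interleavings; the result is simply not determined by the program text.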
Note
When state is mutated, the order in which operations run matters; in other words, time can no longer be ignored.
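For contrast, here is a minimal sketch (again not from the book, with names of my choosing such as NoMutationDemo) of the same two computations written without mutation, using scala.concurrent.Future to run them in parallel. Because x is an immutable value and each computation returns a new value instead of updating shared state, the scheduling of the two futures cannot affect the answers.

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object NoMutationDemo extends App {
  val x = 10  // immutable: neither computation can change it

  // Each computation produces a new value instead of updating shared state.
  val squared     = Future(x * x)  // always 100, whenever it runs
  val incremented = Future(x + 1)  // always 11, whenever it runs

  // No matter how the two futures are scheduled, the answers never change.
  println(Await.result(squared, 1.second))      // 100
  println(Await.result(incremented, 1.second))  // 11
}
```

This is the sense in which giving up mutation lets us ignore time: with nothing being overwritten, there is no interleaving to worry about.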