Information, when defined as the meaning of a given message, is essentially impossible to model, since it depends entirely on context and on the prior knowledge of the persons or systems sending and receiving the message. However, another definition of information, first developed by Claude Shannon (1949) and originally intended for telephone communication, is today broadly applicable as a scientific model.
The idea is simple, yet profound. The measure of information in a message is simply how unexpected its content is, that is, the degree of uncertainty or possible variety (entropy) about the message as seen by the recipient before it is sent. Once the message is received and its content is known with certainty, that uncertainty or surprise is reduced to zero. The difference between the two levels of surprise (based on probability) determines the information transmitted from sender to receiver.
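The relation between probability and surprise described above can be sketched in a few lines of code. This is an illustrative sketch only; the function names and example probabilities are hypothetical, not drawn from the text.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: the recipient's average uncertainty
    (expected surprise) about the message before it is sent."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def surprisal(p):
    """Information in bits conveyed by one message of probability p:
    the less expected the message, the more information it carries."""
    return -math.log2(p)

# Four equally likely possible messages: maximum uncertainty, 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A nearly certain message carries almost no information,
# while a rare (surprising) one carries much more.
print(surprisal(0.99))
print(surprisal(0.01))
```

Once a particular message is received with certainty, its probability for the recipient becomes 1, and surprisal(1.0) is zero, matching the verbal account above.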
The Appendix section “Mathematics of Information Communication” provides the mathematics for calculating the information transmitted, as well as the related quantities called input information, output information, noise, and equivocation. The reader interested in applications of these concepts to human systems is referred to Sheridan and Ferrell (1974).
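As a rough illustration of how those related quantities fit together, the sketch below computes them for a small discrete channel. The joint distribution is hypothetical, chosen only for illustration; the standard identities used are transmitted information I(X;Y) = H(X) + H(Y) − H(X,Y), equivocation H(X|Y) = H(X,Y) − H(Y), and noise H(Y|X) = H(X,Y) − H(X).

```python
import math

def H(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) for a noisy channel:
# rows are the sender's symbols x, columns the receiver's symbols y.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]             # marginal of x
py = [sum(col) for col in zip(*joint)]       # marginal of y
input_info = H(px)                           # input information H(X)
output_info = H(py)                          # output information H(Y)
joint_info = H([p for row in joint for p in row])   # H(X, Y)

transmitted = input_info + output_info - joint_info  # I(X; Y)
equivocation = joint_info - output_info              # H(X|Y)
noise = joint_info - input_info                      # H(Y|X)
```

Equivocation is the receiver's remaining uncertainty about what was sent; noise is the spread of received symbols for a given sent symbol. Transmitted information is the input information minus the equivocation.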
Human communications are coded in various forms, such as voice, gestures and “body language,” handwritten ...