Entropy is an information-theoretic measure of the amount of uncertainty in a variable. Beginning with Shannon’s seminal works [1725–1727], cryptographers and information theorists have used entropy to determine how well transformations on messages obscure their meaning. Entropy has applications in a wide variety of disciplines, including cryptography, compression, and coding theory. This chapter reviews the basics of entropy, which has its roots in probability theory.
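To make "entropy measures uncertainty" concrete before the formal development, here is a minimal Python sketch. It assumes Shannon's standard formula H(X) = -sum over x of p(x) log2 p(x), which this chapter derives later; the function name shannon_entropy and the sample distributions are illustrative choices, not taken from the text.

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: -sum(p * log2(p)) over outcomes with p > 0.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximal uncertainty
    print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits
    print(shannon_entropy([1.0]))       # certain outcome: 0.0 bits, no uncertainty

The three calls show the key intuition: the less predictable the variable, the higher its entropy.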
Definition C–1. A random variable is a variable that represents the outcome of a random event.
EXAMPLE: Let X be a variable representing some random event. X is a random variable. For example, X might be the ...
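As one concrete instance of Definition C–1, the sketch below models X as the outcome of rolling a fair six-sided die. The die, the helper roll_die, and the use of Python's random module are assumptions made for illustration; they are not the book's own (truncated) example.

    import random

    def roll_die():
        # X takes one of the values 1..6, each chosen by a chance event.
        return random.randint(1, 6)

    X = roll_die()  # one realization (sample) of the random variable X
    print(X)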