Chapter 32. Entropy and Uncertainty
Entropy is an information-theoretic measure of the amount of uncertainty in the value of a random variable. Beginning with Shannon's seminal works [908, 909], cryptographers and information theorists have used entropy to determine how well transformations on messages obscure their meaning. Entropy has applications in a wide variety of disciplines, including cryptography, compression, and coding theory. This chapter reviews the basics of entropy, which has its roots in probability theory.
Conditional and Joint Probability
Definition 32–1. A random variable is a variable that represents the outcome of an event.
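As a concrete illustration of this definition, a random variable can be modeled as a mapping from each possible outcome of an event to its probability. The sketch below (a hypothetical example, not from the text) uses a fair six-sided die, and previews the Shannon entropy measure this chapter develops:

```python
from fractions import Fraction
import math

# The random variable X represents the outcome of rolling a fair
# six-sided die: each outcome 1..6 occurs with probability 1/6.
X = {outcome: Fraction(1, 6) for outcome in range(1, 7)}

# The probabilities over all outcomes must sum to 1.
assert sum(X.values()) == 1

def entropy(dist):
    """Shannon entropy H(X) = -sum p(x) log2 p(x), in bits.

    Outcomes with zero probability contribute nothing and are skipped.
    """
    return -sum(float(p) * math.log2(float(p))
                for p in dist.values() if p > 0)

# A fair die is maximally uncertain among six outcomes:
# H(X) = log2(6), roughly 2.585 bits.
print(entropy(X))
```

Intuitively, the more evenly the probability is spread across outcomes, the higher the entropy, which is why the uniform distribution above attains the maximum for six outcomes.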