Chapter 32. Entropy and Uncertainty

Entropy is an information-theoretic measure of the amount of uncertainty in a variable. Beginning with Shannon's seminal works [908, 909], cryptographers and information theorists have used entropy to determine how well transformations on messages obscure their meaning. Entropy has applications in a wide variety of disciplines, including cryptography, compression, and coding theory. This chapter reviews the basics of entropy, a concept with its roots in probability theory.
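As a brief illustration of entropy as a measure of uncertainty, the following sketch computes the standard Shannon entropy, H(X) = −Σ p(x) log₂ p(x), over a discrete probability distribution. (Python and the helper name `shannon_entropy` are used here for illustration; they are not part of the chapter's notation.)

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(X) = -sum of p * log2(p)
    over all outcomes with nonzero probability."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))

# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))        # 0.0
```

Intuitively, the more evenly the probability mass is spread across outcomes, the higher the entropy, and the harder the variable's value is to predict.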

Conditional and Joint Probability

  • Definition 32–1. A random variable is a variable that represents the outcome of an event.
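The definition above can be made concrete with a small sketch: a die roll is a classic example of a random variable, a variable whose value is the outcome of a random event. (The Python code and names below are illustrative assumptions, not the book's formalism.)

```python
import random

def roll_die():
    """A random variable representing the outcome of
    rolling a fair six-sided die; each face is equally likely."""
    return random.randint(1, 6)

# Each evaluation of the random variable yields one outcome
# from the sample space {1, 2, 3, 4, 5, 6}.
outcome = roll_die()
print(outcome)
```

Each outcome here occurs with probability 1/6, which is the starting point for the conditional and joint probabilities discussed in this section.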
