Appendix C
Entropy and Uncertainty
Entropy is an information-theoretic measure of the amount of uncertainty in a variable. Beginning with Shannon’s seminal works [1725–1727], cryptographers and information theorists have used entropy to determine how well transformations on messages obscure their meaning. Entropy has applications in a wide variety of disciplines, including cryptography, compression, and coding theory. This appendix reviews the basics of entropy, which has its roots in probability theory.
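As a concrete illustration of entropy as a measure of uncertainty, the following sketch (not from the text; the function name is ours) computes the Shannon entropy H(X) = −Σ p(x) log₂ p(x) of a discrete distribution, in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution.

    Terms with p(x) = 0 contribute nothing, by the convention
    0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit.
print(shannon_entropy([0.5, 0.5]))        # 1.0
# Four equally likely outcomes: 2 bits.
print(shannon_entropy([0.25] * 4))        # 2.0
# A heavily biased coin carries less uncertainty than a fair one.
print(shannon_entropy([0.9, 0.1]) < 1.0)  # True
```

The intuition matches the text: the more evenly probability is spread over the outcomes, the greater the uncertainty, and hence the greater the entropy.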
C.1 Conditional and Joint Probability
Definition C–1. A random variable is a variable that represents the outcome of an event.
EXAMPLE: Let X be a variable representing some random event. X is a random variable. For example, X might be the ...
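A random variable of this kind can be sketched in code. The following hypothetical example (the die-roll scenario is ours, not the text's) models X as the outcome of rolling a fair six-sided die:

```python
import random

def roll_die(rng=random):
    """One observation of the random variable X: the outcome of
    rolling a fair six-sided die, each face with probability 1/6."""
    return rng.randint(1, 6)

# Each call yields one value of X; every value lies in {1, ..., 6}.
outcomes = [roll_die() for _ in range(10)]
assert all(1 <= x <= 6 for x in outcomes)
```

Here each invocation of `roll_die` produces one outcome of the underlying event, which is exactly what Definition C–1 describes.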