20.6 Exercises

  1. Let X1 and X2 be two independent tosses of a fair coin. Find the entropy H(X1) and the joint entropy H(X1, X2). Why is H(X1, X2)=H(X1)+H(X2)?

  2. Consider an unfair coin where the two outcomes, heads and tails, have probabilities p(heads) = p and p(tails) = 1 − p.

    (a) If the coin is flipped two times, what are the possible outcomes along with their respective probabilities?

    (b) Show that the entropy in part (a) is −2p log2(p) − 2(1 − p) log2(1 − p). How could this have been predicted without calculating the probabilities in part (a)?

  3. A random variable X takes the values 1, 2, …, n, … with probabilities 1/2, 1/2^2, …, 1/2^n, …. Calculate the entropy H(X).

  4. Let X be a random variable taking on integer values. The probability is 1/2 that X is in the range
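
A quick way to sanity-check the entropies asked for in Exercises 1–3 is to compute them numerically. The sketch below is a minimal illustration in Python, assuming base-2 logarithms throughout; the value p = 0.3 used for Exercise 2 is an arbitrary example, and the infinite sum in Exercise 3 is truncated once the remaining terms are negligible.

```python
from math import log2

def entropy(probs):
    """Shannon entropy (in bits) of a finite or truncated distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Exercise 1: one fair coin toss, then two independent tosses.
H_X1 = entropy([0.5, 0.5])            # 1 bit
H_X1X2 = entropy([0.25] * 4)          # 2 bits = H(X1) + H(X2)

# Exercise 2: two flips of a biased coin with p(heads) = p.
p = 0.3                               # arbitrary example value
joint = [p * p, p * (1 - p), (1 - p) * p, (1 - p) * (1 - p)]
H_joint = entropy(joint)              # entropy of the four outcomes in part (a)
H_formula = -2 * p * log2(p) - 2 * (1 - p) * log2(1 - p)   # expression in part (b)

# Exercise 3: P(X = n) = 1/2^n; truncate the sum, since the tail is negligible.
H_X = entropy([2 ** -n for n in range(1, 60)])

print(H_X1, H_X1X2, H_joint, H_formula, H_X)
```

Running it prints 1.0 and 2.0 bits for Exercise 1, two matching values for Exercise 2 (the joint entropy of two independent flips is twice the single-flip entropy), and a value very close to 2 bits for Exercise 3.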
