# 20.6 Exercises

Let $X_1$ and $X_2$ be two independent tosses of a fair coin. Find the entropy $H(X_1)$ and the joint entropy $H(X_1, X_2)$. Why is $H(X_1, X_2) = H(X_1) + H(X_2)$?
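As a numerical sanity check (not part of the exercise), the two entropies can be computed directly from the distributions; `entropy` below is a small helper, not a function from the text:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Single fair-coin toss: two outcomes, each with probability 1/2.
h_single = entropy([0.5, 0.5])

# Two independent tosses: four equally likely outcomes HH, HT, TH, TT.
h_joint = entropy([0.25] * 4)

print(h_single)  # 1.0
print(h_joint)   # 2.0
```

The joint entropy comes out as exactly the sum of the two marginal entropies, which is the additivity property the exercise asks you to explain.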

Consider an unfair coin where the two outcomes, heads and tails, have probabilities $p(\text{heads}) = p$ and $p(\text{tails}) = 1 - p$.

(a) If the coin is flipped twice, what are the possible outcomes, along with their respective probabilities?

(b) Show that the entropy in part (a) is $-2p\log_2(p) - 2(1-p)\log_2(1-p)$. How could this have been predicted without calculating the probabilities in part (a)?
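The claimed formula can be checked numerically for a sample bias: below, the entropy of the joint distribution of two independent flips is compared against $-2p\log_2(p) - 2(1-p)\log_2(1-p)$. The value $p = 0.3$ is an arbitrary illustrative choice, and `entropy` is a small helper, not from the text:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

p = 0.3  # illustrative bias for heads; any 0 < p < 1 works

# Joint distribution of two independent flips: HH, HT, TH, TT.
joint = [p * p, p * (1 - p), (1 - p) * p, (1 - p) ** 2]

direct = entropy(joint)
formula = -2 * p * math.log2(p) - 2 * (1 - p) * math.log2(1 - p)

print(direct, formula)  # the two values agree
```

The agreement reflects the hint in the question: the two flips are independent, so the joint entropy is twice the single-flip entropy $-p\log_2(p) - (1-p)\log_2(1-p)$.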

A random variable $X$ takes the values $1, 2, \ldots, n, \ldots$ with probabilities $\frac{1}{2}, \frac{1}{2^2}, \ldots, \frac{1}{2^n}, \ldots$. Calculate the entropy $H(X)$.
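Here each outcome $n$ has probability $2^{-n}$, so its contribution to the entropy is $-2^{-n}\log_2 2^{-n} = n/2^n$, and $H(X) = \sum_{n\ge 1} n/2^n$. A partial sum of this series (a numerical aside, not part of the exercise) suggests its value:

```python
# Entropy series for p_n = 2^{-n}: each term -p_n * log2(p_n) equals n / 2^n.
# The tail beyond n = 59 is negligibly small, so the partial sum is
# numerically indistinguishable from the series' limit.
partial = sum(n / 2**n for n in range(1, 60))
print(partial)  # very close to 2
```

This matches the closed-form evaluation $\sum_{n\ge 1} n x^n = x/(1-x)^2$ at $x = \tfrac{1}{2}$, giving $H(X) = 2$ bits.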

Let $X$ be a random variable taking on integer values. The probability is 1/2 that $X$ is in the range
