1. Theory of Information: Problems 1 to 15

1.1. Problem 1 – Entropy

We consider the memoryless binary symmetric information transmission channel shown in Figure 1.1.


Figure 1.1. Basic diagram of a digital communication system

It is assumed that the signal-to-noise ratio leads to the following conditional error probabilities, where p denotes the crossover probability of the channel:

$p(y_2/x_1) = p(y_1/x_2) = p$

The binary source is assumed to emit independent symbols with the following probabilities:

$p(x_1) = p_1, \qquad p(x_2) = 1 - p_1$
1) Calculate the source entropy H(X).
2) Calculate the entropy H(Y) at the receiver end.
3) Calculate the conditional entropy H(Y/X) (the entropy of the transmission errors).
4) Calculate the loss of information in the transmission channel, H(X/Y).
5) Deduce the average amount of information received by the recipient per binary symbol sent, that is, the mutual information I(X, Y).
6) Determine the channel capacity C and show that it is obtained for p1 = 0.5 (the standard definitions used in these questions are recalled after this list).
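
For convenience, the standard definitions behind these six questions are recalled here, written in the text's slash notation; this recap summarizes standard information theory rather than quoting the original:

$H(X) = -\sum_{i} p(x_i) \log_2 p(x_i), \qquad H(Y) = -\sum_{j} p(y_j) \log_2 p(y_j)$

$H(Y/X) = -\sum_{i} \sum_{j} p(x_i)\, p(y_j/x_i) \log_2 p(y_j/x_i)$

$H(X/Y) = -\sum_{j} \sum_{i} p(y_j)\, p(x_i/y_j) \log_2 p(x_i/y_j)$

$I(X, Y) = H(X) - H(X/Y) = H(Y) - H(Y/X), \qquad C = \max_{p_1} I(X, Y)$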

Solution of problem 1

1) By definition, we have:

$H(X) = -\sum_{i=1}^{2} p(x_i) \log_2 p(x_i)$

then:

$H(X) = -p_1 \log_2 p_1 - (1 - p_1) \log_2 (1 - p_1)$

2) By definition, we have:

$H(Y) = -\sum_{j=1}^{2} p(y_j) \log_2 p(y_j)$

and:

$p(y_j) = \sum_{i=1}^{2} p(x_i)\, p(y_j/x_i)$
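
Following on from steps 1) and 2), the quantities of questions 1) to 6) can be checked numerically. The Python sketch below is not part of the original text: it evaluates H(X), H(Y), H(Y/X), H(X/Y) and I(X, Y) for a binary symmetric channel, then sweeps p1 to illustrate that I(X, Y) peaks at p1 = 0.5 as claimed in question 6). The values p = 0.1 and p1 = 0.3 are illustrative assumptions, not values taken from the problem statement.

import numpy as np

def h2(q):
    """Binary entropy function H2(q) in bits, with a guard against log2(0)."""
    q = np.clip(q, 1e-12, 1.0 - 1e-12)
    return -q * np.log2(q) - (1.0 - q) * np.log2(1.0 - q)

def bsc_quantities(p1, p):
    """Entropies and mutual information of a binary symmetric channel
    with source probability P(x1) = p1 and crossover probability p
    (an illustrative helper, not taken from the text)."""
    HX = h2(p1)                            # 1) source entropy H(X)
    py1 = p1 * (1.0 - p) + (1.0 - p1) * p  # total probability: P(y1)
    HY = h2(py1)                           # 2) receiver entropy H(Y)
    HY_X = h2(p)                           # 3) error entropy H(Y/X) = H2(p)
    I = HY - HY_X                          # 5) mutual information I(X, Y)
    HX_Y = HX - I                          # 4) equivocation H(X/Y) = H(X) - I(X, Y)
    return HX, HY, HY_X, HX_Y, I

# Assumed (illustrative) values: crossover probability p = 0.1, source bias p1 = 0.3.
HX, HY, HY_X, HX_Y, I = bsc_quantities(p1=0.3, p=0.1)
print(f"H(X)   = {HX:.4f} bit   H(Y)   = {HY:.4f} bit")
print(f"H(Y/X) = {HY_X:.4f} bit   H(X/Y) = {HX_Y:.4f} bit")
print(f"I(X,Y) = {I:.4f} bit")

# 6) Sweep p1 and check that I(X, Y) peaks at p1 = 0.5,
# where it equals the capacity C = 1 - H2(p).
p1_grid = np.linspace(0.01, 0.99, 981)
I_grid = [bsc_quantities(q, 0.1)[4] for q in p1_grid]
best_p1 = p1_grid[int(np.argmax(I_grid))]
print(f"I(X,Y) is maximal at p1 = {best_p1:.2f}; C = {1.0 - h2(0.1):.4f} bit")

With these assumed values, the sweep confirms the maximum at p1 = 0.5, where I(X, Y) reaches C = 1 - H2(p), about 0.531 bit for p = 0.1, in agreement with question 6).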
