Consider a continuous-time signal *x*(*t*) bandlimited to the range [−*B*, +*B*]. The signal is sampled at a frequency greater than or equal to the Nyquist frequency, *f*_{e} = 2*B*, which yields a discrete-time random process *X*(*n*).
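As a quick numeric sketch of the Nyquist condition (an illustration, not part of the text: the tone frequency, sampling rates, and helper name `peak_frequency` are all chosen here for demonstration), sampling a 3 Hz tone at *f*_{e} ≥ 2*B* preserves its frequency, while undersampling aliases it:

```python
import numpy as np

def peak_frequency(f0, fe, n=64):
    # Sample x(t) = cos(2*pi*f0*t) at rate fe (Hz) and locate the
    # dominant frequency among the non-negative DFT bins.
    t = np.arange(n) / fe
    x = np.cos(2 * np.pi * f0 * t)
    spectrum = np.abs(np.fft.rfft(x))
    k = np.argmax(spectrum)
    return k * fe / n  # bin index -> frequency in Hz

# Sampled above the Nyquist rate (8 >= 2*3), the 3 Hz tone is preserved.
print(peak_frequency(3.0, 8.0))  # -> 3.0
# Sampled below it, the tone aliases to |3 - 4| = 1 Hz.
print(peak_frequency(3.0, 4.0))  # -> 1.0
```

The bin spacing *f*_{e}/*n* is chosen so that both test frequencies fall exactly on a DFT bin, which keeps the peak location unambiguous.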

This process has continuous values. Assume that it has been quantized with sufficient resolution, so that it becomes a discrete-valued random process; that is, *X*(*n*) takes values from a finite set. In information theory, *X*(*n*) is the information source, the finite set is the input alphabet, and the elements *x*^{i} are the input symbols or letters of the alphabet. Next, we aim to compress this information. The aim of this chapter is to show that, under certain hypotheses, this operation can be performed without introducing any distortion. This is known as noiseless, lossless, or entropy coding. Unfortunately, the compression rates it allows are generally too low. Therefore, a certain level of distortion is tolerated. We can show that a function, known as the rate-distortion function, gives a lower limit for the distortion when the bit rate is set or, inversely, a lower limit for the bit rate when the distortion level is set.
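The lower bound that entropy coding can approach is the source entropy *H*(*X*) in bits per symbol. A minimal sketch (the Gaussian source, the 16-level uniform quantizer, and all variable names are assumptions made for this example): quantize a continuous-valued process to a finite alphabet and estimate the entropy of the resulting symbol stream, which falls below the 4 bits/symbol of a fixed-length code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical source: i.i.d. standard Gaussian samples X(n).
x = rng.standard_normal(100_000)

# 16-level uniform quantizer over [-3, 3]: a 4-bit alphabet.
levels = np.linspace(-3.0, 3.0, 16)
idx = np.digitize(x, (levels[:-1] + levels[1:]) / 2)  # nearest-level index

# Empirical symbol probabilities and entropy H(X) in bits/symbol.
counts = np.bincount(idx, minlength=len(levels))
p = counts / counts.sum()
p = p[p > 0]
entropy = -(p * np.log2(p)).sum()

print(f"H(X) = {entropy:.2f} bits/symbol (fixed-length code uses 4)")
```

Because the quantized Gaussian symbols are far from equiprobable, the estimated entropy is strictly below 4 bits/symbol, and an entropy code (Huffman, arithmetic) can approach that lower rate losslessly.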
