Adaptive Huffman coding
In static Huffman coding, the probability distribution remains unchanged throughout encoding and decoding. To avoid expensive preprocessing, such as reading the entire source in advance, a source file is usually read only once for coding purposes. An alphabet and a probability distribution are therefore often assumed based on previous experience. Such an estimated model can compromise compression quality substantially; the size of the loss depends very much on how far the true probability distribution of the source differs from the estimated one.
Adaptive Huffman coding algorithms improve the compression ratio by updating the model with statistics drawn from the source content itself ...
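The adaptive idea can be illustrated with a minimal sketch: both the encoder and the decoder start from the same (e.g. uniform) frequency counts and update them identically after every symbol, so the two sides stay in sync without transmitting the model. The sketch below naively rebuilds the whole Huffman code after each symbol; practical algorithms such as FGK or Vitter's instead update the tree incrementally, and all function names here are illustrative, not from any particular library.

```python
import heapq

def build_code(freq):
    """Build a Huffman code table {symbol: bitstring} from frequency counts."""
    if len(freq) == 1:  # degenerate single-symbol alphabet: one-bit code
        return {next(iter(freq)): "0"}
    # Heap entries: (frequency, unique tie-breaker, {symbol: code-so-far});
    # the tie-breaker keeps comparisons deterministic and never reaches the dict.
    heap = [(f, i, {s: ""}) for i, (s, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        f1, _, c1 = heapq.heappop(heap)
        f2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (f1 + f2, tick, merged))
        tick += 1
    return heap[0][2]

def adaptive_encode(text, alphabet):
    """Encode text, updating the model after each symbol."""
    freq = {s: 1 for s in alphabet}  # start from uniform counts
    bits = []
    for ch in text:
        bits.append(build_code(freq)[ch])  # code under the current model
        freq[ch] += 1                      # then adapt the model
    return "".join(bits)

def adaptive_decode(bits, n, alphabet):
    """Decode n symbols, mirroring the encoder's model updates exactly."""
    freq = {s: 1 for s in alphabet}
    out, i = [], 0
    for _ in range(n):
        inv = {v: k for k, v in build_code(freq).items()}
        j = i + 1
        while bits[i:j] not in inv:  # codes are prefix-free, so the
            j += 1                   # first match is the right one
        ch = inv[bits[i:j]]
        out.append(ch)
        freq[ch] += 1
        i = j
    return "".join(out)
```

Because `build_code` is deterministic and both sides apply the same count update after each symbol, the decoder reconstructs the source exactly, and symbols that occur frequently early on quickly receive shorter codes.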