Information theory primer: entropy measures how many bits are needed, on average, to describe an outcome. A canonical example of zero entropy: if it's always sunny in Death Valley, with a probability of 100%, then the entropy of a daily weather report is 0 bits. The information doesn't need to be encoded, since there's nothing to report.
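To make this concrete, here is a minimal Python sketch of Shannon entropy, H(X) = sum over outcomes of p * log2(1/p); the function name shannon_entropy is our own illustration, not something from the text:

    import math

    def shannon_entropy(probabilities):
        # H(X) = sum of p * log2(1/p), skipping zero-probability outcomes
        return sum(p * math.log2(1 / p) for p in probabilities if p > 0)

    # Always sunny in Death Valley: one outcome with probability 1.0
    print(shannon_entropy([1.0]))       # 0.0 bits -- nothing to report
    # A fair coin flip: maximum entropy for two equally likely outcomes
    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit per flip

A certain outcome contributes nothing, while a uniform distribution over outcomes maximizes the entropy.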
Another example of high entropy would be having a complex ...