A histogram is a graphical representation of the distribution of a variable, which lets us estimate the density and probability distribution of the data. A histogram is created by dividing the entire range of the variable's values into a series of small intervals (bins) and then counting how many values fall into each interval.
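The binning step described above can be sketched as follows. This is a minimal illustration, assuming NumPy is available; the sample values and the choice of four bins are hypothetical:

```python
import numpy as np

# Hypothetical sample of a continuous variable, for illustration only.
values = np.array([0.1, 0.4, 0.45, 0.8, 1.2, 1.9, 2.5, 2.6, 2.9, 3.7])

# Divide the full range [0, 4) into 4 equal intervals (bins) and count
# how many values fall into each one.
counts, edges = np.histogram(values, bins=4, range=(0.0, 4.0))

print(counts)  # number of values per interval: [4 2 3 1]
print(edges)   # the interval boundaries: [0. 1. 2. 3. 4.]
```

Every value lands in exactly one bin, so the counts always sum to the total number of samples.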

Applying this histogram concept to an image may seem difficult at first but, in fact, it is very simple. In a grayscale image, the variable is the gray value of a pixel (from `0` to `255`), and the density is the number of pixels in the image that have each value. This means that we have to count the number of pixels with a value of `0`, the number of pixels with a value of `1`, and so on, up to `255`.
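This per-value pixel count can be sketched as follows. The sketch assumes NumPy and uses a tiny hypothetical 4x4 "image" in place of real image data:

```python
import numpy as np

# Hypothetical 4x4 grayscale image for illustration (values 0-255).
image = np.array([
    [0,     0,  64,  64],
    [64,  128, 128, 128],
    [128, 192, 192, 255],
    [255, 255, 255, 255],
], dtype=np.uint8)

# One bin per possible gray value: hist[v] = number of pixels equal to v.
hist = np.bincount(image.ravel(), minlength=256)

print(hist[0])    # pixels with gray value 0   -> 2
print(hist[255])  # pixels with gray value 255 -> 5
```

Since each pixel falls into exactly one of the 256 bins, `hist.sum()` always equals the total number of pixels in the image.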