June 2020
Like the KL divergence, the Jensen-Shannon (JS) divergence measures how similar two probability distributions are; however, it is smoother. The following equation represents the JS divergence:

$$
D_{JS}(p \,\|\, q) = \frac{1}{2} D_{KL}\!\left(p \,\middle\|\, \frac{p+q}{2}\right) + \frac{1}{2} D_{KL}\!\left(q \,\middle\|\, \frac{p+q}{2}\right)
$$
The JS divergence behaves much better than the KL divergence when p(x) and q(x) are both small, because each KL term compares a distribution against the mixture (p + q)/2, which keeps the ratio inside the logarithm from blowing up. It is also symmetric in p and q.
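As a minimal sketch of the formula above for discrete distributions over the same support (the function names here are illustrative, not from any particular library):

```python
import numpy as np

def kl_divergence(p, q):
    # KL(p || q) = sum_x p(x) * log(p(x) / q(x)); terms where p(x) == 0 contribute 0.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def js_divergence(p, q):
    # JS(p || q) = 0.5 * KL(p || m) + 0.5 * KL(q || m), with mixture m = (p + q) / 2.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)
```

Note that because the mixture m is positive wherever p or q is, the JS divergence stays finite even for distributions with disjoint supports, where it reaches its maximum of log 2 (in nats).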