KL divergence formalizes the difference between discrete probability distributions

The Kullback-Leibler divergence (or KL divergence for short) is a measure of the (dis)similarity of two discrete probability distributions. It is widely used as a loss function when working with probability distributions.
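
For two discrete distributions $P$ and $Q$ over the same support, the divergence of $Q$ from $P$ is

$$
D_{\mathrm{KL}}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}
$$

with the convention that terms where $P(x) = 0$ contribute nothing, and the divergence is infinite if $Q(x) = 0$ at some $x$ where $P(x) > 0$. Note that it is asymmetric in $P$ and $Q$, so it is not a true distance metric.

A minimal sketch in Python of how this sum could be computed, assuming `p` and `q` are probability vectors over the same support (the helper name `kl_divergence` is just for illustration, not from any particular library):

```python
import numpy as np

def kl_divergence(p: np.ndarray, q: np.ndarray) -> float:
    """Compute D_KL(P || Q) for two discrete distributions given as probability vectors."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Only sum over outcomes where P(x) > 0; those terms contribute 0 by convention.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Identical distributions give 0; differing ones give a positive value.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, p))  # 0.0
print(kl_divergence(p, q))  # > 0, and generally != kl_divergence(q, p)
```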

Resources
