There are three kinds of $\alpha$-divergences: the generic version, the Rényi $\alpha$-divergence and the Tsallis $\alpha$-divergence.

If $p$ and $q$ are the densities of $P$ and $Q$ respectively, the generic $\alpha$-divergence is defined as

$$D_\alpha(P \,\|\, Q) = \frac{1}{\alpha(\alpha - 1)} \left( \int p(x)^\alpha \, q(x)^{1 - \alpha} \, dx - 1 \right)$$

for any $\alpha \in \mathbb{R} \setminus \{0, 1\}$. This is a special case of an $f$-divergence, and thus admits the same variational inequality. It is therefore also a convex distributional distance. It converges to the KL divergence as $\alpha \to 1$ and to the reverse KL as $\alpha \to 0$; at $\alpha = 1/2$ it is proportional to the squared Hellinger distance, and at $\alpha = 2$ it is half the $\chi^2$-divergence.
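As a quick numerical sanity check, here is a minimal sketch (function names are mine) that computes this $\alpha$-divergence for discrete distributions, and verifies the $\alpha \to 1$ limit against the KL divergence and the $\alpha = 2$ case against half the $\chi^2$-divergence:

```python
import math

def alpha_divergence(p, q, alpha):
    """Generic alpha-divergence between two discrete distributions
    (lists of probabilities), for alpha not in {0, 1}."""
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q))
    return (s - 1) / (alpha * (alpha - 1))

def kl(p, q):
    """KL divergence between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]

# alpha close to 1 approximates KL(p || q)
print(alpha_divergence(p, q, 1 + 1e-6), kl(p, q))

# alpha = 2 equals half the chi-squared divergence sum((p - q)^2 / q)
chi2 = sum((pi - qi)**2 / qi for pi, qi in zip(p, q))
print(alpha_divergence(p, q, 2.0), chi2 / 2)
```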

Rényi $\alpha$-divergence

The Rényi $\alpha$-divergence is

$$R_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \int p(x)^\alpha \, q(x)^{1 - \alpha} \, dx$$

This obeys $R_\alpha(P \,\|\, Q) \ge 0$, with equality iff $P = Q$, and it is non-decreasing in $\alpha$; as $\alpha \to 1$ it converges to the KL divergence. The Rényi divergence admits the variational inequality

$$\frac{\alpha}{\alpha - 1} \log \mathbb{E}_P\!\left[ e^{\frac{\alpha - 1}{\alpha} g} \right] - \log \mathbb{E}_Q\!\left[ e^{g} \right] \;\le\; R_\alpha(P \,\|\, Q)$$

for all bounded measurable $g$ and $\alpha > 1$.
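A sketch (names mine) checking these properties numerically for discrete distributions: monotonicity in $\alpha$, the KL limit, and the variational inequality for $\alpha > 1$ with an arbitrary test function $g$:

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi alpha-divergence between discrete distributions (alpha != 1)."""
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]

# Non-decreasing in alpha.
vals = [renyi_divergence(p, q, a) for a in (0.25, 0.5, 0.9, 1.1, 2.0, 5.0)]
print(vals == sorted(vals))

# alpha close to 1 approximates KL(p || q).
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
print(renyi_divergence(p, q, 1 + 1e-6), kl)

# Variational inequality: lhs <= R_alpha for any test function g, alpha > 1.
alpha, g = 2.0, [1.0, -0.5, 2.0]
lhs = (alpha / (alpha - 1)) * math.log(
    sum(pi * math.exp((alpha - 1) / alpha * gi) for pi, gi in zip(p, g))
) - math.log(sum(qi * math.exp(gi) for qi, gi in zip(q, g)))
print(lhs <= renyi_divergence(p, q, alpha))
```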

Tsallis $\alpha$-divergence

The Tsallis divergence is

$$T_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \left( \int p(x)^\alpha \, q(x)^{1 - \alpha} \, dx - 1 \right)$$

Like the Rényi divergence, this obeys $T_\alpha(P \,\|\, Q) \ge 0$, with equality iff $P = Q$, and it converges to the KL divergence as $\alpha \to 1$.
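Since the Tsallis and Rényi divergences are built from the same integral $\int p^\alpha q^{1-\alpha}$, one is a monotone transform of the other: $T_\alpha = (e^{(\alpha - 1) R_\alpha} - 1)/(\alpha - 1)$. A small sketch (names mine) checking this and the KL limit:

```python
import math

def integral_term(p, q, alpha):
    """The common building block: sum of p^alpha * q^(1 - alpha)."""
    return sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q))

def tsallis_divergence(p, q, alpha):
    return (integral_term(p, q, alpha) - 1) / (alpha - 1)

def renyi_divergence(p, q, alpha):
    return math.log(integral_term(p, q, alpha)) / (alpha - 1)

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
alpha = 1.7

# Tsallis is a monotone transform of Renyi.
t = tsallis_divergence(p, q, alpha)
r = renyi_divergence(p, q, alpha)
print(t, (math.exp((alpha - 1) * r) - 1) / (alpha - 1))

# Both converge to KL(p || q) as alpha -> 1.
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
print(tsallis_divergence(p, q, 1 + 1e-6), kl)
```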