The Jensen-Shannon divergence is a symmetrized version of the Kullback-Leibler divergence. Unlike the KL divergence, the square root of the JS divergence is a true metric obeying the triangle inequality. Interestingly, suppose $X$ is drawn from the equal mixture of two distributions $Q$ and $R$: flip a fair coin $I$, then draw $X$ from $Q$ if $I=1$ and from $R$ if $I=0$. Then the JS divergence between $Q$ and $R$ equals the mutual information between $X$ and the indicator $I$, and the posterior probability of the indicator is $P(I=1 \mid X=x) = q(x)/(q(x)+r(x))$, where $q(x)$ is the density of $Q$ and $r(x)$ is the density of $R$. (One immediate consequence: since $I$ is binary, the JS divergence is at most 1 bit when logarithms are taken base 2.)
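As a quick numerical check of this identity, here is a sketch in Python (function names are mine, not from any library) that computes the JS divergence between two discrete distributions twice: once directly as the average KL divergence to the mixture, and once as the mutual information $I(X; I)$ for the joint distribution described above. The two values agree, and both stay at or below 1 bit.

```python
import numpy as np

def js_divergence(q, r):
    """JS divergence (in bits) between discrete distributions q and r,
    computed as the average KL divergence to the mixture m = (q + r)/2."""
    q, r = np.asarray(q, float), np.asarray(r, float)
    m = 0.5 * (q + r)

    def kl(a, b):
        mask = a > 0  # convention: 0 * log(0/b) = 0
        return np.sum(a[mask] * np.log2(a[mask] / b[mask]))

    return 0.5 * kl(q, m) + 0.5 * kl(r, m)

def js_as_mutual_information(q, r):
    """The same quantity computed as I(X; I), where I is a fair coin
    and X ~ Q when I = 1, X ~ R when I = 0."""
    q, r = np.asarray(q, float), np.asarray(r, float)
    # Joint distribution p(i, x): row 0 is r(x)/2, row 1 is q(x)/2.
    joint = 0.5 * np.vstack([r, q])
    px = joint.sum(axis=0)  # marginal of X: the mixture (q + r)/2
    pi = joint.sum(axis=1)  # marginal of I: (1/2, 1/2)
    mask = joint > 0
    denom = np.outer(pi, px)
    return np.sum(joint[mask] * np.log2(joint[mask] / denom[mask]))

q = [0.1, 0.4, 0.5]
r = [0.3, 0.3, 0.4]
print(js_divergence(q, r))           # same value both ways
print(js_as_mutual_information(q, r))
```

The equality falls out of the algebra: each term $p(i,x)\log\frac{p(i,x)}{p(i)p(x)}$ with $p(1,x) = q(x)/2$, $p(i) = 1/2$, and $p(x) = m(x)$ reduces to $\tfrac{1}{2} q(x)\log\frac{q(x)}{m(x)}$, and summing over $i$ and $x$ gives exactly the average-KL form.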
