Abstract: Mutual information I(X;Y) is a useful quantity in information theory for estimating how much information the random variable Y holds about the random variable X. One way to define the mutual information is to compare the joint distribution of X and Y with the product of the marginals through the Kullback-Leibler (KL) divergence. If the two distributions are close to each other, there is almost no leakage of X from Y, since the two variables are close to being independent. In the discrete setting, the mutual information has the nice interpretation of how many bits Y reveals about X. In the continuous case, however, this interpretation no longer applies. This fact enables us to try different metrics or divergences to define the mutual information. In this paper, we evaluate different metrics and divergences as alternatives to the mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance.
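For reference, the KL-based definition alluded to above can be written out explicitly; the density notation p_{X,Y} for the joint distribution and p_X, p_Y for the marginals is introduced here only for illustration and is not fixed by the abstract itself:

\[
I(X;Y) \;=\; D_{\mathrm{KL}}\!\left(p_{X,Y}\,\middle\|\,p_X\, p_Y\right)
\;=\; \int p_{X,Y}(x,y)\,\log\frac{p_{X,Y}(x,y)}{p_X(x)\,p_Y(y)}\,\mathrm{d}x\,\mathrm{d}y .
\]

Replacing the KL divergence in this expression by another divergence or metric between p_{X,Y} and p_X p_Y is the kind of alternative the paper evaluates in the continuous case.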