The decision-cognizant Kullback–Leibler divergence is a better statistic than the classical Kullback–Leibler divergence for measuring classifier (in)congruence.
Analytic and simulation studies show the new divergence is more robust to minority-class clutter.
Its sensitivity to estimation error is also lower than that of the classical Kullback–Leibler divergence.
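The robustness claim can be illustrated with a minimal sketch. It assumes one common decision-cognizant construction: the classes not selected by either classifier are merged into a single clutter class before the divergence is computed. The function `dc_kl` and the example posteriors are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Classical Kullback-Leibler divergence KL(p || q)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def dc_kl(p, q, eps=1e-12):
    """Illustrative decision-cognizant variant: keep only the classes
    actually selected by each classifier (the argmax decisions), merge
    all remaining minority classes into one clutter mass, and compute
    the classical divergence on the reduced distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    dominant = sorted({int(np.argmax(p)), int(np.argmax(q))})
    rest = [i for i in range(len(p)) if i not in dominant]
    p_red = np.append(p[dominant], p[rest].sum())
    q_red = np.append(q[dominant], q[rest].sum())
    return kl(p_red, q_red, eps)

# Two classifiers that agree on the dominant class but spread the
# residual probability mass differently over the minority classes:
p = [0.70, 0.10, 0.10, 0.10]
q = [0.70, 0.15, 0.10, 0.05]
print(kl(p, q))     # classical KL is nonzero: clutter noise registers
print(dc_kl(p, q))  # decision-cognizant value is ~0: clutter is absorbed
```

Because the minority classes are pooled before the divergence is taken, perturbations confined to the clutter cancel out, which is the mechanism behind the reduced sensitivity described above.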