KL Divergence
Kullback–Leibler (KL) divergence measures how one probability distribution P differs from a reference distribution Q; equivalently, it quantifies the information lost when Q is used to approximate the true distribution P.
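For discrete distributions defined on the same support, the standard definition is:

D_{KL}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}

Note that D_{KL}(P \parallel Q) \geq 0, with equality only when P = Q, and that the measure is asymmetric: in general D_{KL}(P \parallel Q) \neq D_{KL}(Q \parallel P).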