Groups
Cross-entropy measures how well a proposed distribution Q predicts outcomes actually generated by a true distribution P; equivalently, it is the average number of bits needed to encode samples from P using a code optimized for Q.
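A minimal sketch of the standard definition, assuming discrete distributions P and Q over a shared support (the logarithm base sets the unit; base 2 gives bits):

    H(P, Q) = -\sum_{x} P(x) \log_2 Q(x)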
Shannon entropy quantifies the average uncertainty or information content of a random variable in bits when using base-2 logarithms.
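In the same notation, Shannon entropy is the special case of cross-entropy of P against itself:

    H(P) = -\sum_{x} P(x) \log_2 P(x) = H(P, P)

A minimal Python sketch of both quantities, assuming discrete distributions represented as NumPy arrays that sum to 1 over the same support; the example values for p and q are illustrative, not from the original:

    import numpy as np

    def cross_entropy(p, q):
        """Cross-entropy H(P, Q) in bits for discrete distributions p and q."""
        # Guard against log(0); strictly, H(P, Q) is infinite if Q assigns
        # zero probability to an outcome that P does not.
        q = np.clip(q, 1e-12, 1.0)
        return -np.sum(p * np.log2(q))

    def shannon_entropy(p):
        """Shannon entropy H(P) in bits, i.e. the special case H(P, P)."""
        return cross_entropy(p, p)

    p = np.array([0.5, 0.25, 0.25])  # hypothetical true distribution P
    q = np.array([0.4, 0.4, 0.2])    # hypothetical proposed distribution Q

    print(shannon_entropy(p))   # 1.5 bits
    print(cross_entropy(p, q))  # ~1.57 bits

By Gibbs' inequality, H(P, Q) >= H(P), with equality exactly when Q matches P; the gap between the two values is the KL divergence of Q from P.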