Groups
Cross-entropy loss measures how well predicted probabilities match the true labels by heavily penalizing confident but wrong predictions.
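A minimal sketch of that behavior, using NumPy with illustrative probability values (the function name and the example inputs are assumptions, not from the original text):

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Mean cross-entropy between one-hot labels and predicted probabilities."""
    y_pred = np.clip(y_pred, eps, 1.0)  # avoid log(0)
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# One sample whose true class is index 0.
y_true = np.array([[1.0, 0.0, 0.0]])

print(cross_entropy(y_true, np.array([[0.90, 0.05, 0.05]])))  # ~0.105: confident and correct, small loss
print(cross_entropy(y_true, np.array([[0.05, 0.90, 0.05]])))  # ~3.0:   confident and wrong, large loss
```

The second prediction puts almost all its mass on the wrong class, so its loss is roughly thirty times larger than the first, which is the "penalize confident wrong predictions heavily" effect in action.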
Elastic Net regularization combines L1 (Lasso) and L2 (Ridge) penalties to produce models that are both sparse and stable when features are correlated.
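A small sketch of the combined penalty term, assuming the common mixing parameterization where `l1_ratio = 1.0` recovers pure Lasso and `l1_ratio = 0.0` recovers pure Ridge (the function name and the example weights are illustrative):

```python
import numpy as np

def elastic_net_penalty(w, alpha=0.1, l1_ratio=0.5):
    """Weighted mix of the L1 (Lasso) and L2 (Ridge) penalties on weights w."""
    l1 = np.sum(np.abs(w))   # L1 term: drives small coefficients to exactly zero (sparsity)
    l2 = np.sum(w ** 2)      # L2 term: shrinks coefficients smoothly (stability)
    return alpha * (l1_ratio * l1 + 0.5 * (1.0 - l1_ratio) * l2)

w = np.array([0.0, 0.5, -1.5])
print(elastic_net_penalty(w))  # 0.1 * (0.5 * 2.0 + 0.5 * 0.5 * 2.5) = 0.1625
```

This penalty is added to the model's data-fitting loss during training; tuning `l1_ratio` trades off how sparse versus how smoothly shrunk the learned coefficients are.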