Focal Loss reshapes cross-entropy so that hard, misclassified examples contribute more to the training signal, while easy, well-classified ones are down-weighted.
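A minimal NumPy sketch of this idea, assuming integer class labels and a matrix of predicted class probabilities (the function name and `gamma=2.0` default are illustrative; gamma is the focusing parameter that controls how strongly easy examples are down-weighted):

```python
import numpy as np

def focal_loss(probs, labels, gamma=2.0, eps=1e-12):
    """Focal loss: mean of -(1 - p_t)^gamma * log(p_t).

    p_t is the predicted probability of the true class. The (1 - p_t)^gamma
    factor shrinks the loss for easy examples (p_t near 1) while leaving
    hard, misclassified examples (p_t near 0) almost unchanged.
    This is an illustrative sketch, not a reference implementation.
    """
    p_t = np.clip(probs[np.arange(len(labels)), labels], eps, 1.0)
    return float(np.mean(-((1.0 - p_t) ** gamma) * np.log(p_t)))

probs = np.array([[0.9, 0.1],   # easy: confident and correct
                  [0.1, 0.9]])  # hard: confident but wrong
labels = np.array([0, 0])
print(focal_loss(probs, labels))  # hard example dominates the average
```

With `gamma=0` the modulating factor disappears and the expression reduces to plain cross-entropy; larger gamma pushes the loss further toward the hard examples.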
Cross-entropy loss measures how well predicted probabilities match the true labels by penalizing confident wrong predictions heavily.
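A minimal NumPy sketch of cross-entropy over a batch, assuming integer class labels and a matrix of predicted class probabilities (names are illustrative):

```python
import numpy as np

def cross_entropy(probs, labels, eps=1e-12):
    """Mean of -log(p_true) over the batch.

    Because the penalty is -log of the probability assigned to the true
    class, a confident wrong prediction (p_true near 0) incurs a very
    large loss, while a confident correct one costs almost nothing.
    """
    p_true = probs[np.arange(len(labels)), labels]
    return float(np.mean(-np.log(np.clip(p_true, eps, 1.0))))

probs = np.array([[0.9, 0.1],   # confident and correct -> small penalty
                  [0.1, 0.9]])  # confident but wrong  -> large penalty
labels = np.array([0, 0])
print(cross_entropy(probs, labels))
```

The `eps` clip guards against `log(0)` when a predicted probability collapses to exactly zero.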