Focal Loss reshapes cross-entropy by down-weighting easy, well-classified examples so that training focuses on hard, misclassified ones.
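Below is a minimal sketch of focal loss built on PyTorch's cross-entropy, assuming a multiclass setup; the function name and the focusing parameter default gamma = 2 are illustrative choices, not taken from the text above.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma=2.0):
    # Per-example cross-entropy, i.e. -log p_t where p_t is the
    # probability the model assigns to the true class.
    ce = F.cross_entropy(logits, targets, reduction="none")
    p_t = torch.exp(-ce)
    # The modulating factor (1 - p_t)**gamma shrinks the loss of easy,
    # well-classified examples (p_t near 1) and leaves hard ones intact.
    return ((1.0 - p_t) ** gamma * ce).mean()
```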
Softmax turns arbitrary real-valued scores (logits) into probabilities that sum to one.
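A minimal sketch of softmax, assuming PyTorch tensors; shifting by the maximum logit is the standard numerical-stability trick and does not change the result.

```python
import torch

def softmax(logits):
    # Subtracting the max logit avoids overflow in exp(); softmax is
    # invariant to adding a constant to every logit.
    shifted = logits - logits.max()
    exps = torch.exp(shifted)
    return exps / exps.sum()

softmax(torch.tensor([2.0, 1.0, 0.1]))  # ~ [0.659, 0.242, 0.099], sums to 1
```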
Label smoothing replaces a hard one-hot target with a slightly softened distribution to reduce model overconfidence.
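A minimal sketch of label smoothing applied to the targets, assuming a smoothing factor eps = 0.1 chosen only for illustration: the true class keeps 1 - eps of the probability mass plus its share of the uniform remainder, and the rest is spread evenly over all classes.

```python
import torch
import torch.nn.functional as F

def smooth_targets(labels, num_classes, eps=0.1):
    # Hard one-hot targets: 1 for the true class, 0 elsewhere.
    one_hot = F.one_hot(labels, num_classes).float()
    # Soften: keep 1 - eps on the true class and distribute eps
    # uniformly across all num_classes entries.
    return one_hot * (1.0 - eps) + eps / num_classes

smooth_targets(torch.tensor([2]), num_classes=4)
# tensor([[0.0250, 0.0250, 0.9250, 0.0250]])
```

In practice, PyTorch's cross_entropy also accepts a label_smoothing argument that applies this same softening internally.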