Cross-entropy loss measures how well predicted probabilities match the true labels by penalizing confident wrong predictions heavily.
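A minimal sketch of this penalty, using an illustrative `cross_entropy` helper (the function name and example probabilities are assumptions, not from a specific library): the loss is the negative log of the probability assigned to the true class, so a confident wrong prediction costs far more than a confident right one.

```python
import math

def cross_entropy(probs, true_index):
    # Loss is the negative log-probability assigned to the true class.
    return -math.log(probs[true_index])

# Confident and correct: small loss (-log 0.9 ≈ 0.105).
good = cross_entropy([0.90, 0.05, 0.05], true_index=0)

# Confident and wrong: large loss (-log 0.01 ≈ 4.605).
bad = cross_entropy([0.01, 0.94, 0.05], true_index=0)
```

The asymmetry is the point: halving the probability of the true class adds a constant `log 2` to the loss, so driving the true class toward zero probability makes the loss blow up.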
Softmax turns arbitrary real-valued scores (logits) into probabilities that sum to one.
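A minimal Python sketch of softmax (the max-subtraction trick is a standard numerical-stability measure; it does not change the result because it cancels in the ratio):

```python
import math

def softmax(logits):
    # Subtract the max logit before exponentiating to avoid overflow;
    # the shift cancels out when we normalize.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
# probs are nonnegative, sum to one, and preserve the ordering of the logits.
```

Note that softmax preserves the ranking of the logits: the largest logit always gets the largest probability.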