Early stopping halts training once the validation loss has stopped improving for a set number of evaluations (the patience), preventing overfitting and saving compute.
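A minimal patience-based early-stopping sketch, assuming illustrative hyperparameter names (`patience`, `min_delta`) that are not from the original text:

```python
class EarlyStopping:
    """Stop training when validation loss fails to improve for `patience` checks."""

    def __init__(self, patience: int = 3, min_delta: float = 0.0):
        self.patience = patience      # how many bad checks to tolerate
        self.min_delta = min_delta    # minimum decrease that counts as improvement
        self.best_loss = float("inf")
        self.bad_checks = 0

    def step(self, val_loss: float) -> bool:
        """Feed one validation loss; return True when training should stop."""
        if val_loss < self.best_loss - self.min_delta:
            self.best_loss = val_loss  # improvement: record it and reset the counter
            self.bad_checks = 0
        else:
            self.bad_checks += 1       # no improvement this check
        return self.bad_checks >= self.patience


# Usage: a validation-loss curve that plateaus triggers the stop.
stopper = EarlyStopping(patience=2)
losses = [0.9, 0.7, 0.6, 0.61, 0.62, 0.5]
stopped_at = None
for epoch, loss in enumerate(losses):
    if stopper.step(loss):
        stopped_at = epoch
        break
```

The check runs per validation pass, not per batch; in practice you would also snapshot the best model weights when `best_loss` updates so the final model is the one with the lowest validation loss.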
Grokking is the phenomenon in which a model suddenly begins to generalize well long after it has fully memorized the training set, often only after extended training past the point of near-zero training loss.