Neural Collapse describes a phenomenon observed in the terminal phase of training a classifier: the penultimate-layer features of each class concentrate tightly around their class mean.
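A minimal sketch of what this concentration looks like numerically, using synthetic features (the class means, noise scale, and the within-/between-class scatter ratio below are illustrative choices, not part of the original definition): when features collapse to their class means, within-class scatter becomes tiny relative to between-class scatter.

```python
import numpy as np

# Hypothetical penultimate-layer features: 3 classes, 50 samples each, 8 dims.
rng = np.random.default_rng(0)
class_means = rng.normal(size=(3, 8))
# Simulate collapsed features: tiny within-class noise around each class mean.
features = class_means[:, None, :] + 0.01 * rng.normal(size=(3, 50, 8))

mu_c = features.mean(axis=1)   # per-class feature means
mu_g = mu_c.mean(axis=0)       # global feature mean

# Average within-class scatter vs. between-class scatter of the means.
within = ((features - mu_c[:, None, :]) ** 2).sum(axis=-1).mean()
between = ((mu_c - mu_g) ** 2).sum(axis=-1).mean()
print(within / between)  # small ratio: features sit close to their class means
```

A ratio near zero is one common way the collapse of within-class variability is quantified.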
L2 regularization (also called ridge or weight decay) adds a penalty proportional to the sum of squared weights to discourage large parameters.
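The penalty can be sketched directly from the definition; the function name, the weight list, and the coefficient `lam` below are illustrative, not a specific library's API:

```python
import numpy as np

def l2_regularized_loss(weights, data_loss, lam=0.1):
    """Add an L2 (ridge / weight-decay) penalty: lam * sum of squared weights."""
    penalty = lam * sum(float(np.sum(w ** 2)) for w in weights)
    return data_loss + penalty

# Toy example: two weight arrays of ones.
w1 = np.ones((2, 2))   # sum of squares = 4
w2 = np.ones(3)        # sum of squares = 3
total = l2_regularized_loss([w1, w2], data_loss=1.0, lam=0.1)
print(total)  # 1.0 + 0.1 * (4 + 3) = approximately 1.7
```

Because the penalty grows with the squared magnitude of each weight, minimizing the total loss pulls parameters toward zero, which is the regularizing effect described above.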