Sharpness-Aware Minimization (SAM) trains models to perform well even when their weights are slightly perturbed, seeking flatter minima that generalize better.
A loss landscape is the "terrain" of a model's loss as you move through parameter space; valleys are good solutions and peaks are bad ones.
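SAM's two-step update (ascend to the worst-case nearby point, then apply that point's gradient to the original weights) can be sketched on a toy objective. This is a minimal NumPy illustration, not a production implementation; the quadratic loss and the hyperparameter values are hypothetical choices for demonstration.

```python
import numpy as np

def loss(w):
    # Toy quadratic bowl standing in for a model's loss surface.
    return 0.5 * np.sum(w ** 2)

def grad(w):
    # Gradient of the toy loss above.
    return w

def sam_step(w, lr=0.1, rho=0.05, eps=1e-12):
    """One SAM update on weights w.

    Step 1: perturb the weights toward the worst-case point within an
            L2 ball of radius rho (approximated by the normalized gradient).
    Step 2: compute the gradient at that perturbed point, but apply the
            update to the ORIGINAL weights.
    """
    g = grad(w)
    e = rho * g / (np.linalg.norm(g) + eps)  # worst-case perturbation
    g_sam = grad(w + e)                      # gradient at perturbed point
    return w - lr * g_sam                    # descend from original weights

w = np.array([2.0, -1.0])
for _ in range(100):
    w = sam_step(w)
```

Because the sharpness-aware gradient is evaluated at the perturbed point, minima whose loss rises steeply nearby are penalized, which biases optimization toward flatter valleys.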