Deep learning generalization theory seeks to explain why overparameterized networks can perfectly fit (interpolate) their training data yet still perform well on unseen data.
Fast polynomial arithmetic multiplies two degree-n polynomials in O(n log n) time using the FFT (over the complex numbers) or the NTT (over a prime modulus), instead of the O(n^2) schoolbook method: evaluate both polynomials at n-th roots of unity, multiply pointwise, then interpolate back.
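A minimal sketch of the NTT approach, assuming the standard competitive-programming setup (the NTT-friendly prime 998244353 with primitive root 3); this is an illustrative implementation, not a production library:

```python
MOD = 998244353  # NTT-friendly prime: 119 * 2^23 + 1
ROOT = 3         # a primitive root modulo MOD

def ntt(a, invert=False):
    """In-place iterative Cooley-Tukey transform over Z/MOD (len must be a power of 2)."""
    n = len(a)
    # Bit-reversal permutation so the butterflies can run in place.
    j = 0
    for i in range(1, n):
        bit = n >> 1
        while j & bit:
            j ^= bit
            bit >>= 1
        j |= bit
        if i < j:
            a[i], a[j] = a[j], a[i]
    length = 2
    while length <= n:
        # w is a primitive `length`-th root of unity mod MOD.
        w = pow(ROOT, (MOD - 1) // length, MOD)
        if invert:
            w = pow(w, MOD - 2, MOD)  # modular inverse via Fermat's little theorem
        for start in range(0, n, length):
            wn = 1
            for k in range(start, start + length // 2):
                u = a[k]
                v = a[k + length // 2] * wn % MOD
                a[k] = (u + v) % MOD
                a[k + length // 2] = (u - v) % MOD
                wn = wn * w % MOD
        length <<= 1
    if invert:
        n_inv = pow(n, MOD - 2, MOD)
        for i in range(n):
            a[i] = a[i] * n_inv % MOD

def poly_mul(p, q):
    """Multiply coefficient lists p and q modulo MOD in O(n log n)."""
    n = 1
    while n < len(p) + len(q) - 1:
        n <<= 1
    fa = p + [0] * (n - len(p))
    fb = q + [0] * (n - len(q))
    ntt(fa)
    ntt(fb)
    fc = [x * y % MOD for x, y in zip(fa, fb)]
    ntt(fc, invert=True)
    return fc[:len(p) + len(q) - 1]
```

For example, `poly_mul([1, 2, 3], [4, 5])` computes (1 + 2x + 3x^2)(4 + 5x) = 4 + 13x + 22x^2 + 15x^3. The NTT avoids the floating-point rounding issues of the complex FFT, at the cost of working modulo a fixed prime.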