Adam (Adaptive Moment Estimation) is an optimization algorithm that combines momentum (an exponential moving average of gradients, the first moment) with RMSProp-style adaptive learning rates (an exponential moving average of squared gradients, the second moment), applying bias correction to both averages to offset their zero initialization.
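A minimal sketch of one Adam update, following the standard formulation (the function name and default hyperparameters here are illustrative, though the defaults match the commonly cited ones):

```python
import math

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter; t is the 1-based step count."""
    # First moment: momentum-like exponential moving average of gradients
    m = beta1 * m + (1 - beta1) * grad
    # Second moment: RMSProp-like exponential moving average of squared gradients
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction compensates for m and v starting at zero
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Per-parameter step: momentum direction scaled by adaptive learning rate
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v
```

For example, iterating this step on f(x) = x² with its gradient 2x drives the parameter toward the minimum at 0.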
Optimization theory studies how to choose variables to minimize or maximize an objective while respecting constraints.
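A toy instance of constrained minimization, sketched with projected gradient descent (the function and problem here are illustrative, not from the source): minimize (x − 3)² subject to x ≤ 2, whose constrained optimum is x = 2.

```python
def projected_gradient_descent(grad, project, x0, lr=0.1, steps=200):
    """Gradient step followed by projection back onto the feasible set."""
    x = x0
    for _ in range(steps):
        x = project(x - lr * grad(x))
    return x

# Minimize (x - 3)^2 subject to x <= 2: the unconstrained minimum x = 3
# is infeasible, so the iterates settle at the boundary x = 2.
x_star = projected_gradient_descent(
    grad=lambda x: 2 * (x - 3),
    project=lambda x: min(x, 2.0),
    x0=0.0,
)
```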