A convex optimization problem minimizes a convex function over a convex set, guaranteeing that every local minimum is a global minimum.
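In symbols, and under the standard textbook formulation (the set C and function f below are generic placeholders, not names from this note), the problem and the convexity condition can be sketched as:

```latex
\min_{x \in C} f(x),
\qquad
f\bigl(\theta x + (1-\theta)\,y\bigr) \;\le\; \theta f(x) + (1-\theta) f(y)
\quad \forall\, x, y \in C,\; \theta \in [0,1],
```

where C is a convex set and f satisfies the inequality above. The inequality is what rules out isolated dips: any point lying below a chord of f would contradict it, so a local minimum cannot be separated from the global one.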
Gradient descent is a simple iterative method for moving downhill on a loss surface: at each step it updates the current point in the direction opposite the gradient, scaled by a learning rate.
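A minimal sketch of that update rule in Python, using an illustrative one-dimensional objective f(x) = (x - 3)^2 (the function, starting point, and learning rate here are assumed example choices, not anything specified in this note):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient: x <- x - lr * grad(x)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# f(x) = (x - 3)^2 has gradient 2 * (x - 3) and its minimum at x = 3.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))  # converges toward x = 3
```

Because this objective is convex, the iterates approach the unique global minimum; on a non-convex surface the same procedure would only be guaranteed to approach a stationary point.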