Gradient descent is a simple iterative method for moving downhill on a loss surface: at each step, move a small amount in the direction opposite the gradient.
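A minimal sketch of that update rule, assuming a hypothetical objective f(x) = (x - 3)^2 and an illustrative step size:

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient (lr and steps are assumptions)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # move against the gradient direction
    return x

# Example: f(x) = (x - 3)^2 has gradient 2*(x - 3) and its minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

With a small enough step size on a smooth function, the iterates approach the minimizer; here x_min converges toward 3.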
Convex optimization studies minimizing convex functions over convex sets, where every local minimum is guaranteed to be a global minimum.
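To illustrate the convex setting, here is a sketch of projected gradient descent, which handles the convex constraint set by projecting each step back onto it. The objective f(x) = x^2 and the interval [1, 2] are assumptions for the example; because both are convex, the point found is the global minimum over the set:

```python
def projected_gd(grad, project, x0, lr=0.1, steps=200):
    """Gradient step followed by projection onto the convex feasible set."""
    x = x0
    for _ in range(steps):
        x = project(x - lr * grad(x))  # stay inside the constraint set
    return x

# Minimize the convex f(x) = x^2 over the convex interval [1, 2].
# The unconstrained minimum (x = 0) lies outside the set, so the
# constrained global minimum sits at the boundary, x = 1.
x_star = projected_gd(
    grad=lambda x: 2 * x,
    project=lambda x: min(max(x, 1.0), 2.0),  # clamp onto [1, 2]
    x0=2.0,
)
```

Convexity is what guarantees this boundary point is globally optimal: no other feasible point can achieve a lower value.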