A convex optimization problem minimizes a convex function over a convex set, guaranteeing that every local minimum is a global minimum.
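A minimal numerical illustration of this property, using a hypothetical convex function f(x) = (x - 3)^2 + 1 chosen for this sketch: its only stationary point, x = 3, must be the global minimum, and a coarse grid search cannot find anything better.

```python
def f(x):
    # Convex quadratic: its single stationary point x = 3 is the global minimum.
    return (x - 3) ** 2 + 1

# Coarse grid search over [-10, 10]; no sampled point beats the stationary point.
candidates = [x / 10 for x in range(-100, 101)]
best = min(candidates, key=f)
print(best, f(best))  # the grid minimizer coincides with x = 3
```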
Gradient descent updates parameters by stepping opposite the gradient with step size \eta: x_{t+1} = x_t - \eta \nabla f(x_t).
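The update rule can be sketched as a plain loop; the function f(x) = (x - 3)^2 and step size below are illustrative choices, not fixed by the text.

```python
def grad_f(x):
    # Gradient of f(x) = (x - 3)**2, i.e. 2 * (x - 3).
    return 2 * (x - 3)

x = 0.0    # initial point x_0
eta = 0.1  # step size (learning rate)
for _ in range(100):
    x = x - eta * grad_f(x)  # x_{t+1} = x_t - eta * grad f(x_t)
print(x)  # converges toward the minimizer x = 3
```

Because f is convex, this iteration converges to the global minimizer for any sufficiently small \eta; here the error shrinks by a factor of 1 - 2\eta = 0.8 per step.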