Concepts (2)
📚 Theory · Intermediate
Optimization Theory
Optimization theory studies how to choose variables to minimize or maximize an objective while respecting constraints.
#optimization · #convex optimization · #gradient descent (+12)
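A minimal sketch of the "objective plus constraints" idea: minimize a toy function over a feasible interval using projected gradient descent. The objective f(x) = (x - 3)^2, the bounds [0, 2], and the step size are illustrative choices, not part of the card.

```python
def projected_gradient_descent(grad, project, x0, eta=0.1, steps=200):
    """Gradient step, then project back onto the feasible set."""
    x = x0
    for _ in range(steps):
        x = project(x - eta * grad(x))
    return x

grad = lambda x: 2 * (x - 3)               # f'(x) for f(x) = (x - 3)^2
project = lambda x: min(max(x, 0.0), 2.0)  # clamp to the feasible interval [0, 2]

x_star = projected_gradient_descent(grad, project, x0=0.0)
print(x_star)  # the unconstrained minimizer x = 3 is infeasible, so the
               # iterates settle on the boundary point x = 2
```

Because the constraint is active at the solution, the answer lies on the boundary of the feasible set rather than at the unconstrained minimum.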
📚 Theory · Intermediate
Gradient Descent Convergence Theory
Gradient descent updates parameters by stepping opposite the gradient: x_{t+1} = x_t - \eta \nabla f(x_t), where \eta > 0 is the step size. For convex, L-smooth f with \eta = 1/L, the suboptimality f(x_t) - f(x^*) decays at rate O(1/t); strong convexity improves this to a linear (geometric) rate.
#gradient descent · #convergence rate · #l-smooth (+12)
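The update rule above can be run on a simple L-smooth quadratic to see the convergence behavior directly. The curvature, step size, and starting point below are illustrative choices; note that f(x) = 0.5·a·x² has gradient a·x and smoothness constant L = a.

```python
a = 4.0    # curvature of f(x) = 0.5 * a * x**2, so L = 4
eta = 0.1  # step size; any eta <= 1/L = 0.25 is safe here
x = 1.0    # starting point

trajectory = [x]
for _ in range(10):
    grad = a * x          # \nabla f(x)
    x = x - eta * grad    # x_{t+1} = x_t - eta * \nabla f(x_t)
    trajectory.append(x)

# Each step multiplies x by (1 - eta * a) = 0.6, so the distance to the
# minimizer x* = 0 shrinks geometrically: |x_t| = 0.6**t.
print(trajectory[-1])
```

On this strongly convex example the iterates contract by a fixed factor per step, illustrating the linear rate; on merely convex L-smooth functions the guarantee weakens to the O(1/t) rate stated above.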