Convex sets, duality, and KKT conditions — theoretical foundations for understanding optimization guarantees.
6 concepts
A set is convex if every line segment between any two of its points lies entirely inside the set.
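The segment test can be checked numerically. This is an illustrative sketch (all function names are invented here): the unit disk passes the test, while an annulus fails because the segment between two opposite boundary points passes through the hole.

```python
def in_disk(p):
    # Unit disk {x : ||x|| <= 1} -- a convex set.
    return p[0]**2 + p[1]**2 <= 1.0

def in_annulus(p):
    # Ring {x : 0.5 <= ||x|| <= 1} -- not convex (it has a hole).
    return 0.25 <= p[0]**2 + p[1]**2 <= 1.0

def segment_stays_inside(member, a, b, steps=50):
    # Sample points (1-t)*a + t*b along the segment and test membership.
    for i in range(steps + 1):
        t = i / steps
        p = ((1 - t) * a[0] + t * b[0], (1 - t) * a[1] + t * b[1])
        if not member(p):
            return False
    return True

a, b = (0.9, 0.0), (-0.9, 0.0)
print(segment_stays_inside(in_disk, a, b))     # disk: segment stays inside
print(segment_stays_inside(in_annulus, a, b))  # annulus: midpoint (0, 0) escapes
```

Sampling a segment cannot prove convexity in general, but it is enough to falsify it, as the annulus example shows.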
A convex optimization problem minimizes a convex function over a convex set, guaranteeing that every local minimum is a global minimum.
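A minimal sketch of the local-equals-global guarantee, using plain gradient descent on a convex quadratic (all names here are illustrative): no matter where the iteration starts, it lands on the same minimizer.

```python
def grad(x):
    # Gradient of the convex quadratic f(x) = (x - 3)^2 + 2,
    # whose unique (global) minimum is at x = 3.
    return 2.0 * (x - 3.0)

def gradient_descent(x0, lr=0.1, iters=200):
    # Basic fixed-step gradient descent.
    x = x0
    for _ in range(iters):
        x -= lr * grad(x)
    return x

# Very different starting points converge to the same global minimizer.
print(gradient_descent(-50.0), gradient_descent(80.0))  # both approach 3.0
```

With a nonconvex objective the same method can stall at different local minima depending on the start; convexity is what rules that out.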
Lagrangian duality turns a constrained minimization problem into a related maximization problem that provides lower bounds on the original objective.
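A worked example of the lower-bound property, for the toy problem min x^2 subject to x >= 1 (primal optimum p* = 1 at x = 1). The Lagrangian is L(x, lam) = x^2 + lam*(1 - x); minimizing over x at x = lam/2 gives the dual function g(lam) = lam - lam^2/4. This sketch checks weak duality numerically:

```python
def dual(lam):
    # g(lam) = min_x [x^2 + lam*(1 - x)] = lam - lam^2/4,
    # attained at x = lam/2.
    return lam - lam**2 / 4.0

p_star = 1.0  # primal optimum of min x^2 s.t. x >= 1

for lam in [0.0, 1.0, 2.0, 3.0]:
    # Every nonnegative multiplier yields a lower bound on p*.
    print(lam, dual(lam), dual(lam) <= p_star)
```

Here the dual maximum g(2) = 1 equals p*, so strong duality holds, as expected for a convex problem with a strictly feasible point.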
A proximal operator prox_f(x) = argmin_v [ f(v) + (1/2)||v - x||^2 ] pulls a point x toward minimizing f while penalizing how far it moves, acting like a denoiser or projector depending on f.
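Two classic closed-form cases illustrate both behaviors (function names here are illustrative): the prox of t*|x| is soft-thresholding (a shrinkage "denoiser"), and the prox of an interval's indicator function is projection onto that interval.

```python
def prox_l1(x, t):
    # prox of f(v) = t*|v|: soft-thresholding, shrinks x toward 0 by t.
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

def prox_box(x, lo=-1.0, hi=1.0):
    # prox of the indicator of [lo, hi]: just projection onto the interval.
    return max(lo, min(hi, x))

print(prox_l1(3.0, 1.0))   # 2.0  (shrunk by t)
print(prox_l1(-0.5, 1.0))  # 0.0  (small values are zeroed out)
print(prox_box(2.5))       # 1.0  (projected onto [-1, 1])
```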
KKT conditions (stationarity, primal feasibility, dual feasibility, and complementary slackness) generalize Lagrange multipliers to handle inequality constraints in constrained optimization problems.
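A sketch verifying the four KKT conditions at a known solution. For min (x-2)^2 + (y-2)^2 subject to x + y <= 2, the optimum is the projection of (2, 2) onto the line x + y = 2, i.e. (1, 1) with multiplier lam = 2 (values worked out by hand for this toy problem):

```python
x_s, y_s, lam = 1.0, 1.0, 2.0          # candidate solution and multiplier

grad_f = (2 * (x_s - 2), 2 * (y_s - 2))  # gradient of objective: (-2, -2)
grad_g = (1.0, 1.0)                      # gradient of g(x, y) = x + y - 2
g_val = x_s + y_s - 2.0                  # constraint value at the candidate

stationarity = all(abs(gf + lam * gg) < 1e-9 for gf, gg in zip(grad_f, grad_g))
primal_feasible = g_val <= 1e-9          # g(x, y) <= 0
dual_feasible = lam >= 0                 # multiplier is nonnegative
complementary = abs(lam * g_val) < 1e-9  # lam * g = 0 (constraint is active)

print(stationarity, primal_feasible, dual_feasible, complementary)
```

With only equality constraints, stationarity plus primal feasibility reduce to the classical Lagrange-multiplier conditions; the last two conditions are what the inequality case adds.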
ADMM splits a hard optimization problem into two easier subproblems that take turns solving their own piece and communicate through a dual variable that accumulates their disagreement.
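A minimal scalar sketch of the splitting, assuming a one-dimensional lasso-style problem: minimize (1/2)(x - a)^2 + gamma*|z| subject to x = z. The quadratic subproblem and the shrinkage subproblem each have closed forms, and the dual variable u carries the running disagreement x - z between them (all names and the step size rho are illustrative choices):

```python
def soft(v, t):
    # Soft-thresholding: the prox of t*|.|
    return max(v - t, 0.0) if v > 0 else min(v + t, 0.0)

def admm_scalar_lasso(a, gamma, rho=1.0, iters=100):
    # minimize (1/2)(x - a)^2 + gamma*|z|  subject to  x = z
    x = z = u = 0.0
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1.0 + rho)  # quadratic subproblem (closed form)
        z = soft(x + u, gamma / rho)           # shrinkage subproblem (closed form)
        u += x - z                             # dual update: accumulate disagreement
    return z

print(admm_scalar_lasso(3.0, 1.0))  # approaches 2.0, the soft-threshold of 3 by 1
```

The true minimizer here is soft(a, gamma) = 2, which the iterates approach even though neither subproblem sees the whole objective; that is the point of the splitting.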