Convex optimization

Lagrange dual problem and conjugate function

An optimization problem has two components: an objective function $f_0 : \mathbb{R}^n \to \mathbb{R}$ and the constraints. The objective function and the constraints keep each other in check and balance at a saddle point, i.e., the optimal point. The Lagrange dual problem also solves the optimization problem, by providing a lower bound on the optimal value. The dual problem can be expressed through the conjugate function $f^*(y) = \sup_x \left( y^T x - f(x) \right)$.
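As a minimal numerical sketch of the conjugate (assuming the scalar example $f(x) = x^2$, which is not from the text and has the known conjugate $f^*(y) = y^2/4$), the supremum can be approximated by a grid search:

```python
import numpy as np

# Approximate the conjugate f*(y) = sup_x (y*x - f(x)) by maximizing over a
# grid. f(x) = x**2 is an illustrative assumption; analytically f*(y) = y**2/4.

def conjugate(f, y, xs):
    return np.max(y * xs - f(xs))

f = lambda x: x**2
xs = np.linspace(-10.0, 10.0, 100001)  # grid standing in for the sup over R

for y in [-2.0, 0.0, 1.0, 3.0]:
    print(y, conjugate(f, y, xs), y**2 / 4)  # numeric vs. analytic value
```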

Approximation

The purpose of approximation is to find the optimal point $x^*$, i.e., the point where $F(x^*) = 0$. We need a step (search direction) $\Delta x$ and a step size $t$. A Taylor approximation is a polynomial whose argument is the step and whose parameters are the derivatives at the starting point. The first-degree Taylor approximation adds one term to the starting point $(x_0, F(x_0))$: the added term $\nabla F(x) \Delta x$ consists of a parameter (the gradient $\nabla F(x)$) and an argument (the step $\Delta x$).
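As a sketch of this idea (the test function $F(x) = x^2 - 2$ and the helper name `newton_root` are illustrative assumptions), zeroing the first-order Taylor model $F(x) + F'(x)\Delta x$ at each iterate gives the Newton step, damped by the step size $t$:

```python
# Solve F(x) = 0 by repeatedly linearizing F with its first-order Taylor
# approximation F(x + dx) ≈ F(x) + F'(x) * dx and taking the step dx that
# zeroes the linear model, scaled by a step size t.

def newton_root(F, dF, x0, t=1.0, tol=1e-10, max_iter=50):
    x = x0
    for _ in range(max_iter):
        dx = -F(x) / dF(x)   # search direction from the linear model
        x = x + t * dx       # damped update with step size t
        if abs(F(x)) < tol:
            break
    return x

root = newton_root(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
print(root)  # ≈ sqrt(2) = 1.41421356...
```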

Taylor series

\[ f(x) = \sum_{k=0}^{\infty} c_k x^k = c_0 + c_1 x + c_2 x^2 + \cdots \]

This is an approximation that is a function of the step $h$, with the derivatives of $f(x)$ as its parameters:

\[ f(x \pm h) = f(x) \pm h f'(x) + \frac{h^2}{2} f''(x) \pm \frac{h^3}{6} f'''(x) + O(h^4) \]

Let's think about $\sin(x)$: $f(x) = \sin(x)$, $f(0) = 0$; $f'(x) = \cos(x)$, $f'(0) = 1$; $f''(x) = -\sin(x)$, $f''(0) = 0$. Thus,

\[\begin{align*} \sin(x) &= 0 + \frac{1}{1!} x + 0 - \frac{1}{3!} x^3 + \cdots \\ &= x - \frac{x^3}{3!} + \frac{x^5}{5!} - \cdots \end{align*}\]
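A short check of the truncated series against the library sine (the helper name `sin_taylor` is an assumption):

```python
import math

# Truncated Taylor series of sin(x) around 0:
# sin(x) ≈ x - x**3/3! + x**5/5! - ...  (odd powers, alternating signs)

def sin_taylor(x, n_terms=5):
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(n_terms))

for x in [0.1, 0.5, 1.0, math.pi / 2]:
    print(x, sin_taylor(x), math.sin(x))  # approximation vs. math.sin
```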

Duality

This is a summary of the Boyd convex optimization book. The steepest descent method is a convex optimization algorithm. The normalized steepest descent direction $\Delta x_{\text{nsd}}$ is the vector in the unit ball of a norm that extends farthest in the direction $-\nabla f(x)$; equivalently, the inner product of $\Delta x_{\text{nsd}}$ and $-\nabla f(x)$ is maximized. The first-order Taylor approximation $f(x + v) \approx f(x) + \nabla f(x)^T v$ decreases most when $v = \Delta x_{\text{nsd}}$. The normalized direction $\Delta x_{\text{nsd}}$ is unnormalized into $\Delta x_{\text{sd}} = \|\nabla f(x)\|_* \, \Delta x_{\text{nsd}}$, where $\|\cdot\|_*$ is the dual norm.
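A minimal sketch of steepest descent under the $\ell_1$ norm, where the normalized direction is a signed coordinate vector and unnormalizing scales by the dual norm $\|\nabla f(x)\|_\infty$; the quadratic objective and fixed step size below are illustrative assumptions:

```python
import numpy as np

# l1-norm steepest descent: dx_nsd = -sign(g_i) * e_i for the index i with
# the largest |g_i|; the unnormalized step scales by ||g||_inf (dual of l1).

H = np.diag([1.0, 10.0])            # Hessian of the assumed test quadratic
f = lambda x: 0.5 * x @ H @ x
grad = lambda x: H @ x

x = np.array([1.0, 1.0])
t = 0.09                            # fixed step size (< 1/M = 0.1 here)
for _ in range(200):
    g = grad(x)
    i = np.argmax(np.abs(g))        # steepest coordinate
    dx_nsd = np.zeros_like(x)
    dx_nsd[i] = -np.sign(g[i])      # point of the l1 unit ball minimizing g.v
    dx_sd = np.abs(g[i]) * dx_nsd   # unnormalized: scale by ||g||_inf
    x = x + t * dx_sd
print(x, f(x))                      # approaches the minimizer x* = 0
```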

Strong convexity and implications

This is a summary of the Boyd convex optimization book. The strong convexity assumption is useful for analyzing iterative minimization algorithms such as gradient descent, steepest descent, and Newton's method. The smallest and largest eigenvalues of the Hessian, $m I \preceq \nabla^2 f(x) \preceq M I$, together with the gradient norm $\|\nabla f(x)\|_2$, bound the optimal value $p^*$; in particular, $f(x) - p^* \le \|\nabla f(x)\|_2^2 / (2m)$. The condition number of an $\alpha$-sublevel set $C_\alpha$ is bounded as $\mathbf{cond}(C_\alpha) \le M/m$.
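A numerical check of the suboptimality bound $f(x) - p^* \le \|\nabla f(x)\|_2^2 / (2m)$ on an assumed quadratic with known optimal value $p^* = 0$:

```python
import numpy as np

# Verify f(x) - p* <= ||grad f(x)||_2**2 / (2m) at random points, where m is
# the smallest Hessian eigenvalue. The test quadratic is an assumption.

A = np.diag([2.0, 8.0])          # Hessian; m = 2, M = 8, so cond <= M/m = 4
f = lambda x: 0.5 * x @ A @ x    # p* = 0, attained at x = 0
grad = lambda x: A @ x
m = np.linalg.eigvalsh(A).min()

rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal(2)
    gap = f(x) - 0.0                              # f(x) - p*
    bound = np.linalg.norm(grad(x))**2 / (2 * m)  # strong convexity bound
    print(gap <= bound + 1e-12, gap, bound)
```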

Convex set

There is an analogy between a line segment and a convex set that is helpful for understanding convex sets. A line, a line segment, and a one-sided line (ray) are analogous to an affine set, a convex set, and a cone, respectively. The set $\{y \mid y = \theta_1 x_1 + \theta_2 x_2,\ \theta_1 + \theta_2 = 1\}$ is a line if $\theta_1, \theta_2 \in \mathbb{R}$, a line segment if $\theta_1, \theta_2 \ge 0$, and a one-sided line if one of $\theta_1, \theta_2$ is allowed to be negative.
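A small sketch of the $\theta$ parameterization (the points $x_1, x_2$ are arbitrary assumptions): sampling $\theta_1$ over different ranges traces the line, the segment, and the ray:

```python
import numpy as np

# y = theta1*x1 + theta2*x2 with theta1 + theta2 = 1, so theta2 = 1 - theta1.
# The range allowed for theta1 decides which set the points trace out.

x1, x2 = np.array([1.0, 1.0]), np.array([3.0, 2.0])

def points(theta1s):
    return [t * x1 + (1 - t) * x2 for t in theta1s]

line    = points(np.linspace(-2, 3, 11))  # theta1 in R: the whole line
segment = points(np.linspace(0, 1, 11))   # theta1, theta2 >= 0: the segment
ray     = points(np.linspace(0, 5, 11))   # theta1 >= 0 only: one-sided line

print(segment[0], segment[-1])            # the endpoints x2 and x1
```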