Overview
This lecture covers key classes of optimization problems, including quasi-convex optimization, linear programming, quadratic programming, and geometric programming, emphasizing problem transformations, equivalence, and modern solution methods.
Quasi-Convex Optimization
- Quasi-convex functions have convex sublevel sets but may have locally optimal points that are not globally optimal.
- To solve: reduce to a sequence of convex feasibility problems using bisection on the objective value.
- Example: the ratio of a convex function to a positive concave function is quasi-convex, so each sublevel-set test reduces to a convex feasibility check.
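The bisection scheme above can be sketched in a few lines. This is a minimal illustration (the specific objective, interval, and bracket are my own example, not from the lecture): minimizing the quasi-convex ratio f(x) = (x² + 1)/(x + 2) over [0, 5], where each feasibility test f(x) ≤ t is a convex problem solved in closed form.

```python
import numpy as np

# Minimize f(x) = (x**2 + 1) / (x + 2) over x in [0, 5].
# Numerator convex, denominator concave and positive, so f is quasi-convex.
# The sublevel test f(x) <= t is equivalent to the convex feasibility problem
#   x**2 + 1 - t*(x + 2) <= 0  for some x in [0, 5].

def feasible(t):
    # Minimize the convex quadratic q(x) = x^2 - t*x + (1 - 2*t) over [0, 5];
    # the test passes iff its minimum is <= 0. Unconstrained minimizer: x = t/2.
    x = np.clip(t / 2, 0.0, 5.0)
    return x**2 - t * x + (1 - 2 * t) <= 0

lo, hi = 0.0, 10.0           # bracket on the optimal value
for _ in range(60):          # bisection: interval halves each iteration
    mid = (lo + hi) / 2
    if feasible(mid):
        hi = mid             # optimal value is <= mid
    else:
        lo = mid             # optimal value is > mid

print(hi)                    # converges to the minimum of f, 2*sqrt(5) - 4
```

After k iterations the optimal value is bracketed to within (hi − lo)/2ᵏ, which is the standard accuracy guarantee for quasi-convex bisection.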
Linear Programming (LP)
- LP: minimize a linear function over a polyhedron defined by affine inequalities and equalities.
- LPs are widely applicable (e.g., diet problem, scheduling, resource allocation).
- Constraints can be rewritten into standard form via simple transformations.
- Piecewise-linear objectives (the maximum of affine functions) can be handled via the epigraph form, introducing an auxiliary variable.
- Finding centers (e.g., the Chebyshev center of a polyhedron) can be formulated as an LP, despite the apparent nonlinearity.
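The Chebyshev-center trick can be made concrete with an off-the-shelf LP solver. A minimal sketch (the unit-square polyhedron is my own example): for a polyhedron {x : aᵢᵀx ≤ bᵢ}, the ball of center x and radius r fits inside iff aᵢᵀx + r‖aᵢ‖₂ ≤ bᵢ for every i, which is linear in (x, r).

```python
import numpy as np
from scipy.optimize import linprog

# Chebyshev center of the unit square {x : 0 <= x_i <= 1} as an LP:
#   maximize r  subject to  a_i^T x + r * ||a_i||_2 <= b_i  for each half-space.
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 0.0], [0.0, 1.0]])
b = np.array([0.0, 0.0, 1.0, 1.0])
norms = np.linalg.norm(A, axis=1)

# Decision variables: (x1, x2, r); linprog minimizes, so use objective -r.
c = np.array([0.0, 0.0, -1.0])
A_ub = np.hstack([A, norms[:, None]])
res = linprog(c, A_ub=A_ub, b_ub=b,
              bounds=[(None, None), (None, None), (0, None)])

center, radius = res.x[:2], res.x[2]
print(center, radius)   # for the unit square: center (0.5, 0.5), radius 0.5
```

The norm term ‖aᵢ‖₂ is a constant of the data, not a variable, which is why the problem stays linear.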
Linear Fractional & Generalized Linear Fractional Programs
- Linear fractional programs minimize a ratio of affine functions over a polyhedron.
- These can be solved either as quasi-convex problems (bisection plus LP feasibility) or transformed into a single equivalent LP (the Charnes-Cooper transformation).
- Generalized linear-fractional programs minimize the maximum of several such ratios; these remain quasi-convex and are solvable by bisection.
Quadratic Programming (QP)
- QP: minimize a convex quadratic function over a polyhedron (linear constraints).
- Sublevel sets are ellipsoids (when the quadratic form is positive definite); QPs can be solved efficiently, even at large scale.
- Applications include least squares with constraints (e.g., isotonic regression).
- Robust versions address uncertainty in parameters (mean-variance or risk-adjusted optimization).
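Constrained least squares as a QP can be shown with a standard special case. A minimal sketch (the random data and sparse signal are my own illustration, using nonnegativity rather than the isotonic ordering constraint mentioned above): minimizing ‖Ax − b‖² subject to x ≥ 0, which scipy solves directly.

```python
import numpy as np
from scipy.optimize import nnls

# Constrained least squares is a QP: minimize ||A x - b||^2 subject to x >= 0.
# scipy's nnls handles this nonnegative-least-squares special case directly.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, 2.0, 0.0, 0.5])   # nonnegative ground truth
b = A @ x_true

x_hat, resid = nnls(A, b)
print(x_hat, resid)   # recovers x_true with near-zero residual
```

Because b lies exactly in the range of A and the true signal is feasible, the QP optimum coincides with the unconstrained least-squares solution here; with noisy b the nonnegativity constraints would generally be active.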
Second Order Cone Programming (SOCP) & Robust Optimization
- SOCP: minimize a linear function subject to linear equality and second-order (norm) constraints.
- SOCP generalizes LPs and QPs; many problems reduce to SOCP form (e.g., robust LPs, certain statistical constraints).
- Robust optimization handles data uncertainty using ellipsoidal or probabilistic (e.g., Gaussian) models, often leading to SOCP formulations.
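The ellipsoidal-uncertainty reduction can be checked numerically. A small sketch (the specific ā, P, and x values are my own example): for a in the ellipsoid {ā + Pu : ‖u‖₂ ≤ 1}, the worst case of aᵀx is ā ᵀx + ‖Pᵀx‖₂, which is exactly the second-order cone constraint that appears in robust LP.

```python
import numpy as np

# Robust linear constraint: a^T x <= b must hold for every a in the ellipsoid
# {a_bar + P u : ||u||_2 <= 1}. The worst case over the ellipsoid is
#   a_bar^T x + ||P^T x||_2,
# attained at u = P^T x / ||P^T x||_2  (Cauchy-Schwarz).
rng = np.random.default_rng(1)
a_bar = np.array([1.0, 2.0])
P = np.array([[0.5, 0.1], [0.0, 0.3]])
x = np.array([0.7, -0.2])

worst_case = a_bar @ x + np.linalg.norm(P.T @ x)

# Empirical check: sample many unit vectors u and take the max of a^T x.
u = rng.standard_normal((100000, 2))
u /= np.linalg.norm(u, axis=1, keepdims=True)
sampled_max = np.max((a_bar + u @ P.T) @ x)

print(worst_case, sampled_max)   # sampled max approaches the closed form
```

Requiring the closed-form worst case to stay below b is a norm (second-order cone) constraint in x, which is how robust LPs become SOCPs.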
Geometric Programming (GP)
- GP problems involve monomials (products of variables raised to real powers) and posynomials (sums of monomials).
- Not originally convex, but variable change (log-transform) converts GP to a convex problem.
- Widely used in engineering design (e.g., resource allocation, circuit and structure design).
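The log-transform can be verified directly. A minimal sketch (the coefficients and exponents are my own example): with y = log x, each monomial cₖ·x₁^{a₁}·x₂^{a₂} becomes exp(aₖᵀy + log cₖ), so the log of a posynomial is a log-sum-exp of affine functions, a convex function of y.

```python
import numpy as np
from scipy.special import logsumexp

# GP log transform: with y = log x, a monomial c * x1**a1 * x2**a2 becomes
# exp(a^T y + log c), so log(posynomial) = logsumexp of affine functions of y,
# which is convex.
log_c = np.log([2.0, 0.5])          # coefficients of two monomial terms
A = np.array([[1.0, -2.0],          # exponents of term 1
              [3.0,  0.5]])         # exponents of term 2

def log_posynomial(y):
    # log( sum_k c_k * exp(a_k^T y) )
    return logsumexp(A @ y + log_c)

# Spot-check midpoint convexity on random pairs of points.
rng = np.random.default_rng(2)
for _ in range(1000):
    y1, y2 = rng.standard_normal(2), rng.standard_normal(2)
    mid = log_posynomial((y1 + y2) / 2)
    assert mid <= 0.5 * log_posynomial(y1) + 0.5 * log_posynomial(y2) + 1e-12
print("midpoint convexity holds on all sampled points")
```

The sampling is only a sanity check, not a proof; convexity of log-sum-exp composed with affine maps is the standard argument.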
Conic Form & Semidefinite Programming (SDP)
- Conic form problems use general cone constraints (e.g., non-negative orthant, second-order cones, positive semidefinite cones).
- SDP: minimize a linear function subject to linear equality and matrix inequality constraints (LMI).
- Many problems in control, statistics, and theoretical CS can be rephrased as SDPs.
- Max eigenvalue and induced norm minimization can be formulated as SDPs.
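The max-eigenvalue SDP can be illustrated numerically. A small sketch (the 3×3 symmetric matrix is my own example): λ_max(A) = min{t : tI − A ⪰ 0}, so the matrix inequality holds exactly when t reaches the top eigenvalue and fails for any smaller t.

```python
import numpy as np

# SDP view of the maximum eigenvalue:
#   lambda_max(A) = min { t : t*I - A is positive semidefinite }.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
lam_max = np.max(np.linalg.eigvalsh(A))

def is_psd(M, tol=1e-9):
    # A symmetric matrix is PSD iff its smallest eigenvalue is >= 0.
    return np.min(np.linalg.eigvalsh(M)) >= -tol

n = A.shape[0]
print(is_psd(lam_max * np.eye(n) - A))           # True at t = lambda_max
print(is_psd((lam_max - 1e-3) * np.eye(n) - A))  # False just below it
```

An SDP solver would minimize the scalar t subject to the linear matrix inequality tI − A ⪰ 0; this check just confirms that the feasibility boundary sits exactly at λ_max.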
Key Terms & Definitions
- Quasi-Convex Function — Function with convex sublevel sets; may have local minima that are not global.
- Polyhedron — Intersection of finitely many half-spaces; feasible set in LP and QP.
- Epigraph Form — Reformulation using an auxiliary variable to move the objective (e.g., a max of affine functions) into the constraints.
- Chebyshev Center — Center of the largest ball inscribed in a polyhedron.
- Linear Fractional Function — Ratio of two affine functions.
- Quadratic Program (QP) — Optimization with quadratic objective, linear constraints.
- Second Order Cone Program (SOCP) — Optimization with linear and second-order (norm) constraints.
- Geometric Programming (GP) — Problems with posynomial objectives/constraints, transformed to convex via log change.
- Conic Form — General optimization with constraints defined by a cone.
- Semidefinite Program (SDP) — Optimization with a linear matrix inequality (LMI) constraint: an affine matrix function required to be positive semidefinite.
Action Items / Next Steps
- Review class notes for each problem class and transformation technique.
- Familiarize yourself with standard problem forms for LP, QP, SOCP, GP, and SDP.
- Prepare for next lecture on duality in optimization.