
A Brief Overview of Optimization Problems

Steven G. Johnson
MIT course 18.335, Fall 2008

Why optimization?

• In some sense, all engineering design is optimization: choosing design parameters to improve some objective
• Much of data analysis is also optimization: extracting some model parameters from data while minimizing some error (e.g. fitting)
• Most business decisions = optimization: varying some decision parameters to maximize profit (e.g. investment portfolios, supply chains, etc.)

A general optimization problem

minimize an objective function f_0 with respect to n design parameters x (also called decision parameters, optimization variables, etc.):

    min_{x ∈ ℝⁿ} f_0(x)

subject to m constraints:

    f_i(x) ≤ 0,   i = 1, 2, …, m

– note that maximizing g(x) corresponds to f_0(x) = −g(x)
– note that an equality constraint h(x) = 0 yields two inequality constraints, f_i(x) = h(x) and f_{i+1}(x) = −h(x) (although, in practical algorithms, equality constraints typically require special handling)

x is a feasible point if it satisfies all the constraints; the feasible region is the set of all feasible x.
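As a minimal sketch of this general formulation in code (not from the original slides), here is a made-up objective with one inequality constraint, solved with SciPy's SLSQP method:

```python
# Sketch of min f0(x) subject to f1(x) <= 0, using scipy.optimize.
# The objective and constraint are illustrative, not from the slides.
import numpy as np
from scipy.optimize import minimize

def f0(x):  # objective: a simple quadratic
    return (x[0] - 1.0)**2 + (x[1] - 2.0)**2

def f1(x):  # constraint f1(x) = x0 + x1 - 2 <= 0
    return x[0] + x[1] - 2.0

# SciPy's "ineq" convention is g(x) >= 0, so pass -f1(x) >= 0.
result = minimize(f0, x0=np.zeros(2), method="SLSQP",
                  constraints=[{"type": "ineq", "fun": lambda x: -f1(x)}])
print(result.x, result.fun)  # a local optimum of the constrained problem
```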

Important considerations

• Global versus local optimization
• Convex vs. non-convex optimization
• Unconstrained or box-constrained optimization, and other special-case constraints
• Special classes of functions (linear, etc.)
• Differentiable vs. non-differentiable functions
• Gradient-based vs. derivative-free algorithms
• …
• Zillions of different algorithms, usually restricted to various special cases, each with strengths/weaknesses

Global vs. Local Optimization

• For general nonlinear functions, most algorithms only guarantee a local optimum
  – that is, a feasible x_o such that f_0(x_o) ≤ f_0(x) for all feasible x within some neighborhood ‖x − x_o‖ < R (for some small R)
• A much harder problem is to find a global optimum: the minimum of f_0 for all feasible x
  – exponentially increasing difficulty with increasing n; practically impossible to guarantee that you have found the global minimum without knowing some special property of f_0
  – many available algorithms, problem-dependent efficiencies
  – not just genetic algorithms or simulated annealing (which are popular, easy to implement, and thought-provoking, but usually very slow!)
  – for example, non-random systematic search algorithms (e.g. DIRECT), partially randomized searches (e.g. CRS2), repeated local searches from different starting points ("multistart" algorithms, e.g. MLSL), …

Convex Optimization

[ good reference: Convex Optimization by Boyd and Vandenberghe, free online at www.stanford.edu/~boyd/cvxbook ]

All the functions f_i (i = 0…m) are convex:

    f_i(αx + βy) ≤ α f_i(x) + β f_i(y)   where α + β = 1, α, β ∈ [0, 1]

[ figure: a convex f lies below the chord α f(x) + β f(y) between any two points x, y; a non-convex f does not ]

For a convex problem (convex objective & constraints), any local optimum must be a global optimum
⇒ efficient, robust solution methods available
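To illustrate the multistart idea mentioned on the global-vs-local slide above, here is a minimal sketch of the basic strategy (repeated local searches from random starting points; this is not the MLSL algorithm itself, and the multimodal objective is made up):

```python
# Multistart sketch: run a local optimizer from many starting points
# and keep the best result. No global-optimality guarantee.
import numpy as np
from scipy.optimize import minimize

def f0(x):
    return np.sin(3 * x[0]) + (x[0] - 0.5)**2  # several local minima

rng = np.random.default_rng(0)
best = None
for _ in range(20):                       # 20 random starting points
    x_start = rng.uniform(-3, 3, size=1)  # sample a search box
    r = minimize(f0, x_start)             # local gradient-based search
    if best is None or r.fun < best.fun:
        best = r                          # keep the best local optimum

print(best.x, best.fun)
```

For a convex problem, a single local search would already suffice, since any local optimum is global.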

Important Convex Problems

• LP (linear programming): the objective and constraints are affine: f_i(x) = a_iᵀx + β_i
• QP (quadratic programming): affine constraints + convex quadratic objective xᵀAx + bᵀx
• SOCP (second-order cone program): LP + cone constraints ‖Ax + b‖₂ ≤ aᵀx + β
• SDP (semidefinite programming): constraints are that Σ_k A_k x_k is positive-semidefinite

all of these have very efficient, specialized solution methods
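As a minimal LP sketch (illustrative data, not from the slides), using SciPy's linprog:

```python
# LP: minimize c^T x subject to A x <= b and x >= 0.
import numpy as np
from scipy.optimize import linprog

c = np.array([1.0, 2.0])               # affine objective f0(x) = c^T x
A = np.array([[-1.0, 1.0],
              [ 1.0, 1.0]])
b = np.array([1.0, 3.0])               # two affine constraints A x <= b

res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
print(res.x, res.fun)  # LP solvers find the global optimum efficiently
```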

"Akxk is positive-semidefinite" •! Linear equality constraints Ax=b" –! for example, can be explicitly eliminated from the all of these have very efficient, specialized solution methods" problem by writing x=Ny+#, where # is a solution to A#=b and N is a basis for the nullspace of A! Derivatives of fi" Removable non-differentiability"

Derivatives of f_i

• Most-efficient algorithms typically require the user to supply the gradients ∇_x f_i of the objective/constraints
  – you should always compute these analytically
  – rather than use finite-difference approximations, better to just use a derivative-free optimization algorithm
• in principle, one can always compute ∇_x f_i with about the same cost as f_i, using adjoint methods
  – gradient-based methods can find (local) optima of problems with millions of design parameters
• Derivative-free methods: only require f_i values
  – easier to use, can work with complicated "black-box" functions where computing gradients is inconvenient
  – may be the only possibility for nondifferentiable problems
  – need > n function evaluations, bad for large n

Removable non-differentiability

consider the non-differentiable unconstrained problem:

    min_{x ∈ ℝⁿ} |f_0(x)|

[ figure: |f_0(x)| is the upper envelope of f_0(x) and −f_0(x), with a kink at the optimum ]

equivalent to the minimax problem:

    min_{x ∈ ℝⁿ} ( max { f_0(x), −f_0(x) } )

…still nondifferentiable…

…equivalent to a constrained problem with a "temporary" variable t:

    min_{x ∈ ℝⁿ, t ∈ ℝ} t   subject to:
        t ≥ f_0(x)     ( f_1(x) = f_0(x) − t ≤ 0 )
        t ≥ −f_0(x)    ( f_2(x) = −f_0(x) − t ≤ 0 )
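Here is a minimal sketch of this "temporary variable" (epigraph) trick in code, with a made-up smooth f_0; the augmented variable vector z packs (x, t) together:

```python
# Minimize |f0(x)| by minimizing t subject to t >= f0(x), t >= -f0(x).
import numpy as np
from scipy.optimize import minimize

def f0(x):
    return x[0]**2 - 0.5   # |f0| has a kink wherever f0 crosses zero

obj = lambda z: z[-1]      # minimize t, the last component of z = (x, t)
cons = [
    {"type": "ineq", "fun": lambda z: z[-1] - f0(z[:-1])},  # t - f0(x) >= 0
    {"type": "ineq", "fun": lambda z: z[-1] + f0(z[:-1])},  # t + f0(x) >= 0
]

z0 = np.array([2.0, 10.0])  # initial x and a generous initial t
res = minimize(obj, z0, method="SLSQP", constraints=cons)
print(res.x[:-1], res.x[-1])  # optimal x, and t = |f0(x)| at the optimum
```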

Example: Chebyshev linear fitting

fit a line x_1 a + x_2 to N points (a_i, b_i): find the fit that minimizes the maximum error:

    min_{x_1, x_2} ( max_i |x_1 a_i + x_2 − b_i| )

[ figure: N points (a_i, b_i) in the (a, b) plane, with the fitted line ]

… a nondifferentiable minimax problem, equivalent to a linear programming problem (LP):

    min_{x_1, x_2, t} t   subject to 2N constraints:
        x_1 a_i + x_2 − b_i − t ≤ 0
        b_i − x_1 a_i − x_2 − t ≤ 0
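A minimal sketch of this LP in code (the data points are made up for illustration); the variables are (x_1, x_2, t) and the 2N constraints are assembled in linprog's A_ub @ z <= b_ub form:

```python
# Chebyshev (minimax) line fit as an LP over (x1, x2, t).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
a = np.linspace(0, 1, 20)                       # N = 20 sample points
b = 3.0 * a + 0.5 + rng.normal(0, 0.1, a.size)  # noisy line data

N = a.size
c = np.array([0.0, 0.0, 1.0])                   # objective: minimize t
#   x1*a_i + x2 - t <= b_i   and   -x1*a_i - x2 - t <= -b_i
A_ub = np.block([[ a[:, None],  np.ones((N, 1)), -np.ones((N, 1))],
                 [-a[:, None], -np.ones((N, 1)), -np.ones((N, 1))]])
b_ub = np.concatenate([b, -b])

# x1, x2, t must be free variables (linprog defaults to x >= 0).
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 3)
x1, x2, t = res.x
print(x1, x2, t)   # slope, intercept, and the minimized maximum error
```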

Relaxations of Integer Programming

If x is integer-valued rather than real-valued (e.g. x ∈ {0,1}ⁿ), the resulting integer programming (or combinatorial optimization) problem becomes much harder in general.

However, useful results can often be obtained by a continuous relaxation of the problem, e.g. going from x ∈ {0,1}ⁿ to x ∈ [0,1]ⁿ

… at the very least, this gives a lower bound on the optimum f_0
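A minimal sketch of such a relaxation (illustrative data): solve the LP over x ∈ [0,1]ⁿ, whose optimum lower-bounds the {0,1}ⁿ problem, and compare against brute force for tiny n:

```python
# Continuous relaxation of a small 0/1 problem: the LP over [0,1]^n
# gives a lower bound on the integer optimum.
import numpy as np
from itertools import product
from scipy.optimize import linprog

c = np.array([-2.0, -3.0, -1.0])   # minimize c^T x (i.e. maximize value)
A = np.array([[3.0, 4.0, 2.0]])    # one "knapsack-like" constraint
b = np.array([5.0])

relaxed = linprog(c, A_ub=A, b_ub=b, bounds=[(0, 1)] * 3)
print(relaxed.fun)  # <= the optimal value of the integer problem

# Brute force over {0,1}^3 for comparison (only viable for tiny n):
feasible = [x for x in product([0, 1], repeat=3)
            if (A @ np.array(x) <= b).all()]
print(min(c @ np.array(x) for x in feasible))
```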

Example: Topology Optimization

design a structure to do something, made of material A or B…
let every pixel of the discretized structure vary continuously from A to B
  – the density of each pixel varies continuously from 0 (air) to max

ex: design a cantilever to support maximum weight with a fixed amount of material

[ figure: optimized cantilever structure, deformed under the applied force/load ]

[ Buhl et al., Struct. Multidisc. Optim. 19, 93–104 (2000) ]

Some Sources of Software

• Decision tree for optimization software: http://plato.asu.edu/guide.html
  – lists many packages for many problems
• CVX: general convex-optimization package: http://www.stanford.edu/~boyd/cvx
• NLopt: implements many nonlinear optimization algorithms (global/local, constrained/unconstrained, derivative/no-derivative): http://ab-initio.mit.edu/nlopt
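As a minimal sketch of NLopt's Python interface (assuming the nlopt Python module is installed; the quadratic objective is made up for illustration):

```python
# Box-constrained local optimization with NLopt's COBYLA algorithm.
import numpy as np
import nlopt

def f0(x, grad):
    # NLopt passes a gradient array to fill in for gradient-based
    # algorithms; it is empty for derivative-free ones like COBYLA.
    target = np.array([1.0, 2.0])
    if grad.size > 0:
        grad[:] = 2.0 * (x - target)
    return float(np.sum((x - target)**2))

opt = nlopt.opt(nlopt.LN_COBYLA, 2)   # local, derivative-free algorithm
opt.set_min_objective(f0)
opt.set_lower_bounds([-10.0, -10.0])  # box constraints
opt.set_upper_bounds([10.0, 10.0])
opt.set_xtol_rel(1e-6)

x_opt = opt.optimize(np.array([0.0, 0.0]))
print(x_opt, opt.last_optimum_value())
```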