
A Framework for Integrating Exact and Heuristic Optimization Methods

John Hooker
Carnegie Mellon University
Matheuristics 2012

Exact and Heuristic Methods

• Generally regarded as very different.
• Exact methods:
  – Exhaustive search (branching, Benders, etc.)
• Heuristic methods:
  – Local search.
  – Imitation of a natural process, such as annealing or biological evolution.

Exact and Heuristic Methods

• This division has drawbacks:
  – One must switch algorithms when instances scale up.
  – It draws attention away from the underlying unity of the methods.
• The ideal:
  – A single method that transitions gracefully from exact to heuristic as instances scale up.
  – Cross-fertilization of ideas.
  – "Heuristic" = "search" or "find" (from the Greek εὑρίσκω).

Exact and Heuristic Methods

• The two share a common primal-dual structure.
  – This allows transfer of inference and relaxation techniques from exact methods to heuristics.
  – And transfer of local search ideas to exact methods.
  – For example, strong branching in MIP can be viewed as a local search method for solving a dual problem.

Exact and Heuristic Methods

• Another advantage of unification:
  – There is no a priori reason that one metaheuristic should work better than another (the "no free lunch" theorem).
  – Solution methods must exploit problem structure (the "full employment" theorem).
  – Inference and relaxation methods exploit problem structure.

Outline

• Primal-dual framework
  – Inference dual
  – Relaxation dual
  – Constraint-directed search
  – DPLL
• Exact methods
  – Simplex
  – Branch and bound
  – Benders decomposition
• Heuristic methods
  – Local search
  – GRASP
  – Tabu search
  – Genetic algorithms
  – Ant colony optimization
  – Particle swarm optimization
• Summing up

• Caveat…
  – This is a high-level talk.
  – Don't worry too much about technical details.

Inference Dual

• Find the tightest bound on the objective function that can be deduced from the constraints,
  – using a specified method of logical deduction.

Primal: $\min \{ f(x) \mid x \in S \}$

Inference dual: $\max_{v,\,P} \{\, v \mid (x \in S) \stackrel{P}{\Rightarrow} (f(x) \ge v) \,\}$

where the proof $P$ belongs to a specified proof family.

Inference Dual

• For example, the LP dual:

Primal: $\min_{x \ge 0} \{ cx \mid Ax \ge b \}$

Dual: $\max_{v,\,P} \{\, v \mid (Ax \ge b) \stackrel{P}{\Rightarrow} (cx \ge v) \,\}$

where the proof $P$ is a nonnegative linear combination: $uAx \ge ub$ dominates $cx \ge v$ for some $u \ge 0$, i.e. $uA \le c$ and $ub \ge v$. This yields the classical dual

$\max_{u \ge 0} \{ ub \mid uA \le c \}$
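As a concrete check on the LP inference dual, here is a minimal Python sketch. The instance (A, b, c) and the multipliers u are illustrative assumptions, not from the talk; the point is that any u ≥ 0 with uA ≤ c certifies the bound cx ≥ ub for every feasible x.

```python
# Primal: min {cx | Ax >= b, x >= 0}. A dual-feasible u >= 0 with
# uA <= c is a "proof" in the inference-dual sense: the surrogate
# uAx >= ub dominates cx >= ub, so ub lower-bounds the optimum.
# The instance below is made up for illustration.

def dot(p, q):
    return sum(pi * qi for pi, qi in zip(p, q))

A = [[2.0, 1.0],
     [1.0, 3.0]]
b = [4.0, 6.0]
c = [3.0, 4.0]

u = [1.0, 1.0]                      # candidate dual multipliers
assert all(ui >= 0 for ui in u)     # u >= 0

# Check uA <= c componentwise (u times each column of A).
uA = [dot(u, [row[j] for row in A]) for j in range(len(c))]
assert all(uA[j] <= c[j] + 1e-9 for j in range(len(c)))

bound = dot(u, b)                   # valid bound: cx >= ub = 10
print("dual bound:", bound)

# Verify on a feasible point: x = (1.2, 1.6) gives Ax = (4, 6) >= b.
x = [1.2, 1.6]
print("cx =", dot(c, x), ">=", bound)
```

With this particular u the bound happens to be tight at x = (1.2, 1.6); in general ub ≤ cx is weak duality, and the classical dual searches for the u that maximizes ub.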
Inference Dual

• Standard optimization duals are inference duals that use different inference methods:
  – LP dual: nonnegative linear combination + domination.
  – Surrogate dual: the same, but for NLP and IP.
  – Lagrangean dual: the same, but with a stronger form of domination.
  – Subadditive dual: subadditive homogeneous function + domination.

Relaxation Dual

• Find the relaxation that gives the tightest bound on the objective function.
  – The relaxation is parameterized by dual variables.

Primal: $\min \{ f(x) \mid x \in S \}$

Relaxation dual: $\max_{u \in U} \theta(u)$, where

$\theta(u) = \min_{x \in S'(u)} f'(x, u)$

is a relaxation of the primal, parameterized by $u$.

Relaxation Dual

• Example: the Lagrangean dual.

Primal: $\min \{ f(x) \mid g(x) \le 0,\ x \in S \}$

Relaxation dual: $\max_{u \ge 0} \theta(u)$, where

$\theta(u) = \min_{x \in S} \{ f(x) + u\,g(x) \}$
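To illustrate the Lagrangean relaxation dual, here is a minimal Python sketch on a tiny made-up 0-1 program (the instance is an illustrative assumption, not from the talk). It evaluates θ(u) by enumerating S and maximizes the dual by a crude grid search, where a real implementation would use, e.g., a subgradient method.

```python
# Tiny illustrative instance:
#   min -5x1 - 4x2   s.t.  3x1 + 2x2 - 4 <= 0,   x in S = {0,1}^2,
# with the inequality dualized into the objective.
from itertools import product

def f(x):                       # objective
    return -5 * x[0] - 4 * x[1]

def g(x):                       # dualized constraint, g(x) <= 0
    return 3 * x[0] + 2 * x[1] - 4

S = list(product([0, 1], repeat=2))   # the "easy" constraints

def theta(u):
    # theta(u) = min over x in S of f(x) + u*g(x); for any u >= 0
    # this is a lower bound on the primal optimum (weak duality).
    return min(f(x) + u * g(x) for x in S)

# Relaxation dual: max_{u >= 0} theta(u), here by grid search.
best_u, best_bound = max(
    ((u / 100, theta(u / 100)) for u in range(0, 401)),
    key=lambda t: t[1],
)
print(f"Lagrangean bound {best_bound:.3f} at u = {best_u:.2f}")

# Primal optimum by enumeration, for comparison.
opt = min(f(x) for x in S if g(x) <= 0)
print("primal optimum:", opt)
```

On this instance the dual peaks near u = 5/3 with a bound of about -7.33 against a primal optimum of -5: the relaxation dual gives a valid but not necessarily tight bound (a duality gap), as expected for an integer program.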
Primal-Dual Algorithm

• Enumerate restrictions:
  – Branching tree nodes.
  – Benders subproblems.
  – Local search neighborhoods.
• Derive bounds to prune the search:
  – From an inference or relaxation dual.
  – For example, LP bounds in MIP.

Primal-Dual Algorithm

• Key issue: how restrictive are the restrictions?
  – Tighten restrictions until they can be solved, as in branching.
  – Relax restrictions until solution quality improves, as in large neighborhood search.

Primal-Dual Algorithm

• Let the inference dual guide the search: constraint-directed search.
  – The solution of the inference dual provides a nogood constraint, as in:
    – branching (perhaps with conflict analysis),
    – SAT algorithms with clause learning,
    – Benders decomposition,
    – dynamic backtracking,
    – tabu search.

Primal-Dual Algorithm

• Let the relaxation dual guide the search.
  – The solution of the relaxation suggests how to tighten it,
  – as when branching on fractional variables.

Constraint-Directed Search

• Start with an example: SAT.
  – Solve with DPLL: branching + the unit clause rule, e.g.
    $(x_1 \lor x_2 \lor x_3 \lor x_4)$ and $\bar{x}_2$ yield $(x_1 \lor x_3 \lor x_4)$.
  – The dual solution at an infeasible node is a unit clause proof.
  – Identify the branches that play a role in the proof.
  – This yields a nogood constraint (conflict clause).

Constraint-Directed Search

• The problem: find a satisfying solution of

  $x_1 \lor x_5 \lor x_6$
  $x_1 \lor x_5 \lor \bar{x}_6$
  $x_2 \lor \bar{x}_5 \lor x_6$
  $x_2 \lor \bar{x}_5 \lor \bar{x}_6$
  $\bar{x}_1 \lor x_3 \lor x_4$
  $\bar{x}_2 \lor x_3 \lor x_4$
  $\bar{x}_1 \lor \bar{x}_3$
  $\bar{x}_1 \lor \bar{x}_4$
  $\bar{x}_2 \lor \bar{x}_3$
  $\bar{x}_2 \lor \bar{x}_4$

DPLL

• Branch to $x_1 = x_2 = x_3 = x_4 = 0$, then $x_5 = 0$.
• Solving the subproblem with the unit clause rule proves infeasibility; $(x_1, \ldots, x_5) = (0, \ldots, 0)$ creates the infeasibility.
• Generate the nogood $x_1 \lor x_2 \lor x_3 \lor x_4 \lor x_5$.

DPLL

• The relaxation $R_k$ consists of the processed nogoods in iteration $k$; each new nogood is induced by the solution of $R_k$.
• Go to a solution that solves the relaxation, giving priority to 0.

  k   Relaxation R_k              Solution of R_k    Nogoods
  0   (empty)                     (0,0,0,0,0,·)      x1 ∨ x2 ∨ x3 ∨ x4 ∨ x5
  1   x1 ∨ x2 ∨ x3 ∨ x4 ∨ x5      (0,0,0,0,1,·)      x1 ∨ x2 ∨ x3 ∨ x4 ∨ ¬x5
  2   x1 ∨ x2 ∨ x3 ∨ x4           (0,0,0,1,0,·)      …

• Process the nogood set with parallel resolution: x1 ∨ x2 ∨ x3 ∨ x4 ∨ x5 and x1 ∨ x2 ∨ x3 ∨ x4 ∨ ¬x5 parallel-resolve to x1 ∨ x2 ∨ x3 ∨ x4, which parallel-absorbs both.
• Solve the relaxation again, and continue.
• So backtracking is nogood-based search with parallel resolution.

Constraint-Directed Search

• Use stronger nogoods = conflict clauses.
• Nogoods rule out only the branches that play a role in the unit clause proof.

DPLL with conflict clauses

• Branch to $x_1 = \cdots = x_5 = 0$ as before; the unit clause rule proves infeasibility.
• $(x_1, x_5) = (0, 0)$ is the only premise of the unit clause proof, so the conflict clause is $x_1 \lor x_5$.

  k   Relaxation R_k    Solution of R_k    Nogoods
  0   (empty)           (0,0,0,0,0,·)      x1 ∨ x5
  1   x1 ∨ x5           (0,0,0,0,1,·)      x2 ∨ ¬x5
  2   x1 ∨ x2           (0,1,·,·,·,·)      x1 ∨ ¬x2
  3   x1                (1,·,·,·,·,·)      ¬x1
  4   ∅                 —                  —

• x1 ∨ x5 and x2 ∨ ¬x5 parallel-resolve to x1 ∨ x2, which parallel-absorbs both.
• x1 ∨ x2 and x1 ∨ ¬x2 parallel-resolve to x1, and x1 with ¬x1 yields the empty clause ∅.
• The relaxation $R_4$ contains the empty clause, so the search terminates: the instance is unsatisfiable. (A code sketch of this nogood-based loop appears below.)

Constraint-Directed Search

• Suppose we search over partial solutions.
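To make the walkthrough concrete, here is a minimal runnable Python sketch of the constraint-directed loop on the clause set above. Two caveats: the clause polarities were reconstructed from the trace and should be treated as an assumption, and this implements only the weak-nogood variant (each nogood excludes one complete assignment, with a brute-force relaxation solver standing in); the parallel-resolution compression and conflict-clause strengthening from the slides are omitted.

```python
# Constraint-directed search for SAT, weak-nogood version.
# A clause is a frozenset of signed literals: +i means x_i = 1,
# -i means x_i = 0. CLAUSES is the reconstructed instance.
from itertools import product

CLAUSES = [frozenset(c) for c in [
    (1, 5, 6), (1, 5, -6), (2, -5, 6), (2, -5, -6),
    (-1, 3, 4), (-2, 3, 4), (-1, -3), (-1, -4), (-2, -3), (-2, -4),
]]
N = 6

def satisfies(clauses, x):
    # x maps variable -> 0/1; literal +i holds iff x_i = 1, -i iff x_i = 0.
    return all(any((lit > 0) == (x[abs(lit)] == 1) for lit in clause)
               for clause in clauses)

def solve_relaxation(nogoods):
    # Find an assignment satisfying all processed nogoods, giving
    # priority to 0 (brute force; a stand-in for a real relaxation solver).
    for bits in product([0, 1], repeat=N):
        x = {i + 1: bits[i] for i in range(N)}
        if satisfies(nogoods, x):
            return x
    return None                       # relaxation infeasible

nogoods = []
while True:
    x = solve_relaxation(nogoods)
    if x is None:
        print("relaxation infeasible: the instance is unsatisfiable")
        break
    if satisfies(CLAUSES, x):
        print("satisfying solution:", x)
        break
    # Weak nogood: exclude exactly this assignment (flip every value).
    nogoods.append(frozenset(i if x[i] == 0 else -i
                             for i in range(1, N + 1)))
```

The slides improve on this in two ways: parallel resolution compresses the nogood store (so backtracking emerges from the nogood mechanism), and conflict clauses keep only the branches that appear as premises of the unit clause proof, which shortens the search from potentially $2^n$ iterations to the five shown in the trace.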