
Local Search


Combinatorial Optimization Problems

F: a set of feasible solutions; c: an objective function. Find a point f ∈ F which minimizes c.

Neighborhood
• points which are close in some fashion
• N: F → 2^F

Local Search

Graph Exploration

• move from one node to one of its neighbors

Local Improvement

Local Minima

A feasible solution f is locally optimal w.r.t. N if c(f) ≤ c(g) ∀g ∈ N(f)

Basic idea behind local improvement: “improve an initial solution until a local minimum is found”

improve(f) =
  any s ∈ N(f) such that c(s) < c(f)
  ‘no’ otherwise

Local Search
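The improve(f) routine above drives a simple loop; here is a minimal, runnable Python sketch (the neighborhood and cost functions are illustrative placeholders):

```python
def local_improvement(f, neighbors, c):
    """Move to any improving neighbor until f is locally optimal
    w.r.t. the neighborhood (i.e., until improve(f) answers 'no')."""
    improved = True
    while improved:
        improved = False
        for s in neighbors(f):
            if c(s) < c(f):       # improve(f): some s with c(s) < c(f)
                f = s
                improved = True
                break             # first improvement: take it immediately
    return f
```

For example, minimizing c(x) = x² over the integers with neighborhood {x − 1, x + 1} converges to 0 from any start.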

r := some initial starting point in F;
while improve(r) != ‘no’ do
  r := improve(r);
return(r);

Graph Partitioning

We are given a graph G = (V, E) with |V| = 2n and a cost dij on each edge (xi, xj).

Find a partition V = A ∪ B (A ∩ B = ∅) with |A| = |B| such that:

∑ dij over i ∈ A, j ∈ B is minimized

Graph Partitioning

Starting points
• split the vertices into two sets

Neighborhood N(f) where f = (A, B)

N((A, B)) = { (A′, B′) | A′ = (A – {e1}) ∪ {e2},
              B′ = (B – {e2}) ∪ {e1},
              e1 ∈ A, e2 ∈ B }

“Swap two elements”

Swap, Swap, Swap, Swap
• Kernighan & Lin (during the golden sixties)

Graph Partitioning

Important Issues in implementing local search

• incremental data structures
• meta-heuristics (randomness component)

Graph Partitioning

Updating the cost function

(A, B) → c
(A′, B′) → ?

External Cost E(a) = ∑ dai over i ∈ B

Internal Cost I(a) = ∑ dai over i ∈ A

Delta D(a) = E(a) – I(a)

Gain/Cost for a move g(a, b) = D(a) + D(b) – 2dab

Hence c(A′, B′) = c(A, B) – g(a, b)

Graph Partitioning
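These definitions can be checked directly; a sketch in Python, assuming `d` is a symmetric cost matrix with a zero diagonal:

```python
def cut_cost(A, B, d):
    # c(A, B): total cost of the edges crossing the partition
    return sum(d[i][j] for i in A for j in B)

def delta(v, side, other, d):
    # D(v) = E(v) - I(v): external minus internal cost of vertex v
    E = sum(d[v][j] for j in other)           # E(v)
    I = sum(d[v][i] for i in side if i != v)  # I(v)
    return E - I

def gain(a, b, A, B, d):
    # g(a, b) = D(a) + D(b) - 2*d_ab for a in A, b in B
    return delta(a, A, B, d) + delta(b, B, A, d) - 2 * d[a][b]
```

After swapping a and b, c(A′, B′) = c(A, B) − g(a, b) holds by construction.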

Updating the external and internal costs

• compute D(v) for all v ∈ V
• swap (a, b); then for x ∈ A – {a}:

D′(x) = E′(x) – I′(x)
E′(x) = E(x) + dxa – dxb
I′(x) = I(x) + dxb – dxa

D′(x) = E(x) – I(x) + 2dxa – 2dxb = D(x) + 2dxa – 2dxb

Graph Partitioning
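A sketch of this incremental update (`Dvals`, a map from each vertex to its current D value, is an assumed data structure):

```python
def update_after_swap(Dvals, a, b, A, B, d):
    """Apply D'(x) = D(x) + 2*d_xa - 2*d_xb for x in A - {a} and the
    symmetric rule for y in B - {b}, instead of recomputing from scratch.
    (The swapped vertices a and b themselves are handled separately.)"""
    for x in A:
        if x != a:
            Dvals[x] += 2 * d[x][a] - 2 * d[x][b]
    for y in B:
        if y != b:
            Dvals[y] += 2 * d[y][b] - 2 * d[y][a]
```

Each swap thus costs O(|V|) updates rather than a full O(|V|²) recomputation.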

Neighborhood

• What is the time required to search the neighborhood?

KEYWORD
• 2-opt

New Ideas
• How about exchanging 2 elements of A with 2 elements of B?

More KEYWORDS: 3-opt, 4-opt, 5-opt

In general

(A, B) → (A*, B*) where A* = (A \ X) ∪ Y, B* = (B \ Y) ∪ X, with X ⊆ A, Y ⊆ B

|X| may be large!

Variable Depth Local Search

Basic idea
• replace the notion of one favorable swap by a search for a favorable sequence of swaps
• do not search the complete neighborhood of sequences but use the costs to guide the search

Algorithm
• compute D(v) for all v ∈ V

• choose a′1, b′1 so that g1 = D(a′1) + D(b′1) – 2da′1b′1 is as large as possible

• swap a′1, b′1 and recompute the D values:
  D′(x) = D(x) + 2dxa′1 – 2dxb′1 for x ∈ A – {a′1}
  D′(y) = D(y) + 2dyb′1 – 2dya′1 for y ∈ B – {b′1}
• repeat the process to obtain a sequence of pairs

(a′2, b′2), …, (a′n, b′n) where all the elements are distinct

Variable Depth Local Search

Exchanging {a′1, …, a′k} with {b′1, …, b′k} gives g(k) = g1 + … + gk as the resulting decrease (or, if negative, increase) in cost. What is g(n)? (It is 0: exchanging everything simply swaps A and B.)

Select k such that g(k) is maximized.

If g(k) ≤ 0, terminate; else exchange the sets and start over.

Graph Partitioning
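The whole variable-depth pass can be sketched as follows (assumptions: `d` is a symmetric cost matrix with zero diagonal; ties in the max are broken arbitrarily; each swapped pair is simply locked and removed from the candidate sets):

```python
def kl_pass(A, B, d):
    """One Kernighan-Lin pass: greedily build a sequence of tentative
    swaps, then commit the prefix with the largest cumulative gain."""
    A, B = set(A), set(B)
    Dv = {v: sum(d[v][j] for j in B) - sum(d[v][i] for i in A if i != v) for v in A}
    Dv.update({v: sum(d[v][j] for j in A) - sum(d[v][i] for i in B if i != v) for v in B})
    CA, CB = set(A), set(B)          # unlocked candidates
    gains, pairs = [], []
    while CA and CB:
        # choose the pair maximizing g = D(a) + D(b) - 2*d_ab
        g, a, b = max((Dv[a] + Dv[b] - 2 * d[a][b], a, b) for a in CA for b in CB)
        gains.append(g)
        pairs.append((a, b))
        CA.discard(a)
        CB.discard(b)
        for x in CA:                  # incremental D update
            Dv[x] += 2 * d[x][a] - 2 * d[x][b]
        for y in CB:
            Dv[y] += 2 * d[y][b] - 2 * d[y][a]
    # select k maximizing g(k) = g1 + ... + gk; stop if no prefix helps
    best_k, best_g, cum = 0, 0, 0
    for k, g in enumerate(gains, 1):
        cum += g
        if cum > best_g:
            best_k, best_g = k, cum
    for a, b in pairs[:best_k]:       # commit the chosen prefix of swaps
        A.remove(a); B.remove(b); A.add(b); B.add(a)
    return A, B, best_g
```

On a graph with two heavy clusters {0, 1} and {2, 3} and light cross edges, one pass starting from the bad partition {0, 2} | {1, 3} recovers the clustered partition.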

Complexity of one step

• quadratic in the number of vertices
• 100,000 vertices?

Can we get a linear neighborhood?
• which preserves, more or less, the quality of the solutions

Basic idea
• consider only the “best” vertices in each set, i.e., the vertices with the best possible gain
• swap only those

Observation
• it is possible to find these in linear time
• it is possible to maintain them incrementally in an optimal fashion
• the quality of the solution does not really deteriorate

Summary So Far

Local Search • graph exploration • starts from a node • moves from a node to one of its neighbors

What is a neighbor?
• swapping
• choosing a good neighborhood is not easy

Properties of neighborhoods
• size
• connectedness: can I move to the optimal solution by a sequence of moves?

Heuristics
• local improvement (first, best)

Meta-heuristics
• variable depth local search

Graph Partitioning

The neighborhood so far • assume that we have a feasible solution • this may restrict the search

Relaxing feasibility
• allow nodes to represent non-feasible solutions
• use a penalty term in the objective function to drive the search towards feasible solutions

Graph Partitioning

Non-feasible neighborhood

• move a vertex from one set to the other

How to express the objective function?
• represent the imbalance: IMB = |A| – |B|
• the objective function to minimize becomes

f + α · IMB²
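A sketch of the penalized objective (`alpha` is an assumed penalty weight; it must be large enough to dominate typical edge costs):

```python
def penalized_cost(A, B, d, alpha=10):
    # f + alpha * IMB^2, where f is the cut cost and IMB = |A| - |B|
    cut = sum(d[i][j] for i in A for j in B)
    imb = len(A) - len(B)
    return cut + alpha * imb * imb
```

Starting from a balanced partition, moving one vertex makes IMB = ±2, so a single unbalancing move costs 4α in penalty.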

A caveat
• when we reach a local minimum, we need to test whether it is a feasible solution or not

The General Model

begin
  s := startState();
  for search := 1 to MaxSearches while GC do
    for trial := 1 to MaxTrials while LC do
      if satisfiable(s) then
        if value(s) < bestBound then
          bestBound := value(s);
          best := s;
      select n in neighborhood(s);
      if acceptable(n) then
        s := n;
    s := restartState(s);
end;
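A generic Python rendering of this model (a sketch: GC and LC are modeled simply as iteration limits, ‘select n’ is instantiated as taking the first acceptable neighbor, and all plugged-in functions are placeholders):

```python
def general_local_search(start_state, neighborhood, value, satisfiable,
                         acceptable, restart_state,
                         max_searches=3, max_trials=50):
    s = start_state()
    best, best_bound = None, float('inf')
    for _search in range(max_searches):
        for _trial in range(max_trials):
            if satisfiable(s) and value(s) < best_bound:
                best_bound, best = value(s), s
            for n in neighborhood(s):   # one instantiation of "select n"
                if acceptable(n, s):
                    s = n
                    break
        s = restart_state(s)
    return best, best_bound
```

Plugging in different `acceptable` and `restart_state` functions yields iterative improvement, simulated annealing, and so on.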

Simulated Annealing (Kirkpatrick et al.)
• the name comes from cooling procedures in metallurgy

Basic idea
• introduce random moves to escape local minima

Main Procedure

• select a neighbor randomly
• if it improves the objective function, take the move
• otherwise, take the move with probability e^(–Δ/T)
• parameter: T (the temperature)
• each time the search restarts, update the temperature: T := T · factor

Simulated Annealing
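The move rule above can be sketched as follows (the cooling parameters are illustrative defaults, not prescribed by the method):

```python
import math
import random

def sa_accept(delta, T):
    """Metropolis rule: always accept improving moves (delta <= 0),
    otherwise accept with probability exp(-delta / T)."""
    return delta <= 0 or random.random() < math.exp(-delta / T)

def simulated_annealing(s, neighbors, c, T=10.0, factor=0.95,
                        restarts=50, trials=50):
    best = s
    for _ in range(restarts):
        for _ in range(trials):
            n = random.choice(neighbors(s))   # select a neighbor randomly
            if sa_accept(c(n) - c(s), T):
                s = n
            if c(s) < c(best):
                best = s
        T *= factor                           # T := T * factor at each restart
    return best
```

At high T almost every move is accepted (random walk); as T drops the search behaves more and more like pure local improvement.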

Properties of Simulated Annealing

• If the neighborhood is connected, it converges toward the optimal solution • But it may take longer than exhaustive search

Observation in practice

• may be amazingly effective
• may be slow
• requires parameter tuning

Tabu Search

Tabu Search (Glover)
• introduce a list of tabu moves
• greedy approach

Main idea
• take the best move which is not tabu
• keep a list of moves which are tabu (e.g., the last 10 moves)

Advanced Ideas

• aspiration criteria: accept a tabu move anyway if it is good enough (e.g., it improves the best solution found so far)
• dynamic list
  - decrease the length for good moves
  - increase the length for bad moves

Observations
• hard to prove anything theoretically
• works amazingly well in practice

Satisfiability
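The tabu-search ideas above can be sketched as follows (assumptions: states are hashable, the tabu list stores recently visited states rather than moves, and the aspiration criterion admits a tabu state that beats the best found so far):

```python
from collections import deque

def tabu_search(s, neighbors, c, tabu_len=10, iters=100):
    best = s
    tabu = deque([s], maxlen=tabu_len)   # the last tabu_len visited states
    for _ in range(iters):
        candidates = [n for n in neighbors(s)
                      if n not in tabu or c(n) < c(best)]  # aspiration
        if not candidates:
            break
        s = min(candidates, key=c)       # best non-tabu move, even if worse
        tabu.append(s)
        if c(s) < c(best):
            best = s
    return best
```

Unlike local improvement, the search keeps moving after reaching a local minimum; the tabu list prevents it from immediately cycling back.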

Set of Clauses

C1 ∧ C2 ∧ … ∧ Cm, where each clause Ci is a disjunction of literals

Literals

• an atom a or its negation a′

The Goal
• finding an assignment of values to atoms so that all clauses are satisfied

GSAT

Kautz and Selman in 1992

Basic idea
• select the variable which, when flipped, satisfies the largest number of clauses
• if the best move so defined produces a decrease in the objective function (the number of unsatisfied clauses), take the move

Neighborhood
• 1-opt
• local non-degradation

Observe
• transforming a decision problem into an optimization problem

GSAT
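A direct, non-incremental GSAT sketch (assumed encoding: a clause is a list of nonzero integers, v for atom v and −v for its negation):

```python
import random

def n_satisfied(clauses, assign):
    # a clause is satisfied if at least one of its literals is true
    return sum(any((lit > 0) == assign[abs(lit)] for lit in clause)
               for clause in clauses)

def gsat(clauses, n_vars, max_flips=100, max_tries=10):
    for _ in range(max_tries):
        assign = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
        for _ in range(max_flips):
            if n_satisfied(clauses, assign) == len(clauses):
                return assign                   # all clauses satisfied
            def score(v):                       # clauses satisfied after flipping v
                assign[v] = not assign[v]
                sat = n_satisfied(clauses, assign)
                assign[v] = not assign[v]
                return sat
            v = max(range(1, n_vars + 1), key=score)
            assign[v] = not assign[v]           # flip the best variable
    return None
```

This recomputes the score of every flip from scratch on each step; the next slides show how to maintain it incrementally.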

How to implement GSAT efficiently?
• maintain incrementally the gain/loss produced by flipping a variable
• maintain incrementally the best variable to flip

What does that mean? For a clause

p1 ∨ … ∨ pk ∨ ¬n1 ∨ … ∨ ¬nl

maintain its number of true literals (nbtl).

A gain corresponds to a favorable flip
• when nbtl = 0 (flipping one of its variables satisfies the clause)

A loss corresponds to an unfavorable flip
• when nbtl = 1 (flipping its only true literal unsatisfies the clause)

GSAT-II: THE RETURN
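The nbtl counters let us compute each flip's gain/loss without re-evaluating every clause; a sketch (assumes no variable occurs twice in a clause):

```python
def clause_nbtl(clauses, assign):
    # number of true literals per clause (literal v is true iff assign[v];
    # literal -v is true iff not assign[v])
    return [sum((lit > 0) == assign[abs(lit)] for lit in clause)
            for clause in clauses]

def flip_delta(v, clauses, assign, nbtl):
    """Net change in satisfied clauses if v were flipped, from the counters:
    +1 for each clause with nbtl == 0 containing v (a gain),
    -1 for each clause where v holds the only true literal (a loss)."""
    delta = 0
    for idx, clause in enumerate(clauses):
        for lit in clause:
            if abs(lit) != v:
                continue
            lit_true = (lit > 0) == assign[v]
            if nbtl[idx] == 0:
                delta += 1          # unsatisfied clause becomes satisfied
            elif nbtl[idx] == 1 and lit_true:
                delta -= 1          # its only true literal becomes false
    return delta
```

After a flip, only the counters of clauses mentioning the flipped variable need updating, which is what makes GSAT fast in practice.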

Paper in 1993: add weights
1. Run GSAT.
2. If the problem is not solved, add 1 to the weight of each unsatisfied clause and go back to 1.
3. Otherwise, go for a beer.

Basic idea
• focus on the clauses which are difficult to satisfy

GSAT-III

Add noise (paper in 1994)
• if the move does not improve, take it anyway with low probability

Local Search

Starting points
• can be started from a number of existing or generated solutions

Meta-Heuristics
• iterative improvement
• variable-depth local improvement
• simulated annealing
• tabu search

Neighborhood

• defines the quality of the solution
• feasible vs. infeasible solutions

Searching the neighborhood
• first improvement
• best improvement
• random move