
From: AAAI-00 Proceedings. Copyright © 2000, AAAI (www.aaai.org). All rights reserved.

Dynamic Representations and Escaping Local Optima: Improving Genetic Algorithms and Local Search

Laura Barbulescu, Jean-Paul Watson, and L. Darrell Whitley
Computer Science Department, Colorado State University
Fort Collins, CO 80523
e-mail: {laura,watsonj,whitley}@cs.colostate.edu

Abstract

Local search algorithms often get trapped in local optima. Algorithms such as tabu search and simulated annealing 'escape' local optima by accepting non-improving moves. Another possibility is to dynamically change between representations; a local optimum under one representation may not be a local optimum under another. Shifting is a mechanism which dynamically switches between Gray code representations in order to escape local optima. Gray codes are widely used in conjunction with genetic algorithms and bit-climbing algorithms for parameter optimization problems. We present new theoretical results that substantially improve our understanding of the shifting mechanism, of the number of Gray codes accessible via shifting, and of how the neighborhood structure changes during shifting. We show that shifting can significantly improve the performance of a simple hill-climber; it can also help to improve one of the best genetic algorithms currently available.

Introduction

Given a representation and a neighborhood operator, local search methods (Aarts & Lenstra 1997) proceed by gradual manipulation of some initial solution. Because of its myopic nature, local search can become trapped in local optima. Methods such as simulated annealing and tabu search attempt to 'escape' by accepting non-improving or inferior neighbors, with the goal of moving out of the local optimum's basin of attraction. Local optima are induced by the selected representation and neighborhood operator; a local optimum under one representation may not be a local optimum under another representation. Thus, dynamic representations are a potentially important, although relatively unexplored, class of escape mechanisms.

We focus on parameter optimization. Functions are discretized so that search proceeds in a bit space, with L-bit Gray or Binary encoded function inputs; this is common with Genetic Algorithms (GAs). Empirically, Gray encodings usually perform better than Binary encodings for many real-world functions (Caruana & Schaffer 1988). Gray codes preserve the neighborhood structure of the discretized real-valued search space (Whitley et al. 1996). As a result, a Gray code can induce no more optima than exist in the original function. Further, because there are more neighbors under the Gray code (L bits for each dimension) than in the discretized real-valued function (2 for each dimension), there are usually significantly fewer optima in the Gray code's search space. In contrast, Binary codes often create new optima where none existed in the original function. Finally, many distinct Gray codes exist, each inducing a different neighborhood structure, and potentially different local optima.

These properties motivated Rana & Whitley (1997) to introduce shifting as a mechanism for changing between a restricted set of Gray codes in an effort to escape local optima. A simple hill-climbing algorithm and a state-of-the-art GA were augmented with the shifting mechanism and evaluated using test functions shown empirically to be both resistant to simple hill-climbing algorithms and challenging for GAs. We establish a new bound on the number of unique Gray codes accessible via shifting, which allows us to focus on a significantly smaller set of representations. We demonstrate that similar Gray codes induce similar search spaces; to escape optima, one must consider dissimilar Gray codes. We use these results to improve the performance of a hill-climbing algorithm to the point where it is competitive with a state-of-the-art GA.

On Dynamic Representations

For any function, there are multiple representations which make optimization trivial (Liepins & Vose 1990). However, the space of all possible representations is a larger search space than the search space of the function being optimized. Furthermore, randomly switching between representations is doomed to failure. Suppose we have N unique points in our search space and a neighborhood operator which explores k points before selecting a move. A point is considered a local optimum from a steepest-ascent perspective if its evaluation is better than each of its k neighbors. We can sort the N points in the search space to create a ranking, R = r_1, r_2, ..., r_N, in terms of their function evaluation (where r_1 is the best point in the space and r_N is the worst). Using this ranking, we can compute the probability that a point ranked in the i-th position of R is a local optimum under an arbitrary representation of the search space:

    P(i) = C(N-i, k) / C(N-1, k),    1 <= i <= (N-k)    (1)

where C(n, k) denotes the binomial coefficient "n choose k". Using this result, one can prove that the expected number of local optima over all possible representations of a search space with N points, under any neighborhood operator of size k, is sum_{i=1}^{N-k} P(i) = N/(k+1) (Whitley, Rana, & Heckendorn 1997).

These equations make it clear that highly ranked points in the search space are local optima under almost all representations. To exploit dynamic representations we cannot randomly change representations. So how do we proceed? First, we assume that most real-world optimization problems have a complexity which is less than that expected of random functions. Two measures of this complexity are smoothness and the number of local optima. We have shown empirically that many test functions and real-world problems tend to be relatively smooth compared to random functions, and that the number of induced local optima is smaller than the expected number associated with random functions (Rana & Whitley 1997).

It follows that one should use a form of dynamic representation that respects and preserves smoothness, and which bounds the number of local optima that can result from changing the problem representation. Dynamic Gray codes have these desirable properties. A Gray code induces no more optima than exist in the original function. Furthermore, the surface of the original function is a subset of the set of paths induced by the Gray encoding; hence the surface of the original function is preserved, and enhanced with greater connectivity, by the Gray code (Rana & Whitley 1997). In contrast, the standard Binary coding actually increases the number of optima for many test functions (Whitley 1999). This is consistent with the fact that, in practice, search algorithms appear to work better under Gray than under Binary encodings.

Gray Codes and Shifting

A Gray code is any integer bit encoding such that adjacent integers are Hamming distance 1 apart. The standard reflected Gray encoding of an integer is constructed by applying the exclusive-or operator to a) the standard Binary encoding of the integer and b) the same Binary encoding shifted one position to the right; the last bit is then truncated. Gray encoding and decoding can be concisely expressed through matrix operations. Let x and y be L-bit Binary-encoded and Gray-encoded integers, respectively, and let G be a transform matrix containing 1's on both the diagonal and the upper minor diagonal and 0's elsewhere. The Gray encoding and decoding processes are then simply given by x^T G and y^T G^{-1}, respectively.

We now better understand why shifting works. Reflected Gray codes form a circuit. This circuit represents the inputs to a real-valued function. In a Gray coded string of length L, there are L folds, or reflections. As the circuit is shifted, points pass by other points in different reflections. This can be seen in Figure 1 for L = 4. The 4 neighbors are North-South-East-West and the graph is a torus. At the order-3 reflection, strings differ only in the 3rd bit; this connects the North-South neighbors in rows 1 and 2, as well as rows 3 and 4. The directional arrows show that these points move in opposite directions when shifting occurs, and hence neighbors flow past each other. The order-4 reflections (where the 4th bit differs) are the North-South connections between rows 2 and 3, as well as the toroidal North-South wrap-around between rows 1 and 4. When two local optima "pass" one another in the shifted Gray encoding, one of the two optima must collapse. For example, in Figure 1, positions 4 and 9 are not neighbors in the normal integer mapping induced under the standard Gray code. However, when the integer space is shifted by 1, positions 4 and 9 become neighbors. If there are local optima at positions 4 and 9, one of these optima must collapse when the search space is shifted.

We first prove that every permutation of the columns of the Gray transform matrix G yields a transform matrix that induces an identical neighborhood structure.

Theorem 1. Let G be the L-bit standard reflected Gray transform matrix, and construct G_π by applying a permutation π to the columns of G. Let x be some integer, and let x_G and x_{G_π} be the sets of L neighboring integers under G and G_π, respectively. Then x_G = x_{G_π}.

Proof: The columns of G and G_π each independently produce a single bit of the encodings under G and G_π, respectively; viewing π as a permutation of the bits of the encoding under G, or as a permutation of the columns of G, is therefore equivalent.
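The contrast between Gray and Binary optima counts is easy to demonstrate on a small example. The sketch below is our own illustration (not code from the paper): it takes a unimodal function discretized over 0..31 and counts steepest-ascent local optima under the L = 5 bit-flip neighborhoods induced by the standard reflected Gray encoding and by the plain Binary encoding.

```python
def gray_encode(i):
    """Standard reflected Gray code: XOR the integer with itself shifted right."""
    return i ^ (i >> 1)

def gray_decode(g):
    """Invert the encoding with a running prefix XOR."""
    b = 0
    while g:
        b ^= g
        g >>= 1
    return b

def count_optima(f, bits, encode, decode):
    """Count points whose evaluation beats all `bits` Hamming-distance-1 neighbors."""
    count = 0
    for i in range(1 << bits):
        code = encode(i)
        neighbors = (decode(code ^ (1 << b)) for b in range(bits))
        if all(f(i) > f(j) for j in neighbors):
            count += 1
    return count

f = lambda i: -(i - 10) ** 2      # unimodal over 0..31, unique peak at i = 10
identity = lambda i: i            # "Binary" encoding: the integer's own bits
print(count_optima(f, 5, gray_encode, gray_decode))  # 1: Gray adds no optima
print(count_optima(f, 5, identity, identity))        # 3: Binary manufactures extra optima
```

The Gray neighborhood preserves the single true optimum exactly, while the Binary neighborhood manufactures two spurious optima at the Hamming-cliff points 7 (00111) and 16 (10000).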
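The N/(k+1) expectation can be checked empirically. In the sketch below (our own illustration, with hypothetical names), N = 30 distinct evaluations are placed uniformly at random on a ring in which every point has k = 2 neighbors; each random placement plays the role of a random representation, and the average number of steepest-ascent local optima approaches N/(k+1) = 10.

```python
import random

def count_local_optima_ring(values):
    """Count strict local maxima on a ring where each point has k = 2 neighbors."""
    n = len(values)
    return sum(
        values[i] > values[(i - 1) % n] and values[i] > values[(i + 1) % n]
        for i in range(n)
    )

def mean_optima(n=30, trials=20000, seed=0):
    """Average local-optima count over random placements (random representations)."""
    rng = random.Random(seed)
    vals = list(range(n))  # distinct evaluations; only their ranks matter
    total = 0
    for _ in range(trials):
        rng.shuffle(vals)  # one random representation
        total += count_local_optima_ring(vals)
    return total / trials

print(mean_optima())  # close to N/(k+1) = 30/3 = 10
```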
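The collapse example from Figure 1 can be reproduced directly in code. The sketch below is our own (function names are ours); following Rana & Whitley (1997), a shift adds a constant offset to each integer, modulo 2^L, before Gray encoding.

```python
def gray(i):
    """Standard reflected Gray code of integer i."""
    return i ^ (i >> 1)

def hamming(a, b):
    """Number of bit positions in which a and b differ."""
    return bin(a ^ b).count("1")

def shifted_distance(a, b, shift, bits=4):
    """Hamming distance between the Gray codes of a and b after shifting
    the integer space by a constant offset (mod 2**bits)."""
    m = (1 << bits) - 1
    return hamming(gray((a + shift) & m), gray((b + shift) & m))

print(shifted_distance(4, 9, shift=0))  # 3: not neighbors under the standard code
print(shifted_distance(4, 9, shift=1))  # 1: a shift of 1 makes them neighbors
```

Under the unshifted 4-bit code, positions 4 and 9 are three bit-flips apart; after a shift of 1 they are Hamming-distance-1 neighbors, so at most one of them can remain a local optimum.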
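Theorem 1 can also be verified exhaustively for small L. The sketch below (our own code, not from the paper) represents a transform matrix by the list of row indices holding a 1 in each column, applies every column permutation for L = 4, and checks that the neighbor set of every integer is unchanged.

```python
from itertools import permutations

L = 4

def encode(x, cols):
    """Compute x^T M over GF(2): output bit j is the XOR of x's bits in cols[j]."""
    bits_x = [(x >> (L - 1 - i)) & 1 for i in range(L)]
    y = 0
    for j, col in enumerate(cols):
        bit = 0
        for i in col:
            bit ^= bits_x[i]
        y |= bit << (L - 1 - j)
    return y

# Columns of the standard reflected Gray transform G:
# column j has 1's in rows j-1 and j (diagonal plus upper minor diagonal).
G_cols = [[0]] + [[j - 1, j] for j in range(1, L)]

def neighbors(x, cols):
    """Integers whose encodings differ from x's encoding in exactly one bit."""
    code = encode(x, cols)
    table = {encode(i, cols): i for i in range(1 << L)}  # encoding is a bijection
    return {table[code ^ (1 << b)] for b in range(L)}

# Theorem 1: permuting the columns of G leaves every neighbor set unchanged.
ok = all(
    neighbors(x, G_cols) == neighbors(x, [G_cols[p] for p in perm])
    for perm in permutations(range(L))
    for x in range(1 << L)
)
print(ok)  # True
```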