An Evolutionary Algorithm for Minimizing Multimodal Functions


D.G. Sotiropoulos, V.P. Plagianakos, and M.N. Vrahatis
Department of Mathematics, University of Patras, GR-261.10 Patras, Greece.
University of Patras Artificial Intelligence Research Center (UPAIRC).
e-mail: {dgs,vpp,vrahatis}@math.upatras.gr

Abstract—A new evolutionary algorithm for the global optimization of multimodal functions is presented. The algorithm is essentially a parallel direct search method which maintains a population of individuals and utilizes an evolution operator to evolve them. This operator has two functions: firstly, to exploit the search space as much as possible, and secondly, to form an improved population for the next generation. This operator makes the algorithm converge rapidly to the global optimum of various difficult multimodal test functions.

Index Terms—Global optimization, derivative free methods, evolutionary algorithms, direct search methods.

I. Introduction

In this contribution we propose a new evolutionary algorithm for the global minimization of nonlinear and non-differentiable continuous functions. More formally, we consider the following general global optimization problem:

    f* = min_{x ∈ D} f(x),

where f : D → R is a continuous function and the compact set D ⊂ R^n is an n-dimensional parallelepiped. Often, analytic derivatives of f are unavailable, particularly in experimental settings where evaluating the function means measuring some physical or chemical quantity or performing an experiment. Inevitably, noise contaminates the function values and, as a consequence, finite difference approximation of the gradient is difficult, if not impossible, the procedure to estimate the gradient fails, and methods that require derivatives are precluded. In such cases direct search approaches are the methods of choice.

If the objective function is unimodal in the search domain, one can choose among many good alternatives. Some of the available minimization methods involve only function values, such as those of Hooke and Jeeves [2] and Nelder and Mead [5]. However, if the objective function is multimodal, the number of available algorithms is reduced to very few. In this case, the algorithms quoted above tend to stop at the first minimum encountered and cannot easily be used for finding the global one.

Search algorithms which enhance the exploration of the feasible region through the use of a population, such as Genetic Algorithms (see [4]), and probabilistic search techniques, such as Simulated Annealing (see [1, 7]), have been the object of many publications. In analogy with the behavior of natural organisms, random search algorithms have often been called evolutionary methods. The generation of a new trial point corresponds to mutation, while a step toward the minimum can be viewed as selection.

In the present work we propose an evolutionary algorithm in which we enhance the exploration of the feasible region by utilizing a specially designed evolution operator, called the Simplex Operator, based on the idea of the simplex search method. It starts with a specific number of N-dimensional candidate solutions, as an initial population, and evolves them over time, in such a way that each new iteration of the algorithm gives a better population.

The remainder of the paper is organized as follows: in Section 2 we present our Simplex Evolution (SE) algorithm for minimizing multimodal functions. The algorithm has been tested against the Annealed Nelder and Mead simplex method, and the corresponding results are presented in Section 3. The final section contains concluding remarks and a short discussion of further work. In the appendix we list the problems used in our tests, taken from Levy et al. [3].
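As a minimal illustration of this setting (a sketch in Python; the helper name initial_population, the bounds, and the population size below are ours, not the paper's), the box domain D and a randomly drawn initial population can be set up as follows:

```python
import numpy as np

def initial_population(lower, upper, pop_size, rng=None):
    """Sample pop_size candidate solutions uniformly inside the box D."""
    rng = np.random.default_rng() if rng is None else rng
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    return lower + rng.random((pop_size, lower.size)) * (upper - lower)

# Example (hypothetical values): a 2-dimensional box D = [-10, 10]^2, 20 points.
population = initial_population([-10.0, -10.0], [10.0, 10.0], pop_size=20)
```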
II. The Simplex Evolution

Simplex Evolution is based on the collective learning process within a population of individuals, each of which represents a search point in the space of potential solutions to a given problem. The population is chosen randomly and should try to cover the entire search space D; it evolves towards better and better regions of the search space by means of deterministic steps that use randomly selected individuals. The number of candidate solutions does not change during the minimization process. At each iteration, called a generation, the evolution of the population occurs through the use of an operator called mutation.

The key idea is to construct a mechanism which, at each generation, takes each individual of the population and replaces it with a better one for the next generation. To this end, we have specially redesigned the traditional mutation operator in order to enhance the exploitation of the search space, in such a manner that it typically avoids getting stuck at local minimum points.

The proposed mutation operator, called the Simplex Operator, is based on a simplex, which is a geometrical figure consisting, in N dimensions, of N + 1 points and all interconnecting lines and faces. In general we are interested only in simplexes that are nondegenerate, i.e., that enclose a finite inner N-dimensional volume. For each candidate solution, called the BasePoint, of the current generation, we choose N other candidates and form a simplex. If we assume that the BasePoint of a nondegenerate simplex is taken as the origin, then the N other points define vector directions that span the N-dimensional vector space.

A. The Simplex Operator

The Simplex Operator is a specially designed evolution operator with deterministic steps that enhances the exploitation of the search space. This operator amounts to replacing a single individual, as a description of the system state, by a simplex of N + 1 points. In practice, we are only interested in nondegenerate simplexes. The moves of the simplex are the same as described in [5, 6], namely reflection, expansion, and contraction. The least preferred point of the simplex is reflected through, or contracted towards, the center of the opposite face of the simplex. In fact, this operator performs only one cycle of the simplex method, as shown in the following algorithm, which gives the basic steps of the operator:

SimplexOperator(BasePoint x1)
 1: Select points xi, where i = 2, ..., N + 1;
 2: Construct the simplex S = ⟨x1, x2, ..., xN+1⟩;
 3: if S is degenerate go to 1;
 4: Rank the xi based on the function values;
 5: Reflect(S) → xr;
 6: if f(xr) < f(x1) then
 7:     Expand(S) → xe;
 8:     if f(xe) < f(xr) then
 9:         return the expansion point xe;
10:     else
11:         return the reflection point xr;
12:     end if
13: else
14:     Contract(S) → xc;
15:     if f(xc) < f(xr) then
16:         return the contraction point xc;
17:     else
18:         return the best vertex of the simplex S;
19:     end if
20: end if

In line 3 we check whether the simplex S is degenerate with respect to the function values of its vertices. Specifically, we check whether the standard deviation of the function values at the N + 1 vertices of the current simplex is smaller than some prescribed small quantity ε, i.e.

    σ = { Σ_{i=1}^{N+1} [f(xi) − f(x0)]^2 / (N + 1) }^{1/2} ≤ ε,        (1)

where x0 is the centroid of the simplex S. In the implementation of the algorithm we have used ε = 10^{-15}. There is no danger of an infinite loop between lines 1 and 3, where the simplex S is examined for degeneracy, since in the SE algorithm (see the next subsection) Relation (1) is enforced as a termination criterion for the entire population. Therefore, we can always form at least one nondegenerate simplex. Using (1) we avoid unnecessary function evaluations.
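The paper specifies the operator only through the pseudocode and the degeneracy test above; the following Python sketch is our own reading of it. The reflection, expansion, and contraction coefficients (1, 2, 1/2), the choice of reflecting the worst vertex through the centroid of the opposite face, and the helper name simplex_operator are assumptions borrowed from the standard simplex moves; the domain constraint discussed next is omitted here.

```python
import numpy as np

def simplex_operator(f, base_point, population, eps=1e-15, rng=None):
    """One cycle of the Simplex Operator for a single BasePoint (a sketch).

    `population` is assumed to be a NumPy array of shape (pop_size, N).
    Coefficients 1.0 / 2.0 / 0.5 for reflection / expansion / contraction
    are the usual simplex-method values, assumed here.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = base_point.size

    # Steps 1-3: draw N other individuals until the simplex is nondegenerate,
    # judged by the standard deviation of the function values, cf. Relation (1).
    # (In SE this loop cannot run forever, because Relation (1) over the whole
    # population is itself a termination criterion.)
    while True:
        idx = rng.choice(len(population), size=n, replace=False)
        simplex = np.vstack([base_point, population[idx]])   # S = <x1, ..., xN+1>
        f_vals = np.array([f(x) for x in simplex])
        sigma = np.sqrt(np.mean((f_vals - f(simplex.mean(axis=0))) ** 2))
        if sigma > eps:
            break

    f_base = f_vals[0]                                # f(x1), the BasePoint

    # Step 4: rank the vertices by function value (best vertex first).
    simplex = simplex[np.argsort(f_vals)]
    worst = simplex[-1]
    centroid = simplex[:-1].mean(axis=0)              # centre of the opposite face

    # Step 5: reflect the worst vertex through the centroid.
    x_r = centroid + 1.0 * (centroid - worst)
    f_r = f(x_r)

    if f_r < f_base:                                  # step 6
        x_e = centroid + 2.0 * (centroid - worst)     # step 7: expansion
        return x_e if f(x_e) < f_r else x_r           # steps 8-12
    x_c = centroid + 0.5 * (worst - centroid)         # step 14: contraction
    return x_c if f(x_c) < f_r else simplex[0]        # steps 15-19: best vertex of S
```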
To guarantee convergence within the feasible region D, we have constrained the reflection and/or expansion operations (lines 5 and 7) in such a way that the returned point is always in D. If the returned point falls outside the feasible region D, we subdivide the prefixed values of the reflection and/or expansion constants until the new point is included in D.

If the Simplex Operator fails to improve the base point x1 during the reflection, expansion, or contraction operations, then we return the best vertex of the simplex S. The Simplex Operator has the selection step embedded (in lines 9, 11, 16, and 18), as it always returns a better individual to replace the BasePoint.

Subsequently, for this simplex we perform two of the three possible trial steps: a reflection, an expansion, or a contraction step, in order to approach the minimum by moving away from high values of the objective function, without moving out of the search domain. The current point is replaced in the next generation by the resulting mutated one. This last operation is called selection and produces an improved population for the next generation.

Classical termination criteria for evolution algorithms are: a) a maximum number of generations, and b) a maximum number of function evaluations. Obviously, if the global minimum value is known, then an efficient termination criterion can be defined: the method is stopped when it finds the minimizer within a predetermined accuracy. This happens, for example, in the case of non-linear least squares.

In addition to the previous ones, we can enforce a further termination criterion, since SE is based on the simplex method. This criterion checks at each generation whether the entire population has converged, by examining Relation (1). If the standard deviation of the function values is smaller than ε, this is a strong indication that further search is impossible, since we cannot form simplexes big enough to exploit the search space. Therefore, we can stop the algorithm, or possibly, rank ...
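To make this population-level use of Relation (1) concrete, the short sketch below (the helper names sigma and population_converged are ours; the paper specifies only the formula and ε = 10^{-15}) computes σ for a set of points and applies it as the convergence test:

```python
import numpy as np

def sigma(f, points):
    """Standard deviation of the function values at `points`, measured
    relative to the value at their centroid x0, as in Relation (1)."""
    points = np.asarray(points, dtype=float)
    x0 = points.mean(axis=0)                      # centroid
    f_vals = np.array([f(x) for x in points])
    return np.sqrt(np.mean((f_vals - f(x0)) ** 2))

def population_converged(f, population, eps=1e-15):
    """SE's additional termination test: stop once sigma over the whole
    population drops below eps (the paper uses eps = 1e-15)."""
    return sigma(f, population) <= eps
```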
