
Evolution Strategy Programming (ESP)

EBADA A. SARHAN, IRAKY H. KHALIFA, MOHAMED S. EMAM
Faculty of Computers & Information, Helwan University
Faculty of Computers & Information, Zagazig University
Presented by: Mohamed S. Emam

Abstract: Evolutionary Algorithms are search algorithms based on the Darwinian metaphor of "Natural Selection". Typically these algorithms maintain a population of individual solutions, each of which has a fitness attached to it, which in some way reflects the quality of the solution. The search proceeds via the iterative generation, evaluation, and possible replacement of solutions. This paper presents a new self-adaptive Evolutionary Algorithms technique called Evolution Strategy Programming (ESP), which is a combination of Evolution Strategy (ES) and Evolutionary Programming (EP). Evolutionary Algorithms rely on two genetic operators, crossover and mutation. In the case of real-parameter representation (ES & EP), experiments show that mutation is more powerful than crossover, so we concentrate our work on mutation. ESP performs the mutation process by the same method as Evolution Strategies, but with an extra rule called the random adaptation rule. In this paper we perform a comparison between the standard (µ,λ)-Evolution Strategy and Evolution Strategy Programming (ESP) on a highly multimodal function (the function after Fletcher and Powell).

Key-Words: Evolutionary-Algorithms, Evolution-Strategy, Evolutionary-Programming, Evolution-Strategy-Programming, Mutation, Random-Adaptation.

1. Introduction

As the history of the field suggests, there are many different variants of Evolutionary Algorithms. The common underlying idea behind all these techniques is the same: given a population of individuals, the environmental pressure causes natural selection (survival of the fittest), and this causes a rise in the fitness of the population. Given a quality function to be maximized, we can randomly create a set of candidate solutions, i.e., elements of the function's domain, and apply the quality function as an abstract fitness measure (the higher the better). Based on this fitness, some of the better candidates are chosen to seed the next generation by applying recombination and/or mutation to them. Recombination is an operator applied to two or more selected candidates (the so-called parents) and results in one or more new candidates (the children). Mutation is applied to one candidate and results in one new candidate. Executing recombination and mutation leads to a set of new candidates (the offspring) that compete, based on their fitness (and possibly age), with the old ones for a place in the next generation. This process can be iterated until a candidate with sufficient quality (a solution) is found or a previously set computational limit is reached [2].

In this process there are two fundamental forces that form the basis of evolutionary systems:
• Variation operators (recombination and mutation) create the necessary diversity and thereby facilitate novelty, while
• selection acts as a force pushing quality.

The combined application of variation and selection generally leads to improving fitness values in consecutive populations. It is easy (although somewhat misleading) to see such a process as if evolution is optimizing, or at least "approximizing", by approaching optimal values closer and closer over its course. Alternatively, evolution is often seen as a process of adaptation. From this perspective, the fitness is not seen as an objective function to be optimized, but as an expression of environmental requirements. Matching these requirements more closely implies an increased viability, reflected in a higher number of offspring. The evolutionary process makes the population adapt to the environment better and better [3].

Let us note that many components of such an evolutionary process are stochastic. During selection, fitter individuals have a higher chance of being selected than less fit ones, but typically even the weak individuals have a chance to become a parent or to survive. For recombination of individuals, the choice of which pieces will be recombined is random. Similarly for mutation, the pieces that will be mutated within a candidate solution, and the new pieces replacing them, are chosen randomly. The general scheme of an Evolutionary Algorithm is given in Figure 1 in a pseudo-code fashion; Figure 2 shows a diagram.

BEGIN
  INITIALISE population with random candidate solutions;
  EVALUATE each candidate;
  REPEAT
    1 SELECT parents;
    2 RECOMBINE pairs of parents;
    3 MUTATE the resulting offspring;
    4 EVALUATE new candidates;
    5 SELECT individuals for the next generation;
  UNTIL ( TERMINATION CONDITION is satisfied )
END

Fig. 1 The general scheme of an Evolutionary Algorithm in pseudo-code

Fig. 2 The general scheme of an Evolutionary Algorithm as a flow-chart (components: Initialization, Population, Parent selection, Parents, Recombination, Mutation, Offspring, Survivor selection, Termination)
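To make the scheme of Figure 1 concrete, the following is a minimal sketch in Python of the generic loop for a real-valued maximization problem. It is not the algorithm studied in this paper: the tournament parent selection, arithmetic crossover, Gaussian mutation, and all numeric parameters (population size, number of generations, mutation step, crossover probability) are illustrative assumptions.

import random

def evolutionary_algorithm(fitness, n_dims, pop_size=30, generations=100,
                           p_crossover=0.7, mutation_sigma=0.1):
    """Minimal generic EA loop following Fig. 1 (maximization of `fitness`)."""
    # INITIALISE population with random candidate solutions in [-1, 1]^n
    population = [[random.uniform(-1.0, 1.0) for _ in range(n_dims)]
                  for _ in range(pop_size)]
    # EVALUATE each candidate
    scores = [fitness(ind) for ind in population]

    for _ in range(generations):                      # REPEAT ... UNTIL limit reached
        offspring = []
        while len(offspring) < pop_size:
            # SELECT parents (binary tournament on fitness)
            p1 = max(random.sample(range(pop_size), 2), key=lambda i: scores[i])
            p2 = max(random.sample(range(pop_size), 2), key=lambda i: scores[i])
            # RECOMBINE pairs of parents (arithmetic crossover)
            if random.random() < p_crossover:
                child = [(a + b) / 2.0 for a, b in zip(population[p1], population[p2])]
            else:
                child = list(population[p1])
            # MUTATE the resulting offspring (Gaussian perturbation)
            child = [x + random.gauss(0.0, mutation_sigma) for x in child]
            offspring.append(child)
        # EVALUATE new candidates
        offspring_scores = [fitness(ind) for ind in offspring]
        # SELECT individuals for the next generation (best of parents + offspring)
        pool = list(zip(scores + offspring_scores, population + offspring))
        pool.sort(key=lambda t: t[0], reverse=True)
        scores = [s for s, _ in pool[:pop_size]]
        population = [ind for _, ind in pool[:pop_size]]

    best = max(range(pop_size), key=lambda i: scores[i])
    return population[best], scores[best]

# Example usage on a toy maximization problem (negated sphere function)
best_x, best_f = evolutionary_algorithm(lambda x: -sum(v * v for v in x), n_dims=5)

The survivor selection used here keeps the best of parents plus offspring; Sections 2 and 4 discuss why a plus or a comma scheme is chosen in the algorithms compared in this paper.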
2. Evolution Strategy

Evolution strategies are similar to genetic algorithms in that both attempt to find a (near-)optimal solution to a problem within a search space (all possible solutions to a problem) without exhaustively testing all solutions. Evolution strategies are a joint development of Rechenberg, Bienert, and Schwefel, who did preliminary work in this area in the 1960s at the Technical University of Berlin (TUB) in Germany.

Evolution strategies tend to be used for empirical experiments that are difficult to model mathematically. In this case, the system to be optimized is actually constructed. Evolution strategies are based on the principle of strong causality, which states that similar causes have similar effects. That is, a slight change to the encoding of a problem only slightly changes its optimality [5].

The (µ+λ)-ES and (µ,λ)-ES

The (µ+λ)-ES and (µ,λ)-ES were introduced by Schwefel. As we mentioned above, the (µ+λ)-ES is a natural extension of the multimembered evolution strategy (µ+1)-ES, in which µ individuals produce λ offspring. The new (temporary) population of (µ+λ) individuals is then reduced by the selection process again to µ individuals. In the (µ,λ)-ES, on the other hand, the µ individuals produce λ offspring (consequently, λ > µ is necessary) and the selection process selects a new population of µ individuals from the set of λ offspring only [5].

Currently, the (µ,λ)-ES characterizes the state of the art in Evolution Strategy research and is therefore the strategy used in this study. As an introductory remark it should be noted that the major quality of this strategy is seen in its ability to incorporate the most important parameters of the strategy (standard deviations and correlation coefficients of normally distributed mutations) into the search process, such that optimization not only takes place on object variables, but also on strategy parameters [4].
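The sketch below illustrates one generation of such a self-adaptive (µ,λ)-ES in Python, assuming an objective function to be minimized. The log-normal update of the step sizes with learning rates tau and tau' is the standard uncorrelated self-adaptation scheme rather than a formula quoted from this paper, recombination is omitted for brevity, and all function and parameter names are placeholders.

import math
import random

def es_mu_comma_lambda_step(parents, fitness, lam):
    """One generation of a self-adaptive (µ,λ)-ES (minimization).

    Each individual is a pair (x, sigma): object variables and per-coordinate
    standard deviations. The step sizes are mutated log-normally before the
    object variables, and the best µ of the λ offspring become the next
    parents (comma selection: the parents themselves never survive).
    """
    mu = len(parents)
    n = len(parents[0][0])
    tau_prime = 1.0 / math.sqrt(2.0 * n)           # global learning rate
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))      # coordinate-wise learning rate
    eps = 1e-12                                    # keep step sizes positive

    offspring = []
    for _ in range(lam):
        x, sigma = random.choice(parents)          # pick a parent at random
        g = random.gauss(0.0, 1.0)                 # sample shared by all coordinates
        new_sigma = [max(s * math.exp(tau_prime * g + tau * random.gauss(0.0, 1.0)), eps)
                     for s in sigma]
        new_x = [xi + si * random.gauss(0.0, 1.0)
                 for xi, si in zip(x, new_sigma)]
        offspring.append((new_x, new_sigma))

    # (µ,λ) survivor selection: best µ offspring only (requires lam > mu)
    offspring.sort(key=lambda ind: fitness(ind[0]))
    return offspring[:mu]

Feeding the returned list back in as the next call's parents reproduces the (µ,λ) loop; a plus strategy would instead sort parents and offspring together and keep the best µ of the union.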
3. Evolutionary Programming

The original Evolutionary Programming method used uniform random mutations on discrete underlying alphabets and, in its most elaborated form, a (µ+µ)-selection mechanism. Following on from the initial work of L. J. Fogel [Fog62, FOW66], this approach remained greatly underused for approximately thirty years.

Then, in the late 1980s, D. B. Fogel (his son) extended Evolutionary Programming for applications involving continuous parameter optimization problems. Evolutionary Programming for continuous parameter optimization has many similarities with Evolution Strategies: mutations are normally distributed and, what is more interesting, the more elaborated versions of Evolutionary Programming incorporate the variances of mutations into the genotype (meta-EP), thus facilitating the self-adaptation of these parameters [3].

4. Evolution Strategy Programming

ESP is a new self-adaptive Evolutionary Algorithms technique that combines the advantages of Evolution Strategy and Evolutionary Programming. In Evolution Strategies we have two selection methods, plus and comma ((µ+λ) and (µ,λ)). The advantage of the plus method lies in its elitist process, which preserves the best individual during the evolution process, since the best µ individuals out of the union of parents and offspring survive. However, each individual consists of two parts, object parameters and strategy parameters, so when we preserve the object parameters we also preserve the strategy parameters, which is not useful for self-adaptation during the evolution process. The comma method performs the inverse of the plus method: here the best individuals may be lost, since the best µ offspring individuals form the next parent generation (consequently, λ > µ is necessary), so the process of self-adaptation works well.

In Evolution Strategy Programming we use the (µ+µ) method to preserve the best individuals during the evolution process, and at the same time we adapt the strategy parameters by refreshing the strategy part of all the individuals: if the evolution process stops at stationary fixed points for a certain number of generations, new random values (from intervals smaller than those used at the beginning) are assigned to the strategy parameters of all the individuals in the population.

4.1 ESP Process

The process of Evolution Strategy Programming is the same as the Evolution Strategies (ES) and Evolutionary Programming (EP) process. The mutation calculations were taken from ES, selection was taken from the (µ+λ)-ES, and there is no recombination, as in EP. The remaining operations are essentially the same for both methods. The extra rule added here is the random adaptation rule, which occurs within the mutation method during the evolution process.

Each population member of the ESP was composed of two n-dimensional vectors. The first was the object variable vector x ∈ ℜⁿ and the second was a corresponding standard deviation vector σ ∈ ℜⁿ used in mutation. Thus each member a was constructed as a = (x, σ). The population at any given time step t is denoted as P(t).

1) An initial population P(0) of µ members is generated, consisting of aᵢ = (xᵢ, σᵢ).

When the random adaptation rule detects that the evolution process has stagnated at stationary fixed points (as described above), c is set to c/2 and all the strategy parameter values of all the individuals in the population are refreshed, i.e. each σᵢⱼ of the vector σᵢ is given a new random value from the interval [0, c], ∀j ∈ {1, …, n} and ∀i ∈ {1, …, µ}; otherwise nothing more is done.
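The following is a short Python sketch of that random adaptation rule, assuming minimization. Only the halving of c and the re-drawing of every σᵢⱼ from [0, c] follow the description above; the concrete stagnation test (no improvement of the best fitness for a fixed number of generations), the patience value, the uniform distribution used for the re-draw, and all names are illustrative assumptions rather than details taken from the paper.

import random

def random_adaptation(population, c, best_history, patience=20):
    """Apply ESP's random adaptation rule to a population of (x, sigma) pairs.

    population   : list of members a_i = (x_i, sigma_i), sigma_i a list of floats
    c            : current upper bound of the refresh interval [0, c]
    best_history : best fitness value recorded at each generation (minimization)
    """
    # Illustrative stagnation test: the last `patience` generations brought
    # no improvement over the best value found before them.
    stagnated = (len(best_history) > patience and
                 min(best_history[-patience:]) >= min(best_history[:-patience]))
    if stagnated:
        c = c / 2.0                                # shrink the refresh interval
        for _x, sigma in population:               # every individual in the population
            for j in range(len(sigma)):
                sigma[j] = random.uniform(0.0, c)  # new sigma_ij drawn from [0, c]
    # otherwise do nothing more
    return population, c

In an ESP generation this would be called after mutation and (µ+µ) survivor selection, with the running list of best fitness values passed in, so that the strategy parameters are re-seeded from an ever smaller interval each time the search stalls.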