
Adapting Operator Settings in Genetic Algorithms

Andrew Tuson and Peter Ross
Department of Artificial Intelligence, University of Edinburgh
80 South Bridge, Edinburgh EH1 1HN, UK
Email: {andrewt,peter}@aisb.ed.ac.uk

Key Words: co-evolution, operator adaptation, COBRA, self-adaptation, operator settings

Abstract

In the vast majority of genetic algorithm implementations, the operator settings are fixed throughout a given run. However, it has sometimes been argued that these settings should vary over the course of a genetic algorithm run, so as to account for changes in the ability of the operators to produce children of increased fitness. This paper describes an empirical investigation into this question. The effect upon genetic algorithm performance of two adaptation methods, upon both well-studied theoretical problems and a hard problem from Operations Research (the flowshop sequencing problem), is examined. The results obtained indicate that the applicability of operator adaptation is problem-dependent.
Introduction

It has long been acknowledged that the choice of operator settings has a significant impact upon genetic algorithm performance. However, finding a good choice is somewhat of a black art. The appropriate settings depend upon the other components of the genetic algorithm, such as the population model, the problem to be solved, its representation, and the operators used. The large number of possibilities precludes an exhaustive search of the space of operator probabilities.

The above also ignores the case for varying operator settings. There is evidence, both empirical and theoretical, that the most effective operator settings do vary during the course of a genetic algorithm run. For instance, Davis (Davis) advocates the use of a time-varying schedule of operator probabilities, and finds that performance is improved, especially when a large number of operators are used. Theoretical work (Mühlenbein) has analysed the mutation operator for a few binary-coded problems and concluded that the mutation parameter should be decreased the nearer to the optimum the genetic algorithm is. Time dependency was also discovered for mutation parameters by Hesser and Männer (Hesser and Männer).

The problem lies in devising such a schedule; this is harder than finding a good set of static operator settings. It may be advantageous, therefore, to employ a method that dynamically adjusts the operator probabilities according to a measure of the performance of each operator.

So by what criterion should we judge the performance of an operator at a given point? The ability of an operator to produce new, preferably fitter, children may be what is required. This has been suggested before by Spears and De Jong, but the emphasis here is on the potential of an operator to produce children of increased fitness (operator productivity). Clearly this is necessary for optimisation to progress; the aim of a genetic algorithm is, after all, to uncover new, fitter points in the search space. In
fact, the overall performance of a genetic algorithm depends upon it maintaining an acceptable level of productivity throughout the search.

This concept is based upon work by Altenberg (Altenberg), which introduced the somewhat more general concept of evolvability: the ability of the operator/representation scheme to produce offspring that are fitter than their parents. This idea has some support from the work by Mühlenbein discussed above, which tried to derive an optimal value for the mutation parameter so as to maximise the probability of an improvement being made. This paper describes an investigation of some of these methods.

An Overview of Operator Adaptation

Considering the argument made above, the purpose of dynamic operator adaptation is to exploit information, gained either implicitly or explicitly, regarding the current ability of each operator to produce children of improved fitness. Other methods do exist that adjust operator settings based on other criteria, such as the diversity of the population (for example, Coyne and Paton), but these will not be considered in this paper.

Adaptation methods can be divided into two classes (for a review of work on adapting operator settings, see Tuson):

1. The direct encoding of operator probabilities into each member of the population, allowing them to co-evolve with the solution.
2. The use of a learning rule to adapt operator probabilities according to the quality of solutions generated by each operator.

The following terminology will be used. Each operator available to the genetic algorithm has a probability of being fired: an operator probability. This study makes a distinction between this and any parameters associated with a given operator, henceforth an operator parameter. For example, a genetic algorithm could fire uniform crossover a given fraction of the time (its operator probability) alongside mutation, with the mutation operator possessing a bitwise mutation rate (an operator parameter). The term operator setting will be taken to mean both of the terms above.

Adaptation by Coevolution

Operator adaptation methods based on the co-evolutionary metaphor, also referred to as self-adaptation, encode the operator settings onto each member of the population and allow them to evolve. The rationale behind this is as follows: solutions whose encoded operator settings tend to produce fitter children will survive longer, and so the useful operator settings will spread through the population.

The original work in this area originated from the Evolution Strategy community (see Bäck for a review). The mutation operator in such algorithms involves changing each gene by a value taken from a Gaussian distribution, with a standard deviation described by a gene elsewhere on the chromosome; this parameterises the amount of disruption that mutation produces when creating a child. These operator parameters are allowed to evolve to suitable values. Work extending this to adapting the mutation parameter for more conventional genetic algorithm implementations has reported some success (Bäck), in the sense that the mutation parameter was seen to adapt to the theoretical optimum for the theoretically tractable simple problem being considered.

In the study in this paper, the operator probabilities are encoded as floating-point numbers in the range 0 to 1, with the constraint that the sum of the operator probabilities must equal one. An example is given in Figure 1. Operator parameters were encoded in a similar fashion, but without the constraint. The meaning of a particular operator parameter depends upon the operator it is associated with. For example, in the case of parameterised uniform crossover, the encoded parameter is taken to be the probability that a given gene in a child is from the second parent. In the case of binary mutation, the parameter is scaled to a bitwise mutation probability ranging from zero up to a fixed multiple of 1/l, where l is the length of the binary string. This upper value was selected as it is
appreciably larger than the 1/l which several authors agree is the optimal value for some problems (e.g. Mühlenbein and Schlierkamp-Voosen). The operation process is then as follows:

1. Select a parent on the basis of fitness.
2. Extract the encoded operator probabilities from the parent.
3. Use these operator probabilities to determine stochastically which operator is used on the solution part of the string.
4. Apply the chosen operator to the solution part of the parent.
5. Determine stochastically, using a coevolution crossover probability, the operator (coevolution crossover or mutation) that will be applied to the encoded operator settings. Then apply it.
6. Renormalise so that the encoded operator probabilities sum to one (renormalisation is not required for the encoded operator parameters).

[Figure 1: Representing Operator Probabilities. Part of a string encoding a candidate solution (bits 1 0 0 1 0 0 1 1 0 1 0) followed by the encoded operator probabilities 0.1, 0.5, 0.4.]

Each of the experiments below can undergo coevolution on two levels. One approach uses an externally set coevolution crossover probability, which sets the probability of using the coevolution crossover operator. A higher-level approach encodes this onto the string also; this is often termed meta-learning. The experiments performed are described below.

Two types of coevolution operators were investigated. The first type were strongly disruptive; in other words, children tend to be quite different from their parents. This is to see if the choice of coevolution operators is important for operator adaptation. The crossover
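The operator-selection and renormalisation steps above can be sketched in Python. This is a minimal illustration rather than the authors' implementation: the function names (`select_operator`, `renormalise`, `coevolve_probs`) and the Gaussian perturbation used here as a stand-in for the coevolution mutation operator are assumptions for the sake of the example.

```python
import random

def select_operator(probs, operators):
    """Stochastically choose an operator using the probabilities
    encoded on the parent (roulette-wheel selection, step 3)."""
    r = random.random()
    cumulative = 0.0
    for p, op in zip(probs, operators):
        cumulative += p
        if r < cumulative:
            return op
    return operators[-1]  # guard against floating-point round-off

def renormalise(probs):
    """Rescale the encoded operator probabilities so they sum to one (step 6)."""
    total = sum(probs)
    if total == 0.0:
        # Degenerate case: reset to a uniform distribution.
        return [1.0 / len(probs)] * len(probs)
    return [p / total for p in probs]

def coevolve_probs(probs, sigma=0.1):
    """Perturb the encoded probabilities (an assumed Gaussian-style
    coevolution mutation), clamp to [0, 1], then renormalise (steps 5-6)."""
    perturbed = [min(1.0, max(0.0, p + random.gauss(0.0, sigma)))
                 for p in probs]
    return renormalise(perturbed)
```

After each perturbation the renormalisation keeps the encoded values interpretable as a probability distribution over the available operators, which is what allows them to be used directly for roulette-wheel operator selection on the next application.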