Adapting Operator Settings in Genetic Algorithms

Andrew Tuson and Peter Ross

Department of Artificial Intelligence, University of Edinburgh
South Bridge, Edinburgh EH1 1HN, UK

Key words: coevolution, operator adaptation, COBRA, self-adaptation, operator settings

Abstract

In the vast majority of genetic algorithm implementations, the operator settings are fixed throughout a given run. However, it has sometimes been argued that these settings should vary over the course of a genetic algorithm run, so as to account for changes in the ability of the operators to produce children of increased fitness. This paper describes an empirical investigation into this question. The effect upon genetic algorithm performance of two adaptation methods, upon both well-studied theoretical problems and a hard problem from Operations Research (the flowshop sequencing problem), is examined. The results obtained indicate that the applicability of operator adaptation is problem-dependent.


Introduction

It has long been acknowledged that the choice of operator settings has a significant impact upon genetic algorithm performance. However, finding a good choice is somewhat of a black art. The appropriate settings depend upon the other components of the genetic algorithm, such as the population model, the problem to be solved, its representation, and the operators used. The large number of possibilities precludes an exhaustive search of the space of operator probabilities.

The above also ignores the case for varying operator settings. There is evidence, both empirical and theoretical, that the most effective operator settings do vary during the course of a genetic algorithm run. For instance, Davis advocates the use of a time-varying schedule of operator probabilities, and finds that performance is improved, especially when a large number of operators are used.

On the theoretical side, Mühlenbein has analysed the mutation operator for a few binary-coded problems and concluded that the mutation parameter should be decreased the nearer to the optimum the genetic algorithm is. Time-dependency was also discovered for mutation parameters by Hesser and Männer.

The problem lies in devising such a schedule: this is harder than finding a good set of static operator settings. It may be advantageous, therefore, to employ a method that dynamically adjusts the operator probabilities according to a measure of the performance of each operator.

So by what criterion should we judge the performance of an operator at a given point? The ability of an operator to produce new, preferably fitter, children may be what is required; this has been suggested before by Spears and De Jong, but the emphasis here is on the potential of an operator to produce children of increased fitness (operator productivity). Clearly this is necessary for optimisation to progress: the aim of a genetic algorithm is, after all, to uncover new, fitter points in the search space. In fact, the overall performance of a genetic algorithm depends upon it maintaining an acceptable level of productivity throughout the search.
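The notion of operator productivity used throughout this paper can be stated concretely. The following is a minimal sketch, assuming productivity is measured as the mean fitness gain over only those children that improved on their parents within a measurement interval (the function name and record format are illustrative, not taken from the paper):

```python
def productivity(parent_child_pairs):
    """Mean fitness improvement over the improving children only.

    `parent_child_pairs` is a list of (parent_fitness, child_fitness)
    tuples gathered over a measurement interval; pairs where the child
    is no fitter than its parent contribute nothing.
    """
    gains = [child - parent
             for parent, child in parent_child_pairs
             if child > parent]
    return sum(gains) / len(gains) if gains else 0.0
```

An operator that frequently produces fitter children scores highly; an operator that only ever produces equal or worse children scores zero.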

This concept is based upon work by Altenberg, which introduced the somewhat more general concept of evolvability: the ability of the operator/representation scheme to produce offspring that are fitter than their parents. This idea has some support from the work by Mühlenbein discussed above, which tried to derive an optimal value for the mutation parameter so as to maximise the probability of an improvement being made.

This paper describes an investigation of some of these methods.

An Overview of Operator Adaptation

Considering the argument made above, the purpose of dynamic operator adaptation is to exploit information gained, either implicitly or explicitly, regarding the current ability of each operator to produce children of improved fitness. Other methods do exist that adjust operator settings based on other criteria, such as the diversity of the population (for example, Coyne and Paton), but these will not be considered in this paper.

Adaptation methods can be divided into two classes (for a review of work on adapting operator settings, see Tuson):

- The direct encoding of operator probabilities into each member of the population, allowing them to coevolve with the solution.

- The use of a learning-rule to adapt operator probabilities according to the quality of solutions generated by each operator.

The following terminology will be used. Each operator available to the genetic algorithm has a probability of being fired: an operator probability. This study makes a distinction between this and any parameters associated with a given operator, henceforth an operator parameter. For example, a genetic algorithm could use uniform crossover some proportion of the time (an operator probability) along with mutation the rest of the time, with the mutation operator possessing a given bitwise mutation rate (an operator parameter). The term operator setting will be taken to mean both of the terms above.

Adaptation by Coevolution

Operator adaptation methods based on the coevolutionary metaphor (also referred to as self-adaptation) encode the operator settings onto each member of the population and allow them to evolve. The rationale behind this is as follows: solutions which have encoded operator settings that tend to produce fitter children will survive longer, and so the useful operator settings will spread through the population. The original work in this area originated from the Evolution Strategies community (see Bäck for a review). The mutation operator in such algorithms involves changing each variable by a value taken from a Gaussian distribution, with a standard deviation described by a gene elsewhere on the chromosome; this parameterises the amount of disruption that mutation produces when creating a child. These operator parameters are allowed to evolve to suitable values. Work extending this to adapting the mutation parameter for more conventional genetic algorithm implementations has reported some success (Bäck), in the sense that the mutation parameter was seen to adapt to the theoretical optimum for the theoretically tractable simple problem being considered.

In the study in this paper, the operator probabilities are encoded as floating-point numbers in the range 0 to 1, with the constraint that the sum of the operator probabilities must be equal to one. An example is given in the figure below. Operator parameters were encoded in a similar fashion, but without the constraint. The meaning of a particular operator parameter depends upon the operator it is associated with. For example, in the case of parameterised uniform crossover, the encoded parameter is taken to be the probability that a given gene in a child is from the second parent. In the case of binary mutation, the parameter is scaled to a bitwise mutation probability between 0 and an upper bound of m/l, where l is the length of the binary string. The upper bound was selected to be appreciably larger than the 1/l which several authors agree is the optimal value for some problems (e.g. Mühlenbein and Schlierkamp-Voosen).

The operation process is then as follows:

1. Select a parent on the basis of fitness.

2. Extract the encoded operator probabilities from the parent.

3. Use these operator probabilities to determine stochastically which operator is used on the solution part of the string.

4. Apply the chosen operator to the solution part of the parent.

5. Determine stochastically, using a coevolution crossover probability, the operator (coevolution crossover or mutation) that will be applied to the encoded operator settings; then apply it.

6. Renormalise so that the encoded operator probabilities sum to one (renormalisation is not required for the encoded operator parameters).

[Figure: Representing Operator Probabilities. Part of the string encodes the candidate solution (e.g. 1 0 0 1 0 0 1 1 0 1 0); the remainder encodes the operator probabilities (e.g. 0.1, 0.5, 0.4).]
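The steps above can be sketched in code. This is an illustrative simplification, not the authors' implementation: the flat dict representation and function names are assumptions, and for brevity the coevolution step here always applies a Gaussian mutation to the encoded probabilities (in the full scheme, coevolution crossover would be chosen instead with the externally set probability):

```python
import random

def coevolution_step(parent, solution_ops, rng=random):
    """One reproduction step under coevolution (hedged sketch).

    `parent` is {"solution": ..., "op_probs": {name: prob}}, carrying a
    candidate solution plus its own encoded operator probabilities;
    `solution_ops` maps operator names to functions acting on the
    solution part of the string.
    """
    names = list(solution_ops)
    weights = [parent["op_probs"][n] for n in names]
    # Steps 3-4: pick a solution-level operator stochastically from the
    # probabilities encoded on the parent, and apply it.
    chosen = rng.choices(names, weights=weights)[0]
    child_solution = solution_ops[chosen](parent["solution"])
    # Step 5 (simplified): perturb the encoded probabilities themselves
    # with a small Gaussian coevolution mutation.
    new_probs = {n: max(0.0, p + rng.gauss(0.0, 0.1))
                 for n, p in parent["op_probs"].items()}
    # Step 6: renormalise so the encoded probabilities sum to one.
    total = sum(new_probs.values()) or 1.0
    new_probs = {n: p / total for n, p in new_probs.items()}
    return {"solution": child_solution, "op_probs": new_probs}
```

A solution whose encoded probabilities favour productive operators will tend to produce fitter children, and so those probabilities spread, which is the rationale stated above.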

Each of the experiments below can undergo coevolution on two levels. One approach uses an externally set coevolution crossover probability, which sets the probability of using the coevolution crossover operator. A higher-level approach encodes this onto the string also; this is often termed meta-learning. The experiments performed are described below.

Two types of coevolution operators were investigated. The first type were strongly disruptive; in other words, children tend to be quite different from their parents. This is to see if the choice of coevolution operators is important for operator adaptation. The crossover operator used in this case was the real-coded variant of Random Respectful Recombination (R³) (Radcliffe). For each real-coded gene, the value of the child was simply randomly chosen from the interval bounded by the values of the two parents. The mutation operator was applied to each real-coded gene with a probability of 1/n, where n is the number of genes which can be mutated. Mutating a real-coded gene simply involves replacing the present value with a randomly chosen value between 0 and 1.

The alternative to this is to use weakly disruptive operators: the operations on the encoded probabilities are designed so that the child produced is quite similar to its parent. In this case, the crossover operator used is parameterised uniform crossover with a low parameter value. The mutation operator simply changes the value of each gene to a value between 0 and 1 given by a Gaussian distribution, with mean equal to the value of that gene on the parent and a small, fixed standard deviation.
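The contrast between the two kinds of coevolution operator can be made concrete. The sketch below assumes genes lie in [0, 1]; the specific numeric settings (the 0.05 standard deviation and the 0.1 swap rate) are assumptions, since the paper's exact values were not recoverable here:

```python
import random

def strong_crossover(g1, g2, rng=random):
    """R3-style blend: each child gene is drawn uniformly from the
    interval bounded by the two parents' values (strongly disruptive)."""
    return [rng.uniform(min(a, b), max(a, b)) for a, b in zip(g1, g2)]

def strong_mutation(genes, rng=random):
    """Replace each gene, with probability 1/n, by a fresh random value."""
    n = len(genes)
    return [rng.random() if rng.random() < 1.0 / n else g for g in genes]

def weak_crossover(g1, g2, p_swap=0.1, rng=random):
    """Parameterised uniform crossover with a low swap rate, so the
    child stays close to the first parent (p_swap is assumed)."""
    return [b if rng.random() < p_swap else a for a, b in zip(g1, g2)]

def weak_mutation(genes, sigma=0.05, rng=random):
    """Gaussian creep around the parent's value, clamped to [0, 1]
    (sigma is an assumed value)."""
    return [min(1.0, max(0.0, rng.gauss(g, sigma))) for g in genes]
```

The strong operators can move an encoded probability anywhere in its interval in one step; the weak operators only nudge it, which is the property tested in the experiments below.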

Finally, the effect of adapting the operator parameters was also investigated. This investigation used operators of low disruption, as this was the type of operator used in previous investigations. The operator probabilities in this case were fixed throughout the genetic algorithm run; only the operator parameters evolved.

Learning-Rule Methods

Another approach is to periodically measure the performance of the operators being used, and utilise this information to periodically adjust the operator settings accordingly. Previous work in this area has adjusted the operator probabilities according to population statistics, although there seems to be no reason in principle why this method could not be applied at the level of individual chromosomes.

Three such techniques exemplify this approach. The first two bear some similarity, in that they attempt to give credit to an operator for producing a good child, and also to the operators that produced the child's ancestors. This is in order to credit operators that, although they may not produce particularly good children, set the scene for another operator to make improvements, rather like the bucket-brigade algorithm often used in classifier systems (Goldberg). The third technique, which is the subject of this investigation and will be described later, does not do this, although there is no reason why such a feature could not be added if desired.

The first of these was devised by Davis. The algorithm outlined appears quite complicated at first sight: each solution in the population records who its parents were, any credit it may have, and the operators that created it. A proportion of the credit for any improvements (in this case, compared to the fittest member of the population) is added to the child, a proportion of the remainder to its parents, and so on for a set number of generations. Davis did not use his method online, but instead to obtain a non-adaptive time-varying schedule for later use; this was found to improve performance over a genetic algorithm with fixed operator probabilities.

A similar technique that requires less bookkeeping is provided by Julstrom. Each member of the population has a tree attached to it, depicting the operators used to create it. When a child of improved fitness is produced, this tree is used to assign credit to each operator. A queue is also maintained that records the operators' credit over the most recently created chromosomes. Both methods periodically process this information to adjust the operator probabilities in an appropriate fashion. Experiments showed that both approaches were able to adapt operator probabilities accordingly.
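The ancestral credit assignment shared by these two methods can be sketched as follows. This is a hedged reading of Davis's scheme, not his implementation: the `keep` share, the recursion `depth`, and the dict-based ancestry record are all assumptions made for illustration:

```python
def propagate_credit(node, improvement, keep=0.5, depth=3):
    """Davis-style back-propagation of credit (illustrative sketch).

    `node` is a hypothetical record {"op": name, "parents": [nodes]}.
    The operator that created the child keeps a share of the
    improvement; the remainder is split among the operators that
    created its parents, and so on for a fixed number of generations.
    Returns the total credit accumulated per operator.
    """
    credit = {}
    def recurse(n, amount, d):
        if n is None or d == 0 or amount <= 0:
            return
        credit[n["op"]] = credit.get(n["op"], 0.0) + amount * keep
        parents = n.get("parents", [])
        for p in parents:
            recurse(p, amount * (1 - keep) / len(parents), d - 1)
    recurse(node, improvement, depth)
    return credit
```

An operator that merely "sets the scene" (an ancestor's operator) still receives a diminishing share, which is exactly the bucket-brigade-like behaviour described above.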

One objection is that these methods require a lot of additional bookkeeping; indeed, no empirical evidence has been given to justify the added complexity. This paper investigates a simpler learning-rule method: COBRA (Cost Operator-Based Rate Adaptation) (Corne et al.), originally devised for adapting operator probabilities in timetabling problems.

A description of how COBRA is implemented follows. Given k operators o_1, ..., o_k, let b_i(t) be the benefit (this study used operator productivity: the average increase in fitness, over a set interval, when a child was produced that was fitter than its parents), c_i(t) the cost (a measure of the amount of computational effort used to evaluate a child), and p_i(t) the probability of a given operator i being used at time t. We then apply the following algorithm:

1. The user decides on a set of fixed probabilities p_i. This can either be by experiment, or the user could use the probabilities that were found to work well for a genetic algorithm with fixed probabilities (there is no guarantee, however, that these will be appropriate when using COBRA).

2. After G (the gap between operator probability re-adjustments) evaluations, rank the operators according to their values of b_i/c_i, and assign the operators their new probabilities according to their rank, i.e. the highest probability to the operator with the highest value of b_i/c_i.

3. Repeat step 2 every G evaluations.
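The re-ranking step can be sketched directly from the description above (the dict-based interface and function name are illustrative, not from Corne et al.):

```python
def cobra_reassign(probs, benefit, cost):
    """One COBRA re-ranking step, applied every G evaluations.

    `probs`, `benefit`, `cost` map operator names to p_i, b_i, c_i.
    Operators are ranked by benefit/cost, and the existing probability
    values are handed back out by rank, so the best-performing operator
    receives the largest probability.
    """
    ratio = {op: benefit[op] / cost[op] for op in probs}
    by_ratio = sorted(probs, key=lambda op: ratio[op], reverse=True)
    values = sorted(probs.values(), reverse=True)
    return dict(zip(by_ratio, values))
```

Note the design choice implied by the algorithm: the set of probability values itself never changes; re-ranking only permutes which operator holds which value, so the user's initial probabilities bound what COBRA can ever do.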

The variables in the adaptation method come from two sources: firstly, the gap between operator probability re-adjustments, G; and secondly, the initial operator probabilities provided by the user. The effect that these variables have upon the genetic algorithm will be investigated.

Hybrid Methods

There is no reason why the two approaches described here cannot be combined. A hybrid of both the coevolutionary and learning-rule approaches was proposed in Whitley. This bears many similarities to the coevolution of the crossover probability studied here; however, only one coevolution operator was used. This operator increased the crossover probability if the parent was fitter than the child, and decreased the crossover probability otherwise.

The Test Problems

In order to properly evaluate the effectiveness or otherwise of adaptation by coevolution, a set of test problems needs to be chosen. The first member of the test suite is a hard scheduling/operations research problem. The other members have been selected on account of their theoretical interest. Each will be briefly described in turn.

- The Flowshop Sequencing Problem

This is an important problem in Operations Research. In the flowshop sequencing (or n/m/P/C_max) problem, jobs are dispatched to a string of machines joined in a serial fashion. The task is to find an ordering of jobs for the flowshop to process so as to minimise the makespan: the time taken for the last of the jobs to be completed. One of the Taillard benchmark problems was used: a benchmark RNG seed was used to generate a completion-times matrix for the flowshop (the benchmarks are available via anonymous FTP from OR-Library, mscmga.ms.ic.ac.uk).

- The Max Ones Problem

The simplest of the problems considered here: for a string of binary digits, the fitness of a given string is the number of ones the string contains. The aim is to obtain a string containing all ones. A fixed-length binary string was used.

- Goldberg's Order-3 Deceptive Problem

The problems that deception can present to a genetic algorithm have been well-studied. A classic problem in such work is the order-3 tight deceptive problem devised by Goldberg (Goldberg et al.). The problem used here concatenates such deceptive subproblems into a single binary string.

- The Royal Road Problem

Work by Forrest and Mitchell provides the seminal study of this problem. The Royal Road function used in this study (R1) is defined over a binary string of fixed length.

- The Long Path Problem

Recent work (Horn et al.) has presented a class of problems where a genetic algorithm convincingly outperformed a range of hill-climbing algorithms. The problem was designed to be hard for hill-climbers not because of local optima (there are none), but due to the extreme length of the path to the optimum: the path length grows exponentially with the string length. This study examined the Root2path function, represented by a binary string of fixed length.
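The simplest of these test functions can be written down directly. A minimal sketch of the max ones fitness (the function name is ours):

```python
def max_ones(bits):
    """Fitness of a binary string is simply its number of ones; the
    optimum is the all-ones string."""
    return sum(bits)
```

Despite its simplicity, max ones is useful here precisely because its theoretically optimal mutation parameter (1/l) is known, giving a target against which adaptation can be judged.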

Implementation

An unstructured population was used. Two population models were used as part of this study, in order to see if this has any effect upon the success or otherwise of operator adaptation. The first population model used was steady-state reproduction with a kill-worst replacement policy; the second model was generational replacement with elitism.

[Figure: The Modified PMX Operator. Two-point crossover is performed on two permutations, and the resulting duplicates are then legalised.]

In both cases, in order to operate on the population, a parent is selected using rank-based selection. The operator to use is then selected according to its probability of selection, and, if the operator is crossover, a second parent is selected entirely at random. The operator is then applied to the parents to produce the child.

The representation used for the flowshop sequencing problem was a permutation of the jobs to be placed into the flowline, numbered 1 to n. The crossover operator used was the Modified PMX operator (Mott). This operator performs two-point crossover upon the two strings selected. The repair procedure then analyses one string for duplicates; when one is found, it is replaced by the first duplicate in the second string. This process is repeated until both strings are legal (see the figure above).

Based on a study of this problem by Reeves, shift mutation was selected for the flowshop sequencing problem. This operator randomly picks an element and shifts it to another randomly selected position.
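The two permutation operators can be sketched as follows. This is a simplified reading of Modified PMX: the repair pass here restores a valid permutation by substituting the jobs each child lost, whereas the paper's exact duplicate-scanning order may differ slightly:

```python
import random

def modified_pmx(p1, p2, rng=random):
    """Two-point crossover on permutations, followed by a repair pass
    that replaces duplicated jobs so both children are legal again."""
    n = len(p1)
    a, b = sorted(rng.sample(range(n + 1), 2))
    c1 = p1[:a] + p2[a:b] + p1[b:]
    c2 = p2[:a] + p1[a:b] + p2[b:]
    def repair(child, donor):
        seen = set()
        dup_positions = []
        for i, job in enumerate(child):
            if job in seen:
                dup_positions.append(i)
            seen.add(job)
        # Jobs absent from the child (scanned in the donor's order)
        # fill the duplicate slots, one for one.
        missing = [job for job in donor if job not in seen]
        for i, job in zip(dup_positions, missing):
            child[i] = job
        return child
    return repair(c1, p1), repair(c2, p2)

def shift_mutation(perm, rng=random):
    """Pick a random element and shift it to another random position."""
    p = list(perm)
    i = rng.randrange(len(p))
    job = p.pop(i)
    p.insert(rng.randrange(len(p) + 1), job)
    return p
```

Both operators preserve the permutation property, so every child remains a feasible job ordering for the flowshop.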

For the remaining problems (all binary-encoded), the representation used was a string of binary digits. The operators used were binary bit-flip mutation, with a bitwise mutation probability of m/l, where l was the length of the string (m has a fixed default value), and parameterised uniform crossover (default parameter value of 0.5: each gene is selected from either parent with equal probability).
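The two binary operators are standard, and can be sketched as (function names are ours; the default m is an assumption):

```python
import random

def bit_flip_mutation(bits, m=1.0, rng=random):
    """Flip each bit independently with probability m/l, where l is
    the string length (m's default value here is an assumption)."""
    l = len(bits)
    return [b ^ 1 if rng.random() < m / l else b for b in bits]

def uniform_crossover(p1, p2, p_swap=0.5, rng=random):
    """Parameterised uniform crossover: each child gene comes from the
    second parent with probability p_swap (0.5 gives the unbiased
    uniform crossover used as the default above)."""
    return [b if rng.random() < p_swap else a for a, b in zip(p1, p2)]
```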

Results

A large number of experiments were performed in this study, too many to give here. Therefore a summary of the results is given, and the reader is directed to Tuson for full results.

Two measures of performance were used: the quality of solution obtained after a fixed number of evaluations, and the number of evaluations after which improvements in solution quality were no longer obtained (capped at a fixed maximum). This cap was chosen as preliminary investigations showed that the GA population had converged long before then.

[Table: Best Results for a Generational GA with Fixed Operator Probabilities (rows: Flowshop, Max Ones, Deceptive, Royal Road, Long Path; columns: Solution Quality, Evaluations).]

[Table: Best Results for a Steady-State GA with Fixed Operator Probabilities (rows: Flowshop, Max Ones, Deceptive, Royal Road, Long Path; columns: Solution Quality, Evaluations).]

Fifty genetic algorithm runs were performed for both population models, with a fixed population size and rank-based selection pressure. A t-test was applied in order to ascertain if any differences were significant.

The Genetic Algorithm With Fixed Operator Settings

The effect of varying crossover probability on a genetic algorithm with fixed operator probabilities was investigated. An exhaustive search was made of the operator probabilities: a genetic algorithm was run for crossover probabilities from 0 to 1 in equal steps. This provides a benchmark against which the performance of a genetic algorithm using coevolution will be compared.

This exhaustive search over the crossover probability measured how sensitive genetic algorithm performance is to this operator setting. This gave some indication of how hard the genetic algorithm was to tune, and allowed later comparison of the tuning difficulty when coevolution is used.

The best average results obtained for each problem/population model (excluding extreme crossover probabilities, which are rarely useful) are given in the tables above. For each entry, the standard deviation is given in parentheses, and the crossover probability at which the sample was taken is given in square brackets.

When the trends in performance against crossover probability are examined, some general patterns are observed, reflected somewhat in the tables. The choice of operator probabilities appears to depend upon the problem to be solved, the population model (interestingly, the steady-state genetic algorithm gave consistently better performance), and the performance criterion being used.

[Table: Coevolution with Strongly Disruptive Operators, Generational GA (rows: Flowshop, Max Ones, Deceptive, Royal Road, Long Path; columns: Solution Quality, Evaluations Required).]

[Table: Coevolution with Strongly Disruptive Operators, Steady-State GA (rows: Flowshop, Max Ones, Deceptive, Royal Road, Long Path; columns: Solution Quality, Evaluations Required).]

The final point is illustrated by the deceptive problem: a low crossover probability gives high-quality results; a high crossover probability exchanges solution quality for a higher speed of search.

Coevolution With Strongly Disruptive Operators

For each of the problems, an exhaustive search was made of the operator probabilities: a genetic algorithm was run for coevolution crossover probabilities from 0 to 1 in equal steps. Experiments were also performed using meta-learning.

The best average performance attained for each problem/population model is given in the tables above, with standard deviations given in parentheses. Throughout the rest of this paper, underlined table entries indicate a significant difference in performance (by t-test) when compared against a tuned genetic algorithm with fixed operator probabilities.

The performance of the GA was found in all cases to be insensitive to the choice of coevolution crossover probability. The effect of using meta-learning was found to be insignificant.

Comparison with a genetic algorithm with a fixed crossover probability indicates a significant drop in performance when coevolution is used, in most cases. The cases in which performance remained comparable were those that had suitable operator probabilities of around 0.5, or were not particularly sensitive to operator probability anyway.

Why the drop in performance? A possible reason could be that the genetic algorithm was not able to evolve a suitable crossover probability. To see if this was the case, plots of the evolved crossover probability against the number of evaluations made so far were obtained; the standard deviation of the genetic algorithm population at that time is given by the error bars. The plot shown in the figure below, for a generational GA attempting the max ones problem, is typical for all of the problems.

[Figure: A Typical Plot When Strongly Disruptive Operators Were Used. Evolved crossover probability (roughly 0.3 to 0.75) against evaluations (0 to 10000).]

It can clearly be seen that the crossover probability remains around or near to 0.5. This is despite the fact that, for many problems, the suitable choice of crossover probability lies away from 0.5. This explains why performance is often degraded. The question is: why is no adaptation observed?

Plots of the operator productivities (the average improvement in fitness from parent to child that a given operator produces) were generated for a conventional genetic algorithm with a fixed crossover probability, so as to see which operator was providing the most improvements at a given stage of a genetic algorithm run. Example plots are given later in this paper. In all cases, crossover was consistently producing the greater improvements from parent to child.

Two possible reasons for non-adaptation can be proposed. First, the differences in productivity between operators are not great enough to result in sufficient selection pressure upon the encoded crossover probability. Alternatively, the coevolution operators being used are too disruptive, destroying any useful information that the genetic algorithm has found so far.

Coevolution With Weakly Disruptive Operators

The hypothesis that adaptation was prevented from occurring because of the disruption caused by the operators can be tested by simply using less disruptive operators. In some cases, adaptation was seen to occur, an example being the counting ones problem when a steady-state model was used (see the figure below).

[Figure: A Typical Plot When Weakly Disruptive Operators Were Used. Evolved crossover probability (roughly 0.3 to 1.0) against evaluations (0 to 10000).]

Adaptation was found to occur in the following cases: flowshop sequencing and counting ones, with a steady-state model only, albeit not reliably. In the other cases, adaptation was not seen to occur. This lends support to the hypothesis that disruptive operators hinder adaptation. However, it is not clear why adaptation was observed for those two problems but not the others, especially as, for all the problems considered here, crossover was the operator with the highest productivity.

What effect does this have upon performance? As before, an exhaustive search was made: a genetic algorithm was run for coevolution crossover probabilities from 0 to 1 in equal steps. Experiments were also performed using meta-learning. The results obtained are given in the tables below. The performance of the genetic algorithm was found in all cases to be insensitive to the choice of coevolution crossover probability. As before, the effect of using meta-learning was found to be insignificant.

Performance was still found to be degraded when compared to a genetic algorithm with fixed operator probabilities, even for the cases for which adaptation was observed. Examination of the adaptation plots suggests a possible reason: it takes time to evolve a suitable crossover probability, too long in fact. By the time that the genetic algorithm has found a good crossover probability, the population is quite close to the optimum anyway, allowing little time for any positive impact on performance to be made. This requires further investigation: observing the effect upon genetic algorithm performance of initialising

[Table: Coevolution With Weakly Disruptive Operators, Generational GA (rows: Flowshop, Max Ones, Deceptive, Royal Road, Long Path; columns: Solution Quality, Evaluations Required).]

[Table: Coevolution with Weakly Disruptive Operators, Steady-State GA (rows: Flowshop, Max Ones, Deceptive, Royal Road, Long Path; columns: Solution Quality, Evaluations Required).]

the crossover probability closer to a known suitable value may provide useful information.

Coevolution of Operator Parameters

In the literature thus far, coevolution has been used to adapt operator parameters, specifically the mutation parameter. To this end, an investigation of the effect of evolving the operator parameters, whilst the crossover probability remained fixed, was performed. As before, an exhaustive search was made: a genetic algorithm was run for coevolution crossover probabilities from 0 to 1 in equal steps, allowing the operator parameters to evolve. The results obtained for a genetic algorithm with fixed crossover probability and default operator parameters are given in the tables below (the crossover probabilities used are given in square brackets); these will be used as a basis for comparison.

[Table: The Fixed Operator Probabilities (Generational) Used for Comparison (rows: Max Ones, Deceptive, Royal Road, Long Path; columns: pXover, Solution Quality, Evaluations).]

The results discussed earlier caution towards examining whether adaptation

[Table: The Fixed Operator Probabilities (Steady-State) Used for Comparison (rows: Max Ones, Deceptive, Royal Road, Long Path; columns: pXover, Solution Quality, Evaluations).]

[Figure: A Plot of the Mutation Parameter for the Counting Ones Problem. Evolved mutation parameter (0 to 4.0) against evaluations (0 to 10000).]

does take place. Therefore, plots of the evolved mutation parameter against the number of evaluations made so far were obtained (see the figure above).

Adaptation does take place for both the counting ones problem (towards 1/l, the theoretical optimum (Bäck)) and the long path problem. However, this was only observed for a generational genetic algorithm. It is not apparent why; further investigation is needed.

No adaptation was observed at all for either the deceptive or royal road problems. In the case of the deceptive problem, this may be due to the fact that the theoretically optimal value for the mutation parameter (1/l) lies in the middle of the range of the encoded mutation parameter. An experiment where the mutation parameter is initialised towards one end of the range should resolve this question. For the royal road problem, adaptation may be made difficult by the stepwise fitness function, a result of which would be to make information on mutation parameter performance intermittent.

The performance of the genetic algorithm was found in all cases to be insensitive to the choice of coevolution crossover probability. When the quality

[Table: Coevolution of Operator Parameters with a Generational GA (rows: Max Ones, Deceptive, Royal Road, Long Path; columns: Solution Quality, Evaluations Required).]

[Table: Coevolution of Operator Parameters with a Steady-State GA (rows: Max Ones, Deceptive, Royal Road, Long Path; columns: Solution Quality, Evaluations Required).]

of results (see the tables above) is examined for the cases for which adaptation took place, genetic algorithm performance was found to be degraded. As suggested earlier, this may be a result of the time it takes for the genetic algorithm to adapt: good choices of the operator parameters are more important earlier in the genetic algorithm run than later.

The speed to solution for the deceptive problem was improved in this case, presumably due to the mutation parameter lying initially close to the theoretically optimal value (1/l). In the case of the royal road problem, an exchange of decreased solution quality in favour of increased speed of search was observed, an effect of the increased mutation parameter.

In any case, a thorough study of the effect of the operator parameters upon genetic algorithm performance may resolve many of the questions raised here.

The Effectiveness of COBRA as a Function of the Probability Re-Ranking Interval

The previous study of COBRA (Corne et al.) did not investigate the effect of varying G, the gap between re-rankings of the operator probabilities. This investigation examined a range of values of G (measured in evaluations). The mean performance for each of these problems is given in the two tables below.
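The re-ranking rule at the heart of COBRA can be sketched in a few lines. This is an illustrative reconstruction, not the implementation used in the experiments: it assumes each operator's productivity has already been measured over the last G evaluations, and it simply permutes the existing probability values so that the most productive operator receives the largest one.

```python
def cobra_rerank(probs, productivities):
    """Re-assign the existing probability values so that the most
    productive operator receives the largest probability (COBRA's rule)."""
    # Order operator names by measured productivity, highest first.
    by_productivity = sorted(productivities, key=productivities.get, reverse=True)
    # Order the available probability values, highest first, and pair them up.
    values = sorted(probs.values(), reverse=True)
    return dict(zip(by_productivity, values))

# Hypothetical example: crossover has been the more productive operator
# over the last G evaluations, so it takes the larger probability.
probs = {"crossover": 0.3, "mutation": 0.7}
productivity = {"crossover": 1.8, "mutation": 0.6}
print(cobra_rerank(probs, productivity))  # {'crossover': 0.7, 'mutation': 0.3}
```

Note that the set of probability values never changes; COBRA only swaps which operator holds which value, which is why the initial choice of values matters so much in the results below.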

For three of the problems considered here, the choice of G had no significant effect upon genetic algorithm performance. The exceptions were the deceptive and long path problems.

To interpret these results, it is necessary to know what COBRA is doing during the course of a genetic algorithm run. Therefore, plots of the

Table: Mean Solution Quality with Re-Ranking Interval G (rows: Flowshop, Max Ones, Deceptive, Royal Road, Long Path, under both steady-state and generational population models; numeric entries not preserved)

operator productivities were then obtained for a conventional genetic algorithm, to see which operator was providing the most improvements at a given stage of a genetic algorithm run. The plots shown in the two figures below tend to be typical for the problems considered here: crossover is consistently the more productive operator.
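Operator productivity, as used throughout this discussion, is essentially the fitness gain an operator delivers per application. One plausible way to track it is sketched below; the exact bookkeeping behind the plots may differ.

```python
class ProductivityTracker:
    """Track mean fitness gain (child minus best parent) per operator.
    This is one plausible definition of 'productivity'; the paper's
    exact bookkeeping may differ."""

    def __init__(self):
        self.gains = {}  # operator name -> list of observed fitness gains

    def record(self, op, parent_fitnesses, child_fitness):
        # Credit the operator with any improvement over the best parent.
        gain = max(child_fitness - max(parent_fitnesses), 0.0)
        self.gains.setdefault(op, []).append(gain)

    def productivity(self, op):
        g = self.gains.get(op, [])
        return sum(g) / len(g) if g else 0.0

tracker = ProductivityTracker()
tracker.record("crossover", [5.0, 7.0], 9.0)  # improvement of 2.0
tracker.record("crossover", [6.0, 6.0], 5.0)  # no improvement: gain 0.0
print(tracker.productivity("crossover"))       # 1.0
```

Measuring productivity over a window of G evaluations, as COBRA does, trades responsiveness against noise: the larger the window, the less likely a spurious re-ranking, a point that recurs in the results below.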

Table: Mean Evaluations Required with Re-Ranking Interval G (rows: Flowshop, Max Ones, Royal Road, Long Path for the steady-state model; Flowshop, Max Ones, Deceptive, Royal Road, Long Path for the generational model; numeric entries not preserved)

In the case of the deceptive problem, a trade-off was observed: low values of G corresponded to an increased speed of search, although at the expense of solution quality; high values of G favoured a higher quality of solution. The reason for this behaviour is closely linked to the effect of the crossover probability for the static genetic algorithm: high crossover probabilities lead the genetic algorithm to the deceptive optimum more quickly, giving a faster speed of search whilst sacrificing quality. As crossover was consistently the more productive operator, the genetic algorithm will assume a high crossover probability. For low G this will occur earlier in a genetic algorithm run, and so favour speed over quality.

[Figure: Operator Productivities for the Deceptive Problem. Two panels (generational and steady-state) plot operator productivity for crossover and mutation against evaluations, from 0 to 10000.]

[Figure: Operator Productivities for the Long Path Problem. Two panels (steady-state and generational) plot operator productivity for crossover and mutation against evaluations, from 0 to 10000.]

For the long path problem, behaviour varied according to the population model used. When a generational model was used, solution quality was affected, rising with increased G. Presumably the larger samples prevent spurious re-rankings due to noise in the operator productivity information.

The trends observed with a steady-state genetic algorithm were quite the opposite: solution quality was degraded when compared to a conventional genetic algorithm, declining further with increasing G. This is a result of crossover being the more productive operator most of the time (see the long path productivity figure above). Unfortunately, the preferred crossover probability is low. Therefore COBRA will mostly adopt a high crossover probability, with the effect of degrading solution quality. However, when G is small there is a greater chance that spurious re-rankings to a lower (i.e. better) crossover probability will occur, which reduces the adverse effect of COBRA somewhat.

Table: Solution Quality Obtained with Crossover Probabilities in the Range Examined (columns: Static GA and GA with COBRA, for both generational and steady-state models; rows: Flowshop, Max Ones, Deceptive, Royal Road, Long Path; numeric entries not preserved)

Table: Evaluations Required with Crossover Probabilities in the Range Examined (same layout; numeric entries not preserved)

The Effectiveness of COBRA as a Function of the Initial Operator Probabilities

The relationship between the performance of COBRA and the initial crossover probability was investigated. For each of the problems, the performance of a range of crossover probabilities was examined. The two tables above display the range of performance attained for all problems and population models, with and without the use of COBRA.

It was apparent in most cases that COBRA was less sensitive to the initial crossover probability than a conventional genetic algorithm: COBRA appears to mitigate the effect of bad choices somewhat.

But does this affect performance? The best performance attainable appears in most cases to be unaffected by COBRA. The exceptions are the deceptive problem (for both population models) and the long path problem. Each case will be discussed in turn.

In the case of the deceptive problem, the best solution quality attained is reduced, but with a corresponding increase in speed. This trade-off was found to be controllable by the initial crossover probability. The reason for this is similar to the reason for the trends in G: the higher productivity of crossover means that a high crossover probability is adopted, which favours speed of search at the expense of quality.

The results for the long path problem, however, are disappointing. When a steady-state model was used, solution quality was significantly degraded. The reason for this, as for the trend in G, is largely the predominantly higher crossover productivity (see the long path productivity figure above). This results in COBRA adopting a high crossover probability, hence effecting a reduction in solution quality.

Conclusions

In general, it was established by examining the effect of the crossover probability on a conventional genetic algorithm that the choice of crossover probability is dependent upon the problem to be solved, the population model, and the performance measure used.

Conclusions will be discussed for each approach separately, with some final remarks to place them in a wider context.

The Coevolutionary Approach

The choice of coevolution operators was found to have a dramatic effect: disruptive operators were found to remove the ability to adapt, as they destroy any information gained by selection. The use of operators of low disruption improved matters somewhat, but the occurrence of adaptation was found to be problem-dependent and unreliable.

Unsurprisingly, when no adaptation took place, the effect upon performance was often found to be detrimental. However, performance was seen to decline even when adaptation did take place. Part of the reason for this is that it takes time for the crossover probability to evolve to the right value, by which time much of the useful search has already been performed and the impact of the evolved crossover probability is much reduced: getting the operator probabilities right at the start of the genetic algorithm run appears to be important. No trends in the externally set coevolution crossover probability were found, but as adaptation does not occur reliably, if at all, this is not particularly surprising.

Encoding of operator parameters was found to be more successful: adaptation was observed more often. However, performance was still found to be degraded; the time taken to adapt precludes efficient search at the early stages of the genetic algorithm run. Genetic algorithm performance was found to be sensitive to the operator parameters used.
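The encoded-parameter scheme can be pictured as follows: each individual carries its own mutation parameter alongside its solution genes, and the parameter is itself perturbed at mutation time, so that selection can in principle favour individuals carrying good settings. The sketch below is schematic, with assumed details (bit-flip mutation on a binary string and a multiplicative perturbation of the rate); the encoding used in the experiments may differ.

```python
import random

def mutate(individual):
    """Apply an individual's own encoded mutation rate to its genes,
    then perturb the rate itself so that it can evolve.
    Schematic only: assumed bit-flip mutation and rate perturbation."""
    genes, rate = individual
    # Flip each bit with the individual's own mutation probability.
    new_genes = [1 - g if random.random() < rate else g for g in genes]
    # Perturb the encoded rate multiplicatively, keeping it in (0, 0.5].
    new_rate = min(0.5, max(1e-4, rate * random.choice([0.5, 1.0, 2.0])))
    return (new_genes, new_rate)

random.seed(0)
child = mutate(([0, 1, 1, 0, 1], 0.1))
print(child)
```

The lag described above arises because the rate can only improve through repeated rounds of selection acting on such perturbations, which costs exactly the early evaluations where a good rate matters most.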

It would appear that, at least for the problems used here, adaptation by coevolution of operator settings is of little practical use. However, much of the previous work (e.g. Back et al.) is more positive. Why is there a difference?

First, virtually all of the previous work examined the adaptation of operator parameters. The results obtained for operator parameters in this study are more optimistic than those for probabilities. Second, part of the reason also lies in the emphasis that some of that work has placed upon adaptation being an end in itself, with the implicit assumption that adaptation is a good thing. This work strongly questions that assumption.

Finally, work by Hinterding looked at adapting operator parameters by coevolution. The mutation parameter he adapted was the standard deviation of a Gaussian mutation, for problems that have a search space of real-numbered parameters, in a similar fashion to the Evolution Strategy community. The paper concluded that for many problems the adaptation of the mutation parameter led to improved genetic algorithm performance. This suggests that binary-encoded problems have less potential for this approach than real-coded problems; further investigation is required to confirm this.
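The style of self-adaptation Hinterding used resembles the standard Evolution Strategy rule, in which each individual's Gaussian step size sigma is perturbed log-normally before being applied to the real-valued genes. The sketch below shows that textbook ES rule, not necessarily Hinterding's exact formulation; the learning rate tau is a common default, not a value from the paper.

```python
import math
import random

def es_mutate(genes, sigma, tau=None):
    """ES-style self-adaptive Gaussian mutation: perturb sigma
    log-normally, then mutate each real-valued gene with it."""
    if tau is None:
        tau = 1.0 / math.sqrt(len(genes))  # a common learning-rate choice
    # Log-normal perturbation keeps sigma positive.
    new_sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    new_genes = [g + random.gauss(0.0, new_sigma) for g in genes]
    return new_genes, new_sigma

random.seed(1)
genes, sigma = es_mutate([0.5, -1.2, 3.0], sigma=0.2)
```

Because sigma is continuous and acts directly on the scale of the search, small errors in it degrade search only gradually, which may partly explain why real-coded self-adaptation fares better than adapting a binary-encoded rate.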

Adaptation Using COBRA

It was established by examining the effect of the crossover probability on a conventional genetic algorithm that the choice of crossover probability is dependent upon the problem to be solved, the population model, and the performance measure used.

Performance was found to be relatively robust towards the gap between operator probability re-rankings; most often, a slight upward trend in performance was observed with larger gaps, a result of the better sampling of the operator productivities. Instead, it was found that the initial operator probabilities were the main factors affecting performance.

No improvement in performance was found to occur when COBRA was used; however, the genetic algorithm was often made less sensitive to the operator probabilities provided. COBRA appears to reduce the effect of bad choices, which in some applications may be useful: it may well be easier than searching over a larger number of conventional genetic algorithm runs to obtain equivalent performance. Also, on the basis of the results for the deceptive problem, COBRA may well be a technique that favours speed of search at the possible expense of quality.

COBRA was found to be detrimental for some problems, such as long path with a steady-state population model. This appears to manifest itself in cases where the productivity of an operator is high, so that COBRA assigns it a high probability, but the preferred probability of the operator is low. In these cases, COBRA should not be used.

This suggests that operator productivity can be a poor basis with which to assign operator probabilities for a static genetic algorithm, and possibly misleading for COBRA. Operator productivity is but one factor to consider; others include the maintenance, by the mutation operator, of diversity lost through processes such as drift and premature convergence.

One class of problems is known to exhibit improved performance when COBRA is used: timetabling problems. In addition, recent work (Ross and Corne) has shed some light upon what makes timetabling problems easy or hard for a genetic algorithm. Therefore, it would be advantageous to use timetabling problems as a testbed, by taking examples for which COBRA was found to be effective and modifying them to see what effect this has.

To summarise: COBRA has promise as a means to remove some of the parameter tuning that can bedevil genetic algorithm applications. But, like many other such devices, there is the danger of being misled into a poor choice of operator probabilities by some problems.

Final Points

Operator adaptation was not found to be as useful as the initial arguments had promised, but some utility was discovered in reduced sensitivity to parameter settings, and more uses may arise with problems other than those considered here. However, some results of practical importance have been found:

- Adaptation is not necessarily a good thing.

- Factors other than operator productivity are of importance when assigning operator probabilities; this requires some knowledge of the problem, of what difficulties it can pose for a genetic algorithm, and of what is to be achieved.

- For adaptation to occur reliably, the adaptation mechanism should be separated from the main genetic algorithm, and the information upon which decisions are made should be explicitly measured.

- If improvements in performance occur, they are likely to be in speed of search, to the possible detriment of solution quality.

It may be useful to view the assignment of operator probabilities as, in effect, providing knowledge, albeit in an implicit form, to the genetic algorithm on how to solve the problem. This is in addition to the knowledge provided by the representation, operators, fitness function, etc. This constitutes a form of knowledge acquisition, of which the choice of adaptation method is a part.

Finally, it is now becoming clear that different problems present difficulties to optimisation techniques for different reasons. Thus it is important that, rather than looking for magic bullets, research instead turns to characterising the different pitfalls that optimisers may face, how they can be overcome, and how these pitfalls can be detected when tackling real-world problems.

Acknowledgements

Thanks to Dave Corne for his advice on COBRA. Thanks also to the Engineering and Physical Sciences Research Council for their support of Andrew Tuson via a studentship.

References

Altenberg, L. The Evolution of Evolvability in Genetic Programming. In Kinnear, K. E., editor, Advances in Genetic Programming. MIT Press.

Back, T. Self-Adaptation in Genetic Algorithms. In Proceedings of the 1st European Conference on Artificial Life. MIT Press.

Back, T. The Interaction of Mutation Rate, Selection and Self-Adaptation Within a Genetic Algorithm. In Manner and Manderick.

Back, T. Optimal Mutation Rates in Genetic Search. In Forrest, S., editor, Proceedings of the Fifth International Conference on Genetic Algorithms. San Mateo: Morgan Kaufmann.

Back, T. Evolutionary Algorithms in Theory and Practice. Oxford University Press.

Back, T., Hoffmeister, F. and Schwefel, H.-P. A Survey of Evolution Strategies. In Proceedings of the Fourth International Conference on Genetic Algorithms. San Mateo: Morgan Kaufmann.

Corne, D., Ross, P. and Fang, H.-L. GA Research Note: Fast Practical Evolutionary Timetabling. Technical report, University of Edinburgh, Department of Artificial Intelligence.

Coyne, J. and Paton, R. Genetic Algorithms and Directed Adaptation. In Fogarty, T. C., editor, Selected Papers, AISB Workshop on Evolutionary Computing, Lecture Notes in Computer Science. Springer-Verlag.

Davis, L. Adapting Operator Probabilities in Genetic Algorithms. In Schaffer, J. D., editor, Proceedings of the Third International Conference on Genetic Algorithms and their Applications. San Mateo: Morgan Kaufmann.

Davis, L., editor. Handbook of Genetic Algorithms. New York: Van Nostrand Reinhold.

Forrest, S. and Mitchell, M. Relative Building Block Fitness and the Building Block Hypothesis. In Whitley, L. D., editor, Foundations of Genetic Algorithms. San Mateo: Morgan Kaufmann.

Goldberg, D. Genetic Algorithms in Search, Optimization, and Machine Learning. Reading, MA: Addison-Wesley.

Goldberg, D., Korb, B. and Deb, K. Messy genetic algorithms: Motivation, analysis, and first results. Complex Systems.

Hesser, J. and Manner, R. Towards an optimal mutation probability for genetic algorithms. In Schwefel, H.-P. and Manner, R., editors, Parallel Problem Solving from Nature: Proceedings of the 1st Workshop, PPSN, Lecture Notes in Computer Science. Dortmund, Germany: Springer-Verlag.

Hinterding, R. Representation and Self-adaption in Genetic Algorithms. In Proceedings of the First Korea-Australia Joint Workshop on Evolutionary Computation.

Horn, J., Goldberg, D. E. and Deb, K. Long Path Problems. In Davidor, Y., Schwefel, H.-P. and Manner, R., editors, Parallel Problem Solving from Nature, PPSN III. Springer-Verlag.

Julstrom, B. A. What have you done for me lately? Adapting operator probabilities in a steady-state genetic algorithm. In Eshelman, L. J., editor, Proceedings of the Sixth International Conference on Genetic Algorithms. San Francisco, CA: Morgan Kaufmann.

Manner, R. and Manderick, B., editors. Parallel Problem Solving from Nature. Elsevier Science Publishers B.V.

Mott, G. F. Optimising Flowshop Scheduling Through Adaptive Genetic Algorithms. Chemistry Part II Thesis, Oxford University.

Mühlenbein, H. How Genetic Algorithms Really Work: Mutation and Hillclimbing. In Manner and Manderick.

Mühlenbein, H. and Schlierkamp-Voosen, D. Analysis of Selection, Mutation and Recombination in Genetic Algorithms. In Banzhaf, W. and Eeckman, F. H., editors, Evolution as a Computational Process. Springer-Verlag.

Radcliffe, N. J. Equivalence Class Analysis of Genetic Algorithms. Complex Systems.

Reeves, C. R. A genetic algorithm for flowshop sequencing. Computers & Operations Research.

Ross, P. and Corne, D. The phase transition niche for evolutionary algorithms in timetabling. In Proceedings of the First International Conference on the Theory and Practice of Automated Timetabling. Edinburgh: Napier University.

Spears, W. M. and De Jong, K. A. An Analysis of Multi-Point Crossover. In Gregory, J. E., editor, Foundations of Genetic Algorithms. San Mateo: Morgan Kaufmann.

Taillard, E. Benchmarks for basic scheduling problems. European Journal of Operational Research.

Tuson, A. L. Adapting Operator Probabilities in Genetic Algorithms. Master's thesis, Department of Artificial Intelligence, University of Edinburgh.

Whitley, D. Genetic Reinforcement Learning for Neurocontrol Problems. Machine Learning.