Evolutionary Computation: Intelligence Can Be Defined as the Capability of a System to Adapt Its Behaviour to Ever-Changing Aspects of Its Environment
Topic 9
Evolutionary Computation

Introduction, or can evolution be intelligent?
Simulation of natural evolution
Genetic algorithms
Evolution strategies
Genetic programming
Summary

Introduction, or can evolution be intelligent?

Intelligence can be defined as the capability of a system to adapt its behaviour to the ever-changing aspects of its environment. According to Alan Turing, the form or appearance of a system is irrelevant to its intelligence.

The behaviour of an individual organism is an inductive inference about some yet unknown aspects of its environment. If, over successive generations, the organism survives, we can say that this organism is capable of learning to predict changes in its environment.

The evolutionary approach is based on computational models of natural selection and genetics. We call them evolutionary computation, an umbrella term that combines genetic algorithms, evolution strategies and genetic programming.

Evolutionary computation simulates evolution on a computer. The result of such a simulation is a series of optimisation algorithms, usually based on a simple set of rules. Optimisation iteratively improves the quality of solutions until an optimal, or at least feasible, solution is found.

Simulation of natural evolution

On 1 July 1858, Charles Darwin presented his theory of evolution before the Linnean Society of London. This day marks the beginning of a revolution in biology. Darwin's classical theory of evolution, together with Weismann's theory of natural selection and Mendel's concept of genetics, now represents the neo-Darwinian paradigm.

Neo-Darwinism is based on processes of reproduction, mutation, competition and selection. The power to reproduce appears to be an essential property of life. The power to mutate is also guaranteed in any living organism that reproduces itself in a continuously changing environment. Processes of competition and selection normally take place in the natural world, where expanding populations of different species are limited by a finite space.

Evolution can be seen as a process leading to the maintenance of a population's ability to survive and reproduce in a specific environment. This ability is called evolutionary fitness. Evolutionary fitness can also be viewed as a measure of the organism's ability to anticipate changes in its environment. The fitness, or the quantitative measure of the ability to predict environmental changes and respond adequately, can be considered as the quality that is optimised in natural life.

How is a population with increasing fitness generated? Let us consider a population of rabbits. Some rabbits are faster than others, and we may say that these rabbits possess superior fitness, because they have a greater chance of avoiding foxes, surviving and then breeding. If two parents have superior fitness, there is a good chance that a combination of their genes will produce an offspring with even higher fitness. Over time, the entire population of rabbits becomes faster to meet the environmental challenge posed by foxes.

Nature has an ability to adapt and learn without being told what to do. In other words, nature finds good chromosomes blindly.

All methods of evolutionary computation simulate natural evolution by creating a population of individuals, evaluating their fitness, generating a new population through genetic operations, and repeating this process a number of times.
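This create, evaluate and regenerate cycle translates directly into code. The sketch below is only an illustration of the generic loop, not code from the lecture; the function names (make_individual, fitness, select, crossover, mutate) and the default population size and number of generations are placeholders chosen for this example.

def evolve(make_individual, fitness, select, crossover, mutate,
           population_size=20, generations=50):
    # Create the initial population of random individuals.
    population = [make_individual() for _ in range(population_size)]
    for _ in range(generations):
        # Evaluate the fitness of every individual.
        scores = [fitness(individual) for individual in population]
        # Breed a new population of the same size through
        # selection, crossover and mutation.
        new_population = []
        while len(new_population) < population_size:
            parent1 = select(population, scores)
            parent2 = select(population, scores)
            child1, child2 = crossover(parent1, parent2)
            new_population.extend([mutate(child1), mutate(child2)])
        population = new_population[:population_size]
    # Return the fittest individual in the final generation.
    return max(population, key=fitness)

The basic genetic algorithm described next fills in each of these placeholders with binary chromosomes, a problem-specific fitness function, fitness-proportionate selection, crossover and mutation.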
Genetic algorithms

We will start with genetic algorithms (GAs), as most of the other evolutionary algorithms can be viewed as variations of genetic algorithms. In the early 1970s, John Holland introduced the concept of genetic algorithms. His aim was to make computers do what nature does. Holland was concerned with algorithms that manipulate strings of binary digits. Each artificial "chromosome" consists of a number of "genes", and each gene is represented by 0 or 1:

1 0 1 1 0 1 0 0 0 0 0 1 0 1 0 1

Two mechanisms link a GA to the problem it is solving: encoding and evaluation. The GA uses a measure of fitness of individual chromosomes to carry out reproduction. As reproduction takes place, the crossover operator exchanges parts of two single chromosomes, and the mutation operator changes the gene value in some randomly chosen location of the chromosome.

Basic genetic algorithms

Step 1: Represent the problem variable domain as a chromosome of a fixed length, and choose the size of the chromosome population N, the crossover probability pc and the mutation probability pm.

Step 2: Define a fitness function to measure the performance, or fitness, of an individual chromosome in the problem domain. The fitness function establishes the basis for selecting chromosomes that will be mated during reproduction.

Step 3: Randomly generate an initial population of chromosomes of size N: x1, x2, ..., xN.

Step 4: Calculate the fitness of each individual chromosome: f(x1), f(x2), ..., f(xN).

Step 5: Select a pair of chromosomes for mating from the current population. Parent chromosomes are selected with a probability related to their fitness.

Step 6: Create a pair of offspring chromosomes by applying the genetic operators, crossover and mutation.

Step 7: Place the created offspring chromosomes in the new population.

Step 8: Repeat Step 5 until the size of the new chromosome population becomes equal to the size of the initial population, N.

Step 9: Replace the initial (parent) chromosome population with the new (offspring) population.

Step 10: Go to Step 4, and repeat the process until the termination criterion is satisfied.

A GA represents an iterative process. Each iteration is called a generation. A typical number of generations for a simple GA can range from 50 to over 500. The entire set of generations is called a run. Because GAs use a stochastic search method, the fitness of a population may remain stable for a number of generations before a superior chromosome appears. A common practice is to terminate a GA after a specified number of generations and then examine the best chromosomes in the population. If no satisfactory solution is found, the GA is restarted.

Genetic algorithms: case study

A simple example will help us to understand how a GA works. Let us find the maximum value of the function 15x − x^2, where the parameter x varies between 0 and 15. For simplicity, we may assume that x takes only integer values. Thus, chromosomes can be built with only four genes:

Integer   Binary code     Integer   Binary code     Integer   Binary code
1         0 0 0 1         6         0 1 1 0         11        1 0 1 1
2         0 0 1 0         7         0 1 1 1         12        1 1 0 0
3         0 0 1 1         8         1 0 0 0         13        1 1 0 1
4         0 1 0 0         9         1 0 0 1         14        1 1 1 0
5         0 1 0 1         10        1 0 1 0         15        1 1 1 1
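As a quick check of this encoding, here is a small Python sketch (an illustration, not part of the original material) that decodes a four-bit chromosome into an integer and evaluates the case-study fitness function f(x) = 15x − x^2. Scanning all sixteen values confirms that the maximum fitness of 56 is reached at x = 7 and x = 8.

def decode(bits):
    # Interpret a list of bits, most significant bit first, as an integer.
    value = 0
    for bit in bits:
        value = 2 * value + bit
    return value

def fitness(bits):
    # Case-study fitness function f(x) = 15x - x^2.
    x = decode(bits)
    return 15 * x - x ** 2

# The chromosome 0 1 1 1 decodes to 7 and has fitness 15*7 - 7^2 = 56.
print(decode([0, 1, 1, 1]), fitness([0, 1, 1, 1]))    # prints: 7 56

# Scanning all sixteen integers confirms a maximum of 56, at x = 7 and x = 8.
print(max(15 * x - x ** 2 for x in range(16)))        # prints: 56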
Suppose that the size of the chromosome population N is 6, the crossover probability pc equals 0.7, and the mutation probability pm equals 0.001. The fitness function in our example is defined by

f(x) = 15x − x^2

In natural selection, only the fittest species can survive, breed, and thereby pass their genes on to the next generation. GAs use a similar approach, but unlike nature, the size of the chromosome population remains unchanged from one generation to the next.

The fitness function and chromosome locations

(Figure: plot of the fitness function f(x) = 15x − x^2 showing (a) the chromosomes' initial locations and (b) the chromosomes' final locations.)

Chromosome label   Chromosome string   Decoded integer   Chromosome fitness   Fitness ratio, %
X1                 1 1 0 0             12                36                   16.5
X2                 0 1 0 0              4                44                   20.2
X3                 0 0 0 1              1                14                    6.4
X4                 1 1 1 0             14                14                    6.4
X5                 0 1 1 1              7                56                   25.7
X6                 1 0 0 1              9                54                   24.8

The last column in the table shows the ratio of the individual chromosome's fitness to the population's total fitness. This ratio determines the chromosome's chance of being selected for mating. The population's average fitness improves from one generation to the next.

Roulette wheel selection

The most commonly used chromosome selection technique is roulette wheel selection.

(Figure: a roulette wheel whose slices are proportional to the fitness ratios of chromosomes X1 to X6 listed in the table above.)

In our example, we have an initial population of 6 chromosomes. Thus, to establish the same population size in the next generation, the roulette wheel would be spun six times. Once a pair of parent chromosomes is selected, the crossover operator is applied.

Crossover operator

First, the crossover operator randomly chooses a crossover point where two parent chromosomes "break", and then exchanges the chromosome parts after that point. As a result, two new offspring are created. If a pair of chromosomes does not cross over, then chromosome cloning takes place, and the offspring are created as exact copies of each parent.

(Figure: example of crossover between two parent chromosomes.)

Mutation operator

Mutation represents a change in the gene. Mutation is a background operator. Its role is to provide a guarantee that the search algorithm is not trapped in a local optimum.

(Figure: example of the mutation operator changing a randomly chosen gene, illustrated on chromosomes X2 and X6.)
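To make the three operators concrete, the following Python sketch gives one possible implementation of roulette wheel selection, one-point crossover and bit-flip mutation for the four-gene chromosomes of the case study. It is an illustration only: the function names and the use of Python's random module are my own choices, and the defaults pc = 0.7 and pm = 0.001 are the probabilities assumed earlier.

import random

def roulette_select(population, fitnesses):
    # Spin the wheel: each chromosome gets a slice proportional to its
    # share of the population's total fitness.
    total = sum(fitnesses)
    spin = random.uniform(0, total)
    running = 0.0
    for chromosome, fit in zip(population, fitnesses):
        running += fit
        if running >= spin:
            return chromosome
    return population[-1]

def crossover(parent1, parent2, pc=0.7):
    # With probability pc, break both parents at a random point and
    # exchange the parts after that point; otherwise clone the parents.
    if random.random() < pc:
        point = random.randint(1, len(parent1) - 1)
        return (parent1[:point] + parent2[point:],
                parent2[:point] + parent1[point:])
    return parent1[:], parent2[:]

def mutate(chromosome, pm=0.001):
    # Flip each gene independently with the small probability pm.
    return [1 - gene if random.random() < pm else gene for gene in chromosome]

# The six chromosomes and fitness values from the table above.
population = [[1, 1, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1],
              [1, 1, 1, 0], [0, 1, 1, 1], [1, 0, 0, 1]]
fitnesses = [36, 44, 14, 14, 56, 54]

parent1 = roulette_select(population, fitnesses)
parent2 = roulette_select(population, fitnesses)
offspring = [mutate(child) for child in crossover(parent1, parent2)]

Repeatedly selecting a pair of parents, applying crossover and mutation, and collecting six offspring reproduces Steps 5 to 9 of the basic genetic algorithm above.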