Evolutionary Computation: Intelligence Can Be Defined As the Capability of a System to Adapt Its Behaviour to Ever-Changing Aspects of Its Environment

Total Pages: 16

File Type: PDF; Size: 1020 KB

Topic 9: Evolutionary Computation

Can evolution be intelligent? Intelligence can be defined as the capability of a system to adapt its behaviour to ever-changing aspects of its environment. According to Alan Turing, the form or appearance of a system is irrelevant to its intelligence. The topic covers an introduction (can evolution be intelligent?), simulation of natural evolution, genetic algorithms, evolution strategies, genetic programming, and a summary.

The behaviour of an individual organism is an inductive inference about some yet unknown aspects of its environment. If, over successive generations, the organism survives, we can say that this organism is capable of learning to predict changes in its environment.

The evolutionary approach is based on computational models of natural selection and genetics. We call them evolutionary computation, an umbrella term that combines genetic algorithms, evolution strategies and genetic programming. Evolutionary computation simulates evolution on a computer. The result of such a simulation is a series of optimisation algorithms, usually based on a simple set of rules. Optimisation iteratively improves the quality of solutions until an optimal, or at least feasible, solution is found.

Simulation of natural evolution

On 1 July 1858, Charles Darwin presented his theory of evolution before the Linnean Society of London. This day marks the beginning of a revolution in biology. Darwin's classical theory of evolution, together with Weismann's theory of natural selection and Mendel's concept of genetics, now represents the neo-Darwinian paradigm.

Neo-Darwinism is based on processes of reproduction, mutation, competition and selection. The power to reproduce appears to be an essential property of life. The power to mutate is also guaranteed in any living organism that reproduces itself in a continuously changing environment. Processes of competition and selection normally take place in the natural world, where expanding populations of different species are limited by a finite space.

Evolution can be seen as a process leading to the maintenance of a population's ability to survive and reproduce in a specific environment. This ability is called evolutionary fitness. Evolutionary fitness can also be viewed as a measure of the organism's ability to anticipate changes in its environment. The fitness, or the quantitative measure of the ability to predict environmental changes and respond adequately, can be considered as the quality that is optimised in natural life.

How is a population with increasing fitness generated? Let us consider a population of rabbits. Some rabbits are faster than others, and we may say that these rabbits possess superior fitness, because they have a greater chance of avoiding foxes, surviving and then breeding. If two parents have superior fitness, there is a good chance that a combination of their genes will produce offspring with even higher fitness. Over time, the entire population of rabbits becomes faster, meeting the environmental challenge posed by the foxes.

All methods of evolutionary computation simulate natural evolution by creating a population of individuals, evaluating their fitness, generating a new population through genetic operations, and repeating this process a number of times.

Genetic algorithms

In the early 1970s, John Holland introduced the concept of genetic algorithms. His aim was to make computers do what nature does. Holland was concerned with algorithms that manipulate strings of binary digits.
We will start with genetic algorithms (GAs), as most of the other evolutionary algorithms can be viewed as variations of genetic algorithms. Each artificial "chromosome" consists of a number of "genes", and each gene is represented by 0 or 1, for example:

1 0 1 1 0 1 0 0 0 0 0 1 0 1 0 1

Nature has an ability to adapt and learn without being told what to do. In other words, nature finds good chromosomes blindly. GAs do the same. Two mechanisms link a GA to the problem it is solving: encoding and evaluation. The GA uses a measure of fitness of individual chromosomes to carry out reproduction. As reproduction takes place, the crossover operator exchanges parts of two single chromosomes, and the mutation operator changes the gene value in some randomly chosen location of the chromosome.

Basic genetic algorithm

Step 1: Represent the problem variable domain as a chromosome of a fixed length; choose the size of the chromosome population N, the crossover probability pc and the mutation probability pm.
Step 2: Define a fitness function to measure the performance, or fitness, of an individual chromosome in the problem domain. The fitness function establishes the basis for selecting chromosomes that will be mated during reproduction.
Step 3: Randomly generate an initial population of chromosomes of size N: x1, x2, ..., xN.
Step 4: Calculate the fitness of each individual chromosome: f(x1), f(x2), ..., f(xN).
Step 5: Select a pair of chromosomes for mating from the current population. Parent chromosomes are selected with a probability related to their fitness.
Step 6: Create a pair of offspring chromosomes by applying the genetic operators: crossover and mutation.
Step 7: Place the created offspring chromosomes in the new population.
Step 8: Repeat Step 5 until the size of the new chromosome population becomes equal to the size of the initial population, N.
Step 9: Replace the initial (parent) chromosome population with the new (offspring) population.
Step 10: Go to Step 4, and repeat the process until the termination criterion is satisfied.

A GA is an iterative process. Each iteration is called a generation; a typical number of generations for a simple GA can range from 50 to over 500. The entire set of generations is called a run. Because GAs use a stochastic search method, the fitness of a population may remain stable for a number of generations before a superior chromosome appears. A common practice is to terminate a GA after a specified number of generations and then examine the best chromosomes in the population. If no satisfactory solution is found, the GA is restarted.

Genetic algorithms: case study

A simple example will help us to understand how a GA works. Let us find the maximum value of the function (15x − x²), where the parameter x varies between 0 and 15. For simplicity, we may assume that x takes only integer values. Thus, chromosomes can be built with only four genes:

Integer   Binary code     Integer   Binary code     Integer   Binary code
1         0001            6         0110            11        1011
2         0010            7         0111            12        1100
3         0011            8         1000            13        1101
4         0100            9         1001            14        1110
5         0101            10        1010            15        1111
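As a quick, illustrative sketch (not part of the original slides), the Python snippet below decodes a 4-bit chromosome into the integer x and evaluates the case-study fitness f(x) = 15x − x²; the helper names are assumptions chosen for illustration.

```python
def decode(chromosome):
    """Interpret a list of 4 bits, most significant bit first, as an integer x."""
    return int("".join(str(bit) for bit in chromosome), 2)

def fitness(chromosome):
    """Case-study fitness function f(x) = 15x - x^2."""
    x = decode(chromosome)
    return 15 * x - x * x

# A few rows of the encoding table above:
print(decode([0, 1, 1, 1]), fitness([0, 1, 1, 1]))   # 7  -> f(7)  = 56
print(decode([1, 1, 0, 0]), fitness([1, 1, 0, 0]))   # 12 -> f(12) = 36
```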
Suppose that the size of the chromosome population N is 6, the crossover probability pc equals 0.7, and the mutation probability pm equals 0.001. The fitness function in our example is defined by f(x) = 15x − x².

The fitness function and chromosome locations:

Chromosome label   Chromosome string   Decoded integer   Chromosome fitness   Fitness ratio, %
X1                 1100                12                36                   16.5
X2                 0100                4                 44                   20.2
X3                 0001                1                 14                   6.4
X4                 1110                14                14                   6.4
X5                 0111                7                 56                   25.7
X6                 1001                9                 54                   24.8

[Figure: plots of f(x) = 15x − x² for 0 ≤ x ≤ 15 showing (a) the initial chromosome locations and (b) the final chromosome locations.]

In natural selection, only the fittest species can survive, breed, and thereby pass their genes on to the next generation. GAs use a similar approach, but unlike nature, the size of the chromosome population remains unchanged from one generation to the next.

The last column in the table shows the ratio of the individual chromosome's fitness to the population's total fitness. This ratio determines the chromosome's chance of being selected for mating. The chromosomes' average fitness improves from one generation to the next.

Roulette wheel selection

The most commonly used chromosome selection technique is roulette wheel selection. [Figure: a roulette wheel divided into six slices proportional to the fitness ratios of X1 to X6, with cumulative boundaries at 16.5, 36.7, 43.1, 49.5, 75.2 and 100 per cent.] In our example, we have an initial population of 6 chromosomes. Thus, to establish the same population size in the next generation, the roulette wheel would be spun six times. Once a pair of parent chromosomes is selected, the crossover operator is applied.

Crossover operator

First, the crossover operator randomly chooses a crossover point where two parent chromosomes "break", and then exchanges the chromosome parts after that point. As a result, two new offspring are created. If a pair of chromosomes does not cross over, then chromosome cloning takes place, and the offspring are created as exact copies of each parent. [Figure: single-point crossover of parents X6 and X2 producing offspring X6' and X2'.]

Mutation operator

Mutation represents a change in the gene. Mutation is a background operator. Its role is to provide a guarantee that the search algorithm is not trapped on a local optimum.
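To tie the steps together, here is a self-contained Python sketch of the case-study GA with N = 6, pc = 0.7 and pm = 0.001, using roulette wheel selection, single-point crossover and bit-flip mutation as described above. It is an illustrative reconstruction under those assumptions, not code from the original slides; the generation count is an arbitrary choice, and the small decode/fitness helpers are repeated so the sketch stands alone.

```python
import random

GENES, N, PC, PM, GENERATIONS = 4, 6, 0.7, 0.001, 100

def decode(c):                      # 4-bit chromosome -> integer x
    return int("".join(map(str, c)), 2)

def fitness(c):                     # f(x) = 15x - x^2
    x = decode(c)
    return 15 * x - x * x

def roulette_select(population, fitnesses):
    """Spin the wheel: pick a chromosome with probability fitness / total fitness."""
    total = sum(fitnesses)
    spin = random.uniform(0, total)
    running = 0.0
    for chromosome, f in zip(population, fitnesses):
        running += f
        if running >= spin:
            return chromosome
    return population[-1]

def crossover(p1, p2):
    """Single-point crossover with probability PC; otherwise clone the parents."""
    if random.random() < PC:
        point = random.randint(1, GENES - 1)
        return p1[:point] + p2[point:], p2[:point] + p1[point:]
    return p1[:], p2[:]

def mutate(c):
    """Flip each gene with the (small) mutation probability PM."""
    return [1 - g if random.random() < PM else g for g in c]

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(N)]
for _ in range(GENERATIONS):
    fitnesses = [fitness(c) for c in population]
    offspring = []
    while len(offspring) < N:
        parent1 = roulette_select(population, fitnesses)
        parent2 = roulette_select(population, fitnesses)
        for child in crossover(parent1, parent2):
            offspring.append(mutate(child))
    population = offspring[:N]

best = max(population, key=fitness)
print(decode(best), fitness(best))   # typically converges to x = 7 or 8, where f(x) = 56
```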
Recommended publications
  • Metaheuristics
    METAHEURISTICS. Kenneth Sörensen, University of Antwerp, Belgium; Fred Glover, University of Colorado and OptTek Systems, Inc., USA.
    1 Definition. A metaheuristic is a high-level problem-independent algorithmic framework that provides a set of guidelines or strategies to develop heuristic optimization algorithms (Sörensen and Glover, to appear). Notable examples of metaheuristics include genetic/evolutionary algorithms, tabu search, simulated annealing, and ant colony optimization, although many more exist. A problem-specific implementation of a heuristic optimization algorithm according to the guidelines expressed in a metaheuristic framework is also referred to as a metaheuristic. The term was coined by Glover (1986) and combines the Greek prefix meta- (metá, beyond in the sense of high-level) with heuristic (from the Greek heuriskein or euriskein, to search). Metaheuristic algorithms, i.e., optimization methods designed according to the strategies laid out in a metaheuristic framework, are, as the name suggests, always heuristic in nature. This fact distinguishes them from exact methods, which do come with a proof that the optimal solution will be found in a finite (although often prohibitively large) amount of time. Metaheuristics are therefore developed specifically to find a solution that is "good enough" in a computing time that is "small enough". As a result, they are not subject to combinatorial explosion, the phenomenon where the computing time required to find the optimal solution of NP-hard problems increases as an exponential function of the problem size. Metaheuristics have been demonstrated by the scientific community to be a viable, and often superior, alternative to more traditional (exact) methods of mixed-integer optimization such as branch and bound and dynamic programming.
  • Genetic Programming: Theory, Implementation, and the Evolution of Unconstrained Solutions
    Genetic Programming: Theory, Implementation, and the Evolution of Unconstrained Solutions. Alan Robinson. Division III Thesis, Hampshire College, May 2001. Committee: Lee Spector, Jaime Davila, Mark Feinstein.
    Contents (Part I: Background):
    1 INTRODUCTION
    1.1 BACKGROUND – AUTOMATIC PROGRAMMING
    1.2 THIS PROJECT
    1.3 SUMMARY OF CHAPTERS
    2 GENETIC PROGRAMMING REVIEW
    2.1 WHAT IS GENETIC PROGRAMMING: A BRIEF OVERVIEW
    2.2 CONTEMPORARY GENETIC PROGRAMMING: IN DEPTH
    2.3 PREREQUISITE: A LANGUAGE AMENABLE TO (SEMI) RANDOM MODIFICATION
    2.4 STEPS SPECIFIC TO EACH PROBLEM
    2.4.1 Create fitness function
    2.4.2 Choose run parameters
    2.4.3 Select function / terminals
    2.5 THE GENETIC PROGRAMMING ALGORITHM IN ACTION
    2.5.1 Generate random population
  • A Hybrid LSTM-Based Genetic Programming Approach for Short-Term Prediction of Global Solar Radiation Using Weather Data
    A Hybrid LSTM-Based Genetic Programming Approach for Short-Term Prediction of Global Solar Radiation Using Weather Data. Rami Al-Hajj 1,*, Ali Assi 2, Mohamad Fouad 3 and Emad Mabrouk 1. 1 College of Engineering and Technology, American University of the Middle East, Egaila 54200, Kuwait; [email protected] 2 Independent Researcher, Senior IEEE Member, Montreal, QC H1X1M4, Canada; [email protected] 3 Department of Computer Engineering, University of Mansoura, Mansoura 35516, Egypt; [email protected] * Correspondence: [email protected] or [email protected]
    Abstract: The integration of solar energy in smart grids and other utilities is continuously increasing due to its economic and environmental benefits. However, the uncertainty of available solar energy creates challenges regarding the stability of the generated power and the supply-demand balance's consistency. An accurate global solar radiation (GSR) prediction model can ensure overall system reliability and power generation scheduling. This article describes a nonlinear hybrid model based on Long Short-Term Memory (LSTM) models and the Genetic Programming technique for short-term prediction of global solar radiation. The LSTMs are Recurrent Neural Network (RNN) models that are successfully used to predict time-series data. We use these models as base predictors of GSR using weather and solar radiation (SR) data. Genetic programming (GP) is an evolutionary heuristic computing technique that enables automatic search for complex solution formulas. We use the GP in a post-processing stage to combine the LSTM models' outputs to find the best prediction of the GSR.
  • Long Term Memory Assistance for Evolutionary Algorithms
    Long Term Memory Assistance for Evolutionary Algorithms. Matej Črepinšek 1,*, Shih-Hsi Liu 2, Marjan Mernik 1 and Miha Ravber 1. 1 Faculty of Electrical Engineering and Computer Science, University of Maribor, 2000 Maribor, Slovenia; [email protected] (M.M.); [email protected] (M.R.) 2 Department of Computer Science, California State University Fresno, Fresno, CA 93740, USA; [email protected] * Correspondence: [email protected] Received: 7 September 2019; Accepted: 12 November 2019; Published: 18 November 2019
    Abstract: Short term memory that records the current population has been an inherent component of Evolutionary Algorithms (EAs). As hardware technologies advance, inexpensive memory with massive capacities could become a performance boost to EAs. This paper introduces a Long Term Memory Assistance (LTMA) that records the entire search history of an evolutionary process. With LTMA, individuals already visited (i.e., duplicate solutions) do not need to be re-evaluated, and thus, resources originally designated to fitness evaluations could be reallocated to continue search space exploration or exploitation. Three sets of experiments were conducted to prove the superiority of LTMA. In the first experiment, it was shown that LTMA recorded at least 50% more duplicate individuals than a short term memory. In the second experiment, ABC and jDElscop were applied to the CEC-2015 benchmark functions. By avoiding fitness re-evaluation, LTMA improved the execution time of the most time-consuming problems, F03 and F05, by between 7% and 28% and between 7% and 16%, respectively. In the third experiment, on a hard real-world problem of determining soil models' parameters, LTMA improved execution time by between 26% and 69%.
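    The core idea described here, caching every evaluated genotype so duplicates are never re-evaluated, can be sketched in a few lines of Python. The sketch below is an illustration of that caching idea under assumed names, not the authors' implementation.

```python
fitness_archive = {}   # long term memory: genotype -> fitness
evaluations = 0        # counts only genuinely new evaluations

def evaluate_with_memory(genotype, raw_fitness):
    """Return the cached fitness if this genotype was seen before; otherwise evaluate and store it."""
    global evaluations
    key = tuple(genotype)                 # genotypes must be hashable to serve as keys
    if key not in fitness_archive:
        fitness_archive[key] = raw_fitness(genotype)
        evaluations += 1
    return fitness_archive[key]

# Example with a trivial fitness function (assumed purely for illustration):
one_max = lambda g: sum(g)
print(evaluate_with_memory([1, 0, 1, 1], one_max))  # evaluated and stored
print(evaluate_with_memory([1, 0, 1, 1], one_max))  # served from the archive
print(evaluations)                                  # 1
```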
  • Symbolic Computation Using Grammatical Evolution
    Symbolic Computation using Grammatical Evolution. Alan Christianson, Department of Mathematics and Computer Science, South Dakota School of Mines and Technology, Rapid City, SD 57701, [email protected]. Jeff McGough, Department of Mathematics and Computer Science, South Dakota School of Mines and Technology, Rapid City, SD 57701, [email protected]. March 20, 2009.
    Abstract: Evolutionary Algorithms have demonstrated results in a vast array of optimization problems and are regularly employed in engineering design. However, many mathematical problems have shown traditional methods to be the more effective approach. In the case of symbolic computation, it may be difficult or not feasible to extend numerical approaches, and this leaves the door open to other methods. In this paper, we study the application of a grammar-based approach in Evolutionary Computing known as Grammatical Evolution (GE) to a selected problem in control theory. Grammatical evolution is a variation of genetic programming. GE does not operate directly on the expression but, following a lead from nature, indirectly through genome strings. These evolved strings are used to select production rules in a BNF grammar to generate algebraic expressions which are potential solutions to the problem at hand. Traditional approaches have been plagued by unrestrained expression growth, stagnation and lack of convergence. These are addressed by the more biologically realistic BNF representation and variations in the genetic operators.
    1 Introduction: Control Theory is a well established field which concerns itself with the modeling and regulation of dynamical processes. The discipline brings robust mathematical tools to address many questions in process control.
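    A hedged sketch of the genotype-to-expression mapping that GE performs is shown below. The modulo rule-selection and leftmost-nonterminal expansion are standard GE conventions, but the toy grammar and names are illustrative assumptions, not taken from the paper.

```python
import re

# Toy BNF grammar: each nonterminal maps to its list of productions.
GRAMMAR = {
    "<expr>": ["(<expr> <op> <expr>)", "x", "1"],
    "<op>":   ["+", "-", "*"],
}

def ge_map(genome, start="<expr>", max_steps=50):
    """Map an integer genome to an expression by repeatedly expanding the leftmost
    nonterminal, choosing each production as genome[i] modulo the number of options."""
    expr, i = start, 0
    while i < max_steps:
        match = re.search(r"<[^>]+>", expr)        # leftmost nonterminal, if any
        if match is None:
            break                                  # fully expanded expression
        options = GRAMMAR[match.group(0)]
        choice = options[genome[i % len(genome)] % len(options)]
        expr = expr[:match.start()] + choice + expr[match.end():]
        i += 1
    return expr

print(ge_map([0, 1, 0, 2, 1, 0, 1, 1]))   # this genome expands to "(x + 1)"
```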
  • Genetic Algorithm: Reviews, Implementations, and Applications
    Genetic Algorithm: Reviews, Implementations, and Applications. Tanweer Alam, Faculty of Computer and Information Systems, Islamic University of Madinah, Saudi Arabia, [email protected]; Shamimul Qamar, Computer Engineering Department, King Khalid University, Abha, Saudi Arabia; Amit Dixit, Department of ECE, Quantum School of Technology, Roorkee, India; Mohamed Benaida, Faculty of Computer and Information Systems, Islamic University of Madinah, Saudi Arabia. How to cite this article: Tanweer Alam, Shamimul Qamar, Amit Dixit, Mohamed Benaida, "Genetic Algorithm: Reviews, Implementations, and Applications", International Journal of Engineering Pedagogy (iJEP), 2020.
    Abstract: Nowadays the genetic algorithm (GA) is greatly used in engineering pedagogy as an adaptive technique to learn and solve complex problems and issues. It is a meta-heuristic approach that is used to solve hybrid computation challenges. GA utilizes selection, crossover, and mutation operators to effectively manage the searching system strategy. This algorithm is derived from natural selection and genetics concepts. GA is an intelligent use of random search supported with historical data to contribute the search in an area of the improved outcome within a coverage framework. Such algorithms are widely used for maintaining high-quality reactions to optimize issues and problems investigation. These techniques are recognized to be somewhat of a statistical investigation process to search for a suitable solution or prevent an accurate strategy for challenges in optimization or searches. These techniques have been produced from natural selection or genetics principles. For random testing, historical information is provided with intelligent enslavement to continue moving the search out from the area of improved features for processing of the outcomes.
  • Accelerating Real-Valued Genetic Algorithms Using Mutation-With-Momentum
    Accelerating Real-Valued Genetic Algorithms Using Mutation-With-Momentum. Luke Temby, Peter Vamplew and Adam Berry. Technical Report 2005-02, School of Computing, University of Tasmania, Private Bag 100, Hobart. This report is an extended version of a paper of the same title presented at AI'05: The 18th Australian Joint Conference on Artificial Intelligence, Sydney, Australia, 5-9 Dec 2005.
    Abstract: In a canonical genetic algorithm, the reproduction operators (crossover and mutation) are random in nature. The direction of the search carried out by the GA system is driven purely by the bias to fitter individuals in the selection process. Several authors have proposed the use of directed mutation operators as a means of improving the convergence speed of GAs on problems involving real-valued alleles. This paper proposes a new approach to directed mutation based on the momentum concept commonly used to accelerate the gradient descent training of neural networks. This mutation-with-momentum operator is compared against standard Gaussian mutation across a series of benchmark problems, and is shown to regularly result in rapid improvements in performance during the early generations of the GA. A hybrid system combining the momentum-based and standard mutation operators is shown to outperform either individual approach to mutation across all of the benchmarks.
    1 Introduction: In a traditional genetic algorithm (GA) using a bit-string representation, the mutation operator acts primarily to preserve diversity within the population, ensuring that alleles can not be permanently lost from the population.
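    The momentum idea described above can be sketched as follows: each gene's mutation adds Gaussian noise plus a fraction of the previous change for that gene, in the spirit of neural-network momentum. The parameter names and update details below are illustrative assumptions, not the authors' exact operator.

```python
import random

def momentum_mutation(genes, momentum, sigma=0.1, alpha=0.9):
    """Gaussian mutation plus a momentum term that remembers the previous step per gene."""
    new_genes, new_momentum = [], []
    for gene, prev in zip(genes, momentum):
        step = random.gauss(0.0, sigma) + alpha * prev   # random noise plus momentum
        new_genes.append(gene + step)
        new_momentum.append(step)                        # carried into the next mutation
    return new_genes, new_momentum

individual = [0.5, -1.2, 3.0]
momentum = [0.0, 0.0, 0.0]          # no history at the start
individual, momentum = momentum_mutation(individual, momentum)
print(individual, momentum)
```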
  • A Genetic Programming-Based Low-Level Instructions Robot for Realtimebattle
    entropy Article A Genetic Programming-Based Low-Level Instructions Robot for Realtimebattle Juan Romero 1,2,* , Antonino Santos 3 , Adrian Carballal 1,3 , Nereida Rodriguez-Fernandez 1,2 , Iria Santos 1,2 , Alvaro Torrente-Patiño 3 , Juan Tuñas 3 and Penousal Machado 4 1 CITIC-Research Center of Information and Communication Technologies, University of A Coruña, 15071 A Coruña, Spain; [email protected] (A.C.); [email protected] (N.R.-F.); [email protected] (I.S.) 2 Department of Computer Science and Information Technologies, Faculty of Communication Science, University of A Coruña, Campus Elviña s/n, 15071 A Coruña, Spain 3 Department of Computer Science and Information Technologies, Faculty of Computer Science, University of A Coruña, Campus Elviña s/n, 15071 A Coruña, Spain; [email protected] (A.S.); [email protected] (A.T.-P.); [email protected] (J.T.) 4 Centre for Informatics and Systems of the University of Coimbra (CISUC), DEI, University of Coimbra, 3030-790 Coimbra, Portugal; [email protected] * Correspondence: [email protected] Received: 26 November 2020; Accepted: 30 November 2020; Published: 30 November 2020 Abstract: RealTimeBattle is an environment in which robots controlled by programs fight each other. Programs control the simulated robots using low-level messages (e.g., turn radar, accelerate). Unlike other tools like Robocode, each of these robots can be developed using different programming languages. Our purpose is to generate, without human programming or other intervention, a robot that is highly competitive in RealTimeBattle. To that end, we implemented an Evolutionary Computation technique: Genetic Programming.
  • Solving a Highly Multimodal Design Optimization Problem Using the Extended Genetic Algorithm GLEAM
    Solving a Highly Multimodal Design Optimization Problem Using the Extended Genetic Algorithm GLEAM. W. Jakob, M. Gorges-Schleuter, I. Sieber, W. Süß, H. Eggert. Institute for Applied Computer Science, Forschungszentrum Karlsruhe, P.O. 3640, D-76021 Karlsruhe, Germany, EMail: [email protected]
    Abstract: In the area of micro system design the usage of simulation and optimization must precede the production of specimens or test batches due to the expensive and time consuming nature of the production process itself. In this paper we report on the design optimization of a heterodyne receiver, which is a detection module for optical communication systems. The collimating lens system of the receiver is optimized with respect to the tolerances of the fabrication and assembly process as well as to the spherical aberrations of the lenses. It is shown that this is a highly multimodal problem which cannot be solved by traditional local hill-climbing algorithms. For the applicability of more sophisticated search methods like our extended genetic algorithm GLEAM, short runtimes for the simulation or a small number of simulation runs is essential. Thus we tested a new approach, the so-called optimization foreruns, the results of which are used for the initialization of the main optimization run. The promising results were checked by testing the approach with mathematical test functions known from the literature. The surprising result was that most of these functions behave considerably differently from our real-world problems, which limits their usefulness drastically.
    1 Introduction: The production of specimens for microcomponents or microsystems is both material and time consuming because of the sophisticated manufacturing techniques.
  • On the Performance of Different Mutation Operators of a Subpopulation-Based Genetic Algorithm for Multi-Robot Task Allocation Problems (arXiv:1606.00601 [cs.NE], 2 Jun 2016)
    On the performance of different mutation operators of a subpopulation-based genetic algorithm for multi-robot task allocation problems. Chun Liu a,b,*, Andreas Kroll b. a School of Automation, Beijing University of Posts and Telecommunications, No 10, Xitucheng Road, 100876, Beijing, China. b Department of Measurement and Control, Mechanical Engineering, University of Kassel, Mönchebergstraße 7, 34125, Kassel, Germany. * Corresponding author. Email addresses: [email protected] (Chun Liu), [email protected] (Andreas Kroll).
    Abstract: The performance of different mutation operators is usually evaluated in conjunction with specific parameter settings of genetic algorithms and target problems. Most studies focus on the classical genetic algorithm with different parameters or on solving unconstrained combinatorial optimization problems such as the traveling salesman problems. In this paper, a subpopulation-based genetic algorithm that uses only mutation and selection is developed to solve multi-robot task allocation problems. The target problems are constrained combinatorial optimization problems, and are more complex if cooperative tasks are involved, as these introduce additional spatial and temporal constraints. The proposed genetic algorithm can obtain better solutions than classical genetic algorithms with tournament selection and partially mapped crossover. The performance of different mutation operators in solving problems without/with cooperative tasks is evaluated. The results imply that inversion mutation performs better than others when solving problems without cooperative tasks, and the swap-inversion combination performs better than others when solving problems with cooperative tasks.
    Keywords: constrained combinatorial optimization, genetic algorithm, mutation operators, subpopulation, multi-robot task allocation, inspection problems
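    For reference, the two mutation operators compared here are standard permutation operators; the sketch below shows plain inversion and swap mutation on a task-sequence list. It is illustrative only, and the paper's subpopulation mechanics and constraint handling are not reproduced.

```python
import random

def inversion_mutation(tasks):
    """Reverse the order of the tasks between two random cut points."""
    i, j = sorted(random.sample(range(len(tasks)), 2))
    return tasks[:i] + tasks[i:j + 1][::-1] + tasks[j + 1:]

def swap_mutation(tasks):
    """Exchange the tasks at two random positions."""
    i, j = random.sample(range(len(tasks)), 2)
    mutated = tasks[:]
    mutated[i], mutated[j] = mutated[j], mutated[i]
    return mutated

route = ["t1", "t2", "t3", "t4", "t5"]   # a hypothetical task sequence for one robot
print(inversion_mutation(route))
print(swap_mutation(route))
```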
  • Computational Creativity: Three Generations of Research and Beyond
    Computational Creativity: Three Generations of Research and Beyond. Debasis Mitra, Department of Computer Science, Florida Institute of Technology, [email protected]
    Abstract: In this article we have classified computational creativity research activities into three generations. Although the respective system developers were not necessarily targeting their research for computational creativity, we consider their works as contributions to this emerging field. Possibly, the first recognition of the implication of intelligent systems toward creativity came with an AAAI Spring Symposium on AI and Creativity (Dartnall and Kim, 1993). We have here tried to chart the progress of the field by describing some sample projects. Our hope is that this article will provide some direction to interested researchers and help create a vision for the community.
    1. Introduction: One of the meanings of the word "create" is "to produce by imaginative skill" and that of the word "creativity" is "the ability to create," according to the Webster Dictionary.
    2. Philosophical Angles: Philosophers try to understand creativity from historical perspectives – how different acts of creativity (primarily in science) might have happened. Historical investigation of the process involved in scientific discovery relied heavily on philosophical viewpoints. Within philosophy there is an ongoing old debate regarding whether the process of scientific discovery has a normative basis. Within the computing community this question transpires in asking if analyzing and computationally emulating creativity is feasible or not. In order to answer this question, artificial intelligence (AI) researchers have tried to develop computing systems to mimic scientific discovery processes (e.g., BACON, KEKADA, etc., that we will discuss), almost since the inception of AI.
  • Geometric Semantic Genetic Programming Algorithm and Slump Prediction
    Geometric Semantic Genetic Programming Algorithm and Slump Prediction Juncai Xu1, Zhenzhong Shen1, Qingwen Ren1, Xin Xie2, and Zhengyu Yang2 1 College of Water Conservancy and Hydropower Engineering, Hohai University, Nanjing 210098, China 2 Department of Electrical and Engineering, Northeastern University, Boston, MA 02115, USA ABSTRACT Research on the performance of recycled concrete as building material in the current world is an important subject. Given the complex composition of recycled concrete, conventional methods for forecasting slump scarcely obtain satisfactory results. Based on theory of nonlinear prediction method, we propose a recycled concrete slump prediction model based on geometric semantic genetic programming (GSGP) and combined it with recycled concrete features. Tests show that the model can accurately predict the recycled concrete slump by using the established prediction model to calculate the recycled concrete slump with different mixing ratios in practical projects and by comparing the predicted values with the experimental values. By comparing the model with several other nonlinear prediction models, we can conclude that GSGP has higher accuracy and reliability than conventional methods. Keywords: recycled concrete; geometric semantics; genetic programming; slump 1. Introduction The rapid development of the construction industry has resulted in a huge demand for concrete, which, in turn, caused overexploitation of natural sand and gravel as well as serious damage to the ecological environment. Such demand produces a large amount of waste concrete in construction, entailing high costs for dealing with these wastes 1-3. In recent years, various properties of recycled concrete were validated by researchers from all over the world to protect the environment and reduce processing costs.
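    Geometric semantic operators act on a program's semantics rather than its syntax; a hedged functional sketch is given below, where programs are plain Python callables and the random trees are simple random linear functions. The bounding function, mutation step and parameter choices are illustrative assumptions, not the exact GSGP variant used in the paper.

```python
import math
import random

def random_tree():
    """Stand-in for a random GP tree: a small random linear function of two inputs."""
    w = [random.uniform(-1, 1) for _ in range(2)]
    return lambda x: w[0] * x[0] + w[1] * x[1]

def gs_crossover(p1, p2):
    """Offspring semantics are a convex combination of the parents' semantics."""
    r = random.random()
    return lambda x: r * p1(x) + (1 - r) * p2(x)

def gs_mutation(p, ms=0.1):
    """Offspring = parent + ms * (sigmoid(r1) - sigmoid(r2)): a small bounded perturbation."""
    r1, r2 = random_tree(), random_tree()
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    return lambda x: p(x) + ms * (sig(r1(x)) - sig(r2(x)))

parent_a, parent_b = random_tree(), random_tree()
child = gs_mutation(gs_crossover(parent_a, parent_b))
print(child([1.0, 2.0]))   # evaluate the offspring program on a sample input
```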