Grammatical Evolution for Neural Network Optimization in the Control System Synthesis Problem


Available online at www.sciencedirect.com

Procedia Computer Science 103 (2017) 14-19

XIIth International Symposium Intelligent Systems, INTELS16, 5-7 October 2016, Moscow, Russia

Grammatical evolution for neural network optimization in the control system synthesis problem

D.E. Kazaryan*, A.V. Savinkov

RUDN University, Miklukho-Maklaya str. 6, Moscow 117198, Russia

Abstract

Grammatical evolution is a promising branch of genetic programming. It combines an evolutionary-algorithm-based search engine with a Backus-Naur form specification of a domain-specific language grammar to find symbolic expressions. This paper describes an application of this method to the control function synthesis problem. A feed-forward neural network is used as an approximation of the control function, which depends on the object state variables. A two-stage algorithm is presented: grammatical evolution optimizes the neural network structure, and a genetic algorithm tunes the weights. Computational experiments were performed on a simple kinematic model of a two-wheel driving mobile robot. Training was performed on a set of initial conditions. The results show that the proposed algorithm is able to successfully synthesize a control function.

© 2017 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).
Peer-review under responsibility of the scientific committee of the XIIth International Symposium "Intelligent Systems".

Keywords: grammatical evolution; control system synthesis; artificial neural networks.
* Corresponding author. Tel.: +7-495-955-0792. E-mail address: kazaryan [email protected]

doi: 10.1016/j.procs.2017.01.002

1. Introduction

Control synthesis is a complex problem that usually either involves a great amount of analytical computation, especially for nonlinear problems, or is comprised of tedious uniform tasks. For nonlinear problems, the common approach is to linearize the plant around operating points and then design a linear controller. This approach can lead to oversimplification of the model. These issues can be avoided if the computer takes the brunt of the solution search.

Artificial neural networks (ANNs) are often used in control applications, as they are well developed theoretically and have computationally efficient implementations. There have been successful attempts to use an ANN as a nonlinear controller that takes the plant state variables as inputs and produces a control signal [1,2]. Arguably the most widely used ANN model in control applications is the nonlinear autoregressive model with exogenous inputs (NARX), which uses delayed inputs and outputs:

h(k) = F(y(k - 1), y(k - 2), ..., y(k - n), u(k), u(k - 1), ..., u(k - m)). (1)

NARX networks are used both for identification [3,4,5] and control [3]. Usually control engineers select the ANN architecture using their expert knowledge of the problem domain. In this paper we propose to use a symbolic expression search method to find the optimal ANN structure. Symbolic expression search methods arose from the work [6] that introduced the genetic programming method.
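As an illustration of the NARX model in Eq. (1), here is a minimal sketch (not the authors' code) of assembling the regressor of delayed outputs and inputs; the single-neuron nonlinearity F below is a toy stand-in for an ANN:

```python
# Illustrative sketch of the NARX regressor of Eq. (1):
# h(k) = F(y(k-1),...,y(k-n), u(k), u(k-1),...,u(k-m)).
# The function F here is a hypothetical single tanh neuron, not the paper's network.
import math

def narx_regressor(y, u, k, n, m):
    """Collect n delayed outputs and m+1 inputs into one feature vector."""
    past_outputs = [y[k - i] for i in range(1, n + 1)]
    past_inputs = [u[k - i] for i in range(0, m + 1)]
    return past_outputs + past_inputs

def F(features, weights):
    # toy nonlinearity: tanh of a weighted sum (stand-in for an ANN layer)
    s = sum(w * x for w, x in zip(weights, features))
    return math.tanh(s)

y = [0.0, 0.1, 0.3]   # past plant outputs y(0), y(1), y(2)
u = [1.0, 0.5, 0.2]   # control inputs u(0), u(1), u(2)
feats = narx_regressor(y, u, k=2, n=2, m=1)
print(feats)          # regressor: [y(1), y(0), u(2), u(1)]
print(F(feats, [1.0, 1.0, 1.0, 1.0]))
```

The same pattern extends to vector-valued y and u; only the indexing of the delayed samples matters here.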
Genetic programming was followed by cartesian genetic programming [7], grammatical evolution (GE) [8], analytic programming [9], the network operator method [10], etc. These methods allow a search for a structure using functions as "building blocks". Symbolic expression search methods have been actively used in control synthesis recently [11,12].

We chose grammatical evolution (GE) for the optimal ANN structure search. There are several works in this field [13,14]. In our work we define grammatical rules for the modification of some existing neural network structure. We suppose that incorporating expert knowledge into the search process is important, and we introduce elements of the basic solution principle [11].

The paper is organized as follows. In section 2 the control synthesis problem is formally stated. In section 3 we describe the neural controller model used in the paper. Section 4 introduces the structure of the GE algorithm. In section 5 we show the neural controller performance on a simple nonlinear mobile robot kinematic model. The conclusion summarizes the paper and discusses possible directions of future work.

2. Control system synthesis problem statement

Consider the following ODE system

x'(t) = f(x(t), u(t)), (2)

where x ∈ X is the system state, X ⊆ R^n, u ∈ U is the control function, U ⊂ R^m is closed and bounded, m < n. u(t) is defined on [t0; tf]. Initial conditions for (2) are given as X0 ⊆ X, and the target terminal state set is defined as Xf ⊆ X, X0 ∩ Xf = ∅. Generally X0 and Xf are continuous, but our assumption is the following: if we synthesized a control function that could move the system (2) from any x_i^(0) ∈ D_X0 to any x_j^(f) ∈ D_Xf, where D_X0 ⊂ X0 and D_Xf ⊂ Xf are finite and given with a small enough discretization step, then the control function would be able to move the system (2) from any x^(0) ∈ X0 to any x^(f) ∈ Xf:

D_X0 = {x_0^(0)(t0), x_1^(0)(t0), ..., x_{c-1}^(0)(t0)}, (3)

D_Xf = {x_0^(f)(tf), x_1^(f)(tf), ..., x_{d-1}^(f)(tf)}. (4)

The control synthesis goal is to find a function

h(t, x, x_i^(0), x_j^(f)) ∈ U (5)

that moves the system (2) from x_i^(0) ∈ D_X0 to x_j^(f) ∈ D_Xf while, in general, minimizing a set of functionals in a Pareto sense [15, p. 25]:

J = {J1(h), J2(h), ..., Jr(h)}, (6)

where

Jk = F(h(t, x, x_i^(0), x_j^(f))), k = 1, ..., r. (7)

This general problem statement can always be reduced to a single functional optimization problem. As h(t, x, x_i^(0), x_j^(f)) is nonlinear, we can approximate it with an artificial neural network. The ANN architecture is described in the next section.

Fig. 1. Control system with neural controller

3. Neural controller model

Artificial neural networks are great approximators. It has been proven [16] that even a neural network with a single hidden layer, discriminatory activation functions, and a sufficient number of parameters (weights) is able to approximate any nonlinear function with the required accuracy. In this paper we use this ANN ability to approximate the desired function (5). The ANN for the control system architecture shown in Figure 1 can be expressed as a function

h^(t, x, x_i^(0), x_j^(f), w) = f_n(w_n, ... f_3(w_3, f_2(w_2, f_1(w_1, (t, x, x_i^(0), x_j^(f)))))), (8)

where w_i, i = 0, ..., n - 1, are subject to the parametric optimization, whereas n, p_i = dim w_i, and f_i ∈ F are subject to the structural optimization; F is the ordered set of allowed activation functions. In this paper we used fully-connected layers for the ANN.

4. Grammatical evolution for neural controller synthesis

Grammatical evolution is an evolutionary algorithm that uses formal grammar rules given in Backus-Naur form (BNF). BNF is a way of defining a language grammar in the form of production rules. Rules are formed using terminals and nonterminals.
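As a concrete illustration of the controller (8) from the previous section, a minimal sketch follows: the network is a nested composition of fully-connected layers, where the layer sizes and activation indices are the structural part (GE's subject of search) and the flat weight lists are the parametric part (the GA's subject of search). The specific sizes, activation set, and weight layout below are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch of Eq. (8): h^ = f_n(w_n, ... f_2(w_2, f_1(w_1, z))),
# where z stacks (t, x, x0, xf). Layer sizes and activations are assumptions.
import math

def dense(w, inputs, n_out, act):
    """One fully-connected layer; w is a flat list of (n_in + 1) * n_out weights."""
    n_in = len(inputs)
    out = []
    for j in range(n_out):
        base = j * (n_in + 1)
        s = w[base]  # bias term
        s += sum(w[base + 1 + i] * inputs[i] for i in range(n_in))
        out.append(act(s))
    return out

ACTS = [math.tanh, lambda s: max(0.0, s), lambda s: s]  # ordered set F (assumed)

def controller(z, layers, weights):
    """layers: list of (n_out, act_index) pairs -- the structure GE optimizes;
    weights: one flat list per layer -- the parameters the GA tunes."""
    h = z
    for (n_out, f_idx), w in zip(layers, weights):
        h = dense(w, h, n_out, ACTS[f_idx])
    return h

z = [0.0, 1.0, -0.5, 0.2]            # toy (t, x, x0, xf) values
layers = [(3, 0), (1, 2)]            # hidden tanh layer of 3, linear output
weights = [[0.1] * 15, [0.2] * 4]    # (4+1)*3 and (3+1)*1 weights
print(controller(z, layers, weights))
```

Tuning then reduces to a GA over the concatenation of the `weights` lists for a structure fixed by GE.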
Terminals are elements of the language; nonterminals are expressions that can be replaced, via the production rules, by other nonterminals, terminals, or their combinations. Using these rules GE builds possible problem solutions in string form. The obtained string is subject to evaluation. Usually during evaluation it has to be translated or interpreted, so it is preferable to use a programming language with a built-in eval operator.

For the search process GE uses a search engine, usually a genetic algorithm [8] or a particle swarm optimization algorithm [17]. The search algorithm operates over a population of integer or binary arrays of variable length. During the search these arrays are transformed using operators specific to the chosen algorithm. In our work a genetic algorithm is used, so crossover and mutation operators are applied.

GE requires several sets to be defined: N, the set of nonterminals; T, the set of terminals; S, the set of possible initial symbols (usually S ∈ N, but it is possible to use a predefined string that contains at least one nonterminal); and P, the set of production rules.

GE can be easily adapted for the structural optimization of the neural network (8). Let us define the sets described above:

N = {<expr>, <modification>, <f_num>, <l_num>, <n_num>}
T = {0, 1, ..., 9, add_l, add_n, rmv_l, rmv_n, chng_f, max_l, max_n, max_f}
S = {<expr>}

and P can be represented as

(1) <expr>         ::= <expr>,<modification>,<expr>   (0)
                     | <modification>                 (1)
(2) <modification> ::= add_l(<l_num>, <n_num>)        (0)
                     | add_n(<l_num>, <n_num>)        (1)
                     | rmv_l(<l_num>)                 (2)
                     | rmv_n(<l_num>, <n_num>)        (3)
                     | chng_f(<l_num>, <f_num>)       (4)
(3) <l_num>        ::= 0|1|...|max_l-1                (0)-(max_l-1)
(4) <n_num>        ::= 0|1|...|max_n-1                (0)-(max_n-1)
(5) <f_num>        ::= 0|1|...|max_f-1                (0)-(max_f-1)

where the <modification> options are functions that change the structure of the ANN, <l_num> is a layer position, <n_num> is a number of neurons, and <f_num> is the index of an activation function in F. max_l, max_n, and max_f define the greatest layer index, the greatest number of neurons in a layer, and the greatest activation function index, respectively.

Fig. 2. Initial neural network structure for grammatical evolution
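The standard GE genotype-to-phenotype mapping can be sketched against this grammar: each integer codon, taken modulo the number of productions for the current leftmost nonterminal, selects one production, and the genome wraps around if it is exhausted. The values of max_l, max_n, and max_f below are illustrative assumptions, not the paper's settings.

```python
# Hedged sketch of the standard GE mapping (codon mod rule-count) applied
# to the modification grammar above. MAX_L/MAX_N/MAX_F are assumed values.
MAX_L, MAX_N, MAX_F = 4, 8, 3

GRAMMAR = {
    "<expr>": [["<expr>", ",", "<modification>", ",", "<expr>"], ["<modification>"]],
    "<modification>": [
        ["add_l(", "<l_num>", ",", "<n_num>", ")"],
        ["add_n(", "<l_num>", ",", "<n_num>", ")"],
        ["rmv_l(", "<l_num>", ")"],
        ["rmv_n(", "<l_num>", ",", "<n_num>", ")"],
        ["chng_f(", "<l_num>", ",", "<f_num>", ")"],
    ],
    "<l_num>": [[str(i)] for i in range(MAX_L)],
    "<n_num>": [[str(i)] for i in range(MAX_N)],
    "<f_num>": [[str(i)] for i in range(MAX_F)],
}

def ge_map(genome, start="<expr>", max_wraps=4):
    """Expand the leftmost nonterminal until only terminals remain."""
    seq = [start]
    i, wraps = 0, 0
    while any(s in GRAMMAR for s in seq):
        nt = next(j for j, s in enumerate(seq) if s in GRAMMAR)
        rules = GRAMMAR[seq[nt]]
        if i >= len(genome):            # wrap the genome, as standard GE does
            i, wraps = 0, wraps + 1
            if wraps > max_wraps:
                raise ValueError("mapping failed")
        choice = genome[i] % len(rules)  # codon mod rule-count
        i += 1
        seq = seq[:nt] + rules[choice] + seq[nt + 1:]
    return "".join(seq)

print(ge_map([3, 4, 1, 2]))  # -> chng_f(1,2)
```

The resulting string is a sequence of structure-editing calls that, once evaluated, transform the initial network of Fig. 2 into a candidate architecture.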