Library for Systematic Search for Expressions

SUSUMU KATAYAMA
Department of Computer Science and Systems Engineering
University of Miyazaki
1-1 W. Gakuenkibanadai Miyazaki, Miyazaki 889-2155
JAPAN
[email protected]    http://nautilus.cs.miyazaki-u.ac.jp/~skata

Abstract: In our previous work we showed that the systematic search approach to inductive functional programming automation yields a remarkably efficient algorithm, but the applicability of the implemented program was limited by the very poor interpreter it incorporated. In this paper, we present a library-based implementation of our systematic search algorithm, intended to be used with a well-known Haskell interpreter.

Key-Words: inductive program synthesis, functional programming, library design, metaprogramming, Template Haskell, artificial intelligence

1 Introduction

MagicHaskeller is our inductive programming automation system, based on an (exhaustively-and-efficiently-)generate-and-filter scheme for the lazy functional language Haskell. Until our previous work [1][2] it had been implemented as a stand-alone program including a cheap interpreter for filtering the generated programs. In this research we implemented its API library package, which is intended to be used with the Glasgow Haskell Compiler interactive environment (GHCi), version 6.4 and higher. The library implementation brings the following benefits (some of which will be discussed further in Section 4):

• user-defined data types can be used;
• the full language features and libraries that GHC provides are supported for processing generated expressions;
• the usability when one wants to choose from different primitive sets has improved; for example, users can predefine reusable chunks of primitives, such as a list-related one, a nat-related one, etc., and append them to obtain a suitable primitive set for the synthesis to be conducted;
• it has become easier to investigate the properties of the set of all the expressions that can be generated from a given type and a given primitive component set.

2 Background

2.1 MagicHaskeller

MagicHaskeller [1][2] is a project which emerged from our question: "Is genetic programming really more efficient than elaborated systematic exhaustive search for type-correct programs? If so, when and how is it? Has someone ever compared them?" In order to do the right things in the right order, in this project we first elaborate on an optimized implementation of systematic exhaustive search for type-correct programs, and then, if we become certain that we need heuristic approaches and that they really help improve efficiency, we will build heuristic approaches on top of our research on efficient systematic search algorithms under type constraints. Using techniques from software science such as a monad for breadth-first search [3], generalized tries [4], monad transformers [5], and memoization using lazy data types [2], we have already shown that systematic exhaustive search for type-correct programs is a powerful alternative to genetic programming. The implemented program and library can be downloaded and tried from http://nautilus.cs.miyazaki-u.ac.jp/~skata/MagicHaskeller.html.

2.2 Template Haskell

Template Haskell (TH) [6] brings template metaprogramming to Haskell. In TH, expressions, sets of declarations, and types to be spliced have types ExpQ, Q [Dec], and TypeQ respectively, where ExpQ = Q Exp and TypeQ = Q Type, and the monad Q is a wrapper to avoid name collisions. A code block can be spliced by using $(.). An expression, a set of declarations, and a type can be quoted by using [|.|], [d|.|], and [t|.|] respectively.

For each of the primitive components we need the information of both its semantics and its syntax, i.e., the value of the component, which is used for interpretation, and the name of the component, which is used to print the generated expression. The most important role of TH in our library is to generate both of them without requiring users to write the same name twice, which will be explained further in the definition of the function p in the next section.
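To make the quoting and splicing machinery concrete, the following small fragment (our own illustration, not taken from the paper; only standard Language.Haskell.TH names are used) shows an expression quote and a spliced use of a quote:

    {-# LANGUAGE TemplateHaskell #-}
    module THSketch where

    import Language.Haskell.TH

    -- Quoting: [| ... |] builds the syntactic representation of an expression
    -- (type ExpQ = Q Exp) without evaluating it.
    quotedZero :: ExpQ
    quotedZero = [| 0 :: Int |]

    -- Splicing: $(...) inserts the code denoted by a quote.  Because of GHC's
    -- stage restriction, only imported names may appear inside a top-level
    -- splice, so the quote is written inline here.
    three :: Int
    three = $( [| (1 :: Int) + 2 |] )

Loading this module in GHCi and running runQ quotedZero >>= print displays the underlying TH.Exp value; pairing such a syntactic representation with the corresponding value is exactly what the library's function p, described below, automates.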
3 Library specification

This section is devoted to the documentation of the implemented library. For each of the exported functions, its type and a short description are given.

3.1 Re-exported modules

This library implicitly re-exports the entities from module Language.Haskell.TH as TH and from module Data.Typeable from the Standard Hierarchical Library of Haskell. Please refer to their documentation for the types coming from them; in this documentation, types from TH are all qualified, and the only type class used from module Typeable is Typeable.Typeable.

The following types are assigned to our internal data representations: type Primitive, representing each primitive combinator, and type Memo, representing the memoization table.

3.2 Setting up the synthesis

Before synthesis, you have to define at least a memoization table (or you may define one once and reuse it for later syntheses). Other parameters are the memoization depth and the time-out interval, which have default values. You may elect either to set those values in the `global variables' using the `set*' functions, or to hand them explicitly as parameters.

Note that the functions related to global variables are supposed to be used within the GHCi environment, and are not tested on other environments. They are implemented in the same way as GHCi implements global variables. For example, the global memoization table is implemented as:

    {-# NOINLINE refmemodeb #-}
    refmemodeb :: IORef Memo
    refmemodeb = unsafePerformIO (newIORef defaultMD)
    defaultMD  = mkMemo []

Functions for creating memoization tables from the primitive components

    setPrimitives :: [Primitive] -> IO ()
    setPrimitives = setMemo . mkMemo
    mkMemo :: [Primitive] -> Memo
    setMemo :: Memo -> IO ()

You can set your primitives like, e.g.,

    setPrimitives $(p [| ((+) :: Int -> Int -> Int, 0 :: Int, 'A', [] :: [a]) |])

where the primitive set consists of (+) specialized to type Int, 0 specialized to type Int, 'A' which has the monomorphic type Char, and [] with the polymorphic type [a]. As primitive components one can include any variables and constructors within scope, unless data types holding functions, such as [a -> b], (Int -> Char, Bool), etc., are involved. However, because ad hoc polymorphism is currently not supported by this library, you may not write

    setPrimitives $(p [| (+) :: Num a => a -> a -> a |])

Also, you have to specify the type unless you are using a monomorphic component (just like when using the dynamic expression of Concurrent Clean); thus you may write setPrimitives $(p [| 'A' |]), while you have to write setPrimitives $(p [| [] :: [a] |]) instead of setPrimitives $(p [| [] |]).

    p :: TH.ExpQ -> TH.ExpQ
    type Primitive = (HValue, TH.Exp, String)

p is used to convert your primitive component set into its internal form. Usually its argument is the quasi-quote of a tuple of primitive components, but when it is not, a singleton list will be generated. The internal representation of each item of the primitive component set has type Primitive = (HValue, TH.Exp, String), where newtype HValue = HV (forall a. a) is a universally quantified value representing the semantics of the primitive component, TH.Exp represents its syntactic aspect, and String is a string representation of its type. For example, p [| ((+) :: Int -> Int -> Int, 0 :: Int, 'A', [] :: [a]) |] reduces to the following (where, to avoid clutter, we have stripped some of the qualifications by module names):

    [(HV (unsafeCoerce# ((+) :: Int -> Int -> Int)),
      TH.VarE (TH.mkName "+"),
      "(((->) Int) (((->) Int) Int))"),
     (HV (unsafeCoerce# (0 :: Int)),
      TH.LitE (TH.IntegerL 0),
      "Int"),
     (HV (unsafeCoerce# 'A'),
      TH.LitE (TH.CharL 'A'),
      show (typeOf 'A')),
     (HV (unsafeCoerce# ([] :: forall a_0. [a_0])),
      TH.ConE (TH.mkName "[]"),
      "([] a_0)")]

Use of HValue and unsafeCoerce# for such purposes is taken from the source code of GHCi. The type consistency will not be assured statically by GHC but dynamically by our library. When the type is not given, it will be obtained via the typeOf method, as is seen in the 'A' case of the above example, although that will cause an ambiguity error if the value is polymorphic.

Memoization depth

    setDepth :: Int    -- memoization depth
             -> IO ()

setDepth can be used to set the memoization depth. (Sub)expressions within this size are memoized, while greater expressions will be recomputed (to save heap space).

Functions related to time out

Because the library generates all the expressions, including those with non-linear recursions, you should note that there exist some expressions which take an extraordinarily long time to evaluate.[2] Imagine a function that takes an integer n and increments 0 for 2^(2^n) times. Another example, with a smaller program size, is blah in

    blah :: Int -> Int -> Int
    blah 0 i = i
    blah k i = foo (blah (k - 1)) i
    foo h 0 = 2
    foo h s = h (foo h (s - 1))

Giving small values in your examples is a good policy, but even then, time out is useful. For this reason, a time out is by default taken 1 second after each invocation of evaluation. This default behavior can be overridden by the following function.

    setTimeout :: Int    -- time in seconds
               -> IO ()
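Putting the setup functions above together, a GHCi session might look as follows. This is only a sketch: it assumes the library's API module has already been loaded into GHCi, reuses the primitive set from the example above, and picks arbitrary depth and time-out values for illustration:

    ghci> setPrimitives $(p [| ((+) :: Int -> Int -> Int, 0 :: Int, 'A', [] :: [a]) |])
    ghci> setDepth 7     -- memoize (sub)expressions up to size 7 (arbitrary choice)
    ghci> setTimeout 2   -- give up on an evaluation after 2 seconds

After these calls the global memoization table, depth, and time-out are in place, so the quick starters described next can be used directly.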
Quick starters

    findOne  :: Typeable a => (a -> Bool) -> TH.Exp
    printOne :: Typeable a => (a -> Bool) -> IO ()
    printAny :: Typeable a => (a -> Bool) -> IO ()

findOne pred finds an expression e that satisfies pred e ≡ True, and returns it as a TH.Exp. printOne prints the expression found first. printAny prints all the expressions satisfying the given predicate.

Incremental filtration

Sometimes you may want to filter further after synthesis, because the predicate you previously provided did not specify the function well enough. The following functions can be used to filter expressions incrementally.

    filterFirst :: Typeable a =>
                   (a -> Bool) -> IO [[(TH.Exp, a)]]
    filterThen  :: Typeable a =>
                   (a -> Bool) -> [[(TH.Exp, a)]] -> IO [[(TH.Exp, a)]]

filterFirst is like printAny, but by itself it does not print anything.
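As an illustration of how these functions might be combined (our own sketch; the predicates are invented examples and assume a primitive set over Int, such as the one set up in Section 3.2, has already been installed):

    ghci> printOne (\f -> f (2 :: Int) (3 :: Int) == (5 :: Int))
    ghci> candidates <- filterFirst (\f -> f (2 :: Int) (3 :: Int) == (5 :: Int))
    ghci> narrowed   <- filterThen  (\f -> f (0 :: Int) (7 :: Int) == (7 :: Int)) candidates

Here printOne immediately prints the first generated expression mapping 2 and 3 to 5, while the filterFirst / filterThen pair keeps the intermediate candidates so that further input-output constraints can be imposed one at a time.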