Neural-Symbolic Integration

PDF, 16 pages, 1020 KB

Dissertation submitted to the Fakultät Informatik, Technische Universität Dresden, in fulfilment of the requirements for the degree Doktor rerum naturalium (Dr. rer. nat.)

Submitted by Dipl.-Inf. Sebastian Bader, born 22 May 1977 in Rostock
Dresden, October 2009

Reviewers:
Prof. Dr. rer. nat. habil. Steffen Hölldobler (Technische Universität Dresden)
Prof. Dr. rer. nat. habil. Barbara Hammer (Technische Universität Clausthal)
Defence: 5 October 2009

To my family.

Contents

1 Introduction and Motivation .... 1
  1.1 Motivation for the Study of Neural-Symbolic Integration .... 2
  1.2 Related Work .... 4
  1.3 A Classification Scheme for Neural-Symbolic Systems .... 12
  1.4 Challenge Problems .... 18
  1.5 Structure of this Thesis .... 22
2 Preliminaries .... 25
  2.1 General Notions and Notations .... 26
  2.2 Metric Spaces, Contractive Functions and Iterated Function Systems .... 30
  2.3 Logic Programs .... 34
  2.4 Binary Decision Diagrams .... 41
  2.5 Connectionist Systems .... 45
3 Embedding Propositional Rules into Connectionist Systems .... 53
  3.1 Embedding Semantic Operators into Threshold Networks .... 54
  3.2 Embedding Semantic Operators into Sigmoidal Networks .... 58
  3.3 Iterating the Computation of the Embedded Operators .... 65
  3.4 Summary .... 67
4 An Application to Part of Speech Tagging .... 69
  4.1 Part of Speech Tagging .... 70
  4.2 System Architecture .... 70
  4.3 Experimental Evaluation .... 74
  4.4 Summary .... 76
5 Connectionist Learning and Propositional Background Knowledge .... 77
  5.1 Integrating Background Knowledge into the Training Process .... 78
  5.2 A Simple Classification Task as Case Study .... 79
  5.3 Evaluation .... 81
  5.4 Summary .... 81
6 Extracting Propositional Rules from Connectionist Systems .... 85
  6.1 The Rule Extraction Problem for Feed-Forward Networks .... 86
  6.2 CoOp: A New Decompositional Approach .... 88
  6.3 Decomposition of Feed-Forward Networks .... 89
  6.4 Computing Minimal Coalitions and Oppositions .... 91
  6.5 Composition of Intermediate Results .... 104
  6.6 Incorporation of Integrity Constraints .... 112
  6.7 Extraction of Propositional Logic Programs .... 117
  6.8 Evaluation .... 119
  6.9 Summary .... 121
7 Embedding First-Order Rules into Connectionist Systems .... 123
  7.1 Feasibility of the Core Method .... 124
  7.2 Embedding Interpretations into the Real Numbers .... 126
  7.3 Embedding the Consequence Operator into the Real Numbers .... 133
  7.4 Approximating the Embedded Operator using Connectionist Systems .... 134
  7.5 Iterating the Approximation .... 153
  7.6 Vector-Based Learning on Embedded Interpretations .... 157
  7.7 Summary .... 159
8 Conclusions .... 161
  8.1 Summary .... 162
  8.2 Challenge Problems Revisited .... 164
  8.3 Further Work .... 167
  8.4 Final Remarks .... 172

List of Figures

1.1 The Neural-Symbolic Cycle .... 4
1.2 The idea behind the Core Method .... 6
1.3 The three main dimensions of our classification scheme .... 12
1.4 The details of the interrelation dimension .... 13
1.5 The details of the language dimension .... 13
1.6 The details of the usage dimension .... 13
2.1 Operators defined to obtain more compact source code and some "syntactic sugar" yielding more readable source code .... 27
2.2 The definition of the "evaluate-to" operator .... 28
2.3 The definition of the "evaluate-to" operator, ctd. .... 29
2.4 Implementation to construct an empty BDD and to access the ingredients of a BDD .... 43
2.5 Implementation to construct a BDD from two BDDs .... 44
2.6 Base cases for the construction of a BDD from two BDDs using disjunction and conjunction .... 44
2.7 The construction of a BDD from a given variable and two BDDs using the if-then-else construct .... 44
2.8 Implementation to create nodes while constructing a reduced BDD .... 45
3.1 Activation functions tanh and sigm together with their threshold counterparts .... 54
3.2 Implementation to construct a behaviour-equivalent artificial neural network for a given propositional logic program .... 57
3.3 Implementation for the construction of a behaviour-equivalent network with units computing the hyperbolic tangent .... 64
4.1 General architecture of a neural-symbolic part-of-speech tagger .... 71
4.2 Network architecture for part-of-speech tagging .... 73
4.3 Comparison of MSE after training an initialised and a purely randomised network .... 75
4.4 The evolution of the error on training and validation set .... 75
5.1 The rule-insertion cycle for the integration of symbolic knowledge into the connectionist training process .... 78
5.2 A 3-layer fully connected feed-forward network to classify Tic-Tac-Toe boards .... 79
5.3 The embedding of a Tic-Tac-Toe rule .... 80
5.4 The development of the mean squared error over time for different embedding factors ω .... 82
6.1 Implementation to compute the set of perceptrons for a given feed-forward network .... 89
6.2 Implementation to construct a search tree to guide the extraction process .... 93
6.3 Implementation to construct the pruned search tree .... 95
6.4 Implementation to construct the reduced ordered binary decision diagram representing the set of minimal coalitions for a given perceptron P .... 101
6.5 Implementation to construct the pruned search trees for coalitions and oppositions .... 103
6.6 Implementation to construct the reduced ordered BDD for coalitions and the reduced ordered BDD for oppositions for a given perceptron P .... 105
6.7 Implementation to construct the positive form of a given perceptron .... 106
6.8 Implementation to construct the negative form of a given perceptron .... 107
6.9 The expansion of some node O into a BDD .... 113
6.10 The extraction of a reduced ordered BDD representing the coalitions of all output nodes wrt. the input nodes for a given network N .... 114
6.11 The full extraction into a single reduced ordered BDD including the incorporation of integrity constraints for all output nodes .... 117
6.12 Resulting BDD sizes of the extraction for different maxn-integrity constraints .... 121
7.1 Implementation of the embedding function ι for the 1-dimensional case .... 127
7.2 Implementation of the embedding function ι for multi-dimensional embeddings .... 127
7.3 Two sets of embedded interpretations .... 130
7.4 Implementation to compute a set of constant pieces which approximate a given T_P-operator up to level N .... 136
7.5 Implementation to compute a set of approximating step functions for a given T_P-operator up to level N .... 138
7.6 Three sigmoidal approximations of the step function Θ^1_{0,0} .... 139
7.7 Implementation to compute a set of approximating sigmoidal functions for a given T_P-operator up to level N .... 141
7.8 Implementation to construct an approximating sigmoidal network for a given T_P-operator up to level N .... 143
7.9 Implementation to compute the set of approximating raised cosine functions for a given T_P-operator up to level N .... 145
7.10 Implementation to construct an approximating raised cosine network for a given T_P-operator up to level N .... 146
7.11 Implementation to compute the set of approximating reference vectors for a given T_P-operator up to level N .... 149
7.12 Implementation to construct an approximating vector-based network for a given T_P-operator up to level N .... 152
7.13 Approximations based on the constant pieces are not necessarily contractive even if the underlying f_P is contractive .... 155
7.14 The adaptation of the input weights for a given input i .... 158
7.15 Adding a new unit .... 159
7.16 Removing an inutile unit .... 160

Preface & Acknowledgement

My interest in the area of neural-symbolic integration started when I was an undergraduate student at TU Dresden. In all classes on artificial neural networks that I took, some questions remained open; among them were the following two, which are key problems tackled in this thesis: "What did the network really learn?" and "How can I help the network using background knowledge?"
While being a student in the