Computational Neuroscience: A First Course (Springer Series in Bio-/Neuroinformatics)
Hanspeter A. Mallot
Computational Neuroscience: A First Course

Springer Series in Bio-/Neuroinformatics, Volume 2
Series Editor: N. Kasabov
For further volumes: http://www.springer.com/series/10088

Hanspeter A. Mallot
Department of Biology
University of Tübingen
Tübingen, Germany

ISSN 2193-9349    ISSN 2193-9357 (electronic)
ISBN 978-3-319-00860-8    ISBN 978-3-319-00861-5 (eBook)
DOI 10.1007/978-3-319-00861-5
Springer Cham Heidelberg New York Dordrecht London
Library of Congress Control Number: 2013939742

© Springer International Publishing Switzerland 2013

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. Exempted from this legal reservation are brief excerpts in connection with reviews or scholarly analysis or material supplied specifically for the purpose of being entered and executed on a computer system, for exclusive use by the purchaser of the work. Duplication of this publication or parts thereof is permitted only under the provisions of the Copyright Law of the Publisher’s location, in its current version, and permission for use must always be obtained from Springer. Permissions for use may be obtained through RightsLink at the Copyright Clearance Center. Violations are liable to prosecution under the respective Copyright Law.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

While the advice and information in this book are believed to be true and accurate at the date of publication, neither the authors nor the editors nor the publisher can accept any legal responsibility for any errors or omissions that may be made. The publisher makes no warranty, express or implied, with respect to the material contained herein.

Printed on acid-free paper

Springer is part of Springer Science+Business Media (www.springer.com)

Foreword

Computational approaches have been an ever-increasing trend in neuroscience research since the publication of Norbert Wiener’s seminal book in 1948. Already in its title, Cybernetics: Or Control and Communication in the Animal and the Machine, this book makes a central claim that is valid to this day: information processing in biology and technology is based on the same principles and procedures, at least if an appropriate level of description is selected. This “essential unity of problems ... whether in the machine or in living tissue” (p. 11 of the 1961 edition) implies two research directions: the application of mathematics and systems theory to the analysis of neural systems and, conversely, the application of neural procedures to the solution of technical problems. The first approach proved to be very successful.
The analysis of sensors and sensory processing, motor control, and the principles of learning and adaptation in modular systems are now well understood and parameterized, largely based on available mathematical tools. The opposite approach went through ups and downs. In the attempt to exploit the alleged superiority of neural systems, subproblems such as pattern recognition, localization, or learning in neural nets were singled out and their respective neural substrates were analyzed—often with disappointing results: whenever a well-conditioned objective function could be defined, optimal realizations by means of technical algorithms and procedures turned out to be more efficient. More recently, interest has shifted to machine learning and statistical learning theory, keeping only loose connections to neuroscience and biology. There have, however, been attempts to make neuroscience and computation re-converge, mostly by means of a unified theoretical treatment of the various subproblems mentioned above. The present book is part of this endeavor.

Good scientific books pave a way of thinking that makes concepts understandable, quantifies the facts, and puts them in adequate perspective. Professor Mallot has done just this. Starting with the intricate electrochemical properties of single cells, he proceeds to the connectivity of neurons. Their space-time-dependent response properties represent the essence of all neural systems. The learning-dependent variations of these nets are also treated as an essential aspect; associative memory is one example of this property. To understand the coupling of real, spatially two-dimensional structures, one has to include additional aspects such as population coding or the geometric and parametric mappings that facilitate computations in two-dimensional representations. The application-oriented claim of cybernetics is treated in a chapter on artificial neural nets. The author, who has also published important contributions to the theoretical description of neural systems, has the rare talent of putting facts and their relations into mathematical terms in a way that is easily understood. Likewise, the treatment of the problems with continuous as well as discrete methods, and the introduction of the necessary non-linearities into the description, can be followed without special training.

Recent progress in experimental methods has provided exciting new insights into the structure of the brain on all levels of complexity and elucidated the efficiency and the variability of these systems. Theoretical concepts have become more profound and more adequate for dealing with biological constraints. Self-organization and life-long adaptation of the brain have proved to determine both its anatomical structures and the way data and knowledge are represented. The underlying processes—and with them the decomposition of tasks—seem to be based on a relatively uniform set of learning procedures and anatomical architectures (such as cortices), both favoring the flexibility of the entire system. Thus, brains and the information they contain are in a permanent process of adaptation to their current goals. This process of self-organization seems to be the one driving force of both brain development and learning. The self-organization of neural systems is, of course, of substantial interest for technical systems with their ever-increasing complexity and diversity.
Self-organization of neuron-like structures based on their innate flexibility may lead the way to reduced design efforts. In this sense, the original quest of cybernetics is still alive, albeit with an extended focus on the whole, behaving organism: How do neural systems succeed in autonomously overcoming their ignorance and in exploiting the predictabilities of their environment for optimal behavior and, indeed, for evolutionary success? Evolution and ecology prove, of course, that solutions to these questions exist.

Not surprisingly, the computational methods used in the neural and behavioral sciences cover the major part of scientific computing at large. The book focuses on the most important approaches and lays the foundations for extended questions and specifications. It shows that it is not only the detailed mathematical formalisms that are crucial but also the broader concepts that allow appropriate mathematizations of neuroscientific phenomena and concepts. This book delivers a clear and understandable introduction to the structure of the problems of a scientific domain that still belongs among the most interesting fields science has to offer.

Bochum and Mainz, April 2013

Werner von Seelen
Institute for Neural Computation
Ruhr-University Bochum

Preface

Mathematical modeling in neuroscience is by now a common tool in many research projects and laboratories; its coverage in neuroscience curricula is therefore increasing. At the same time, computational neuroscience is still largely considered a domain of specialists who have acquired a mathematical background in physics or engineering and subsequently came to apply these ideas to questions of the physiology of neurons and neural networks and of the information processing carried out by these structures.

This book attempts an inverse approach. It grew out of a lecture series delivered over a period of more than ten years to graduate students in neuroscience, most of whom had backgrounds in biology, medicine, psychology, or cognitive science. The required mathematical prerequisites for the course were therefore limited to the level of a bachelor’s degree in science, i.e., to basic calculus and linear algebra. All mathematical issues beyond this level—such as differential equations, convolution, complex numbers, high-dimensional vector spaces, or the statistical information measure—are thoroughly motivated and explained as they arise. I tried to keep these explanations mathematically clean but in some cases had to omit subtleties that a full mathematical treatment ought to provide. The text reflects extensive classroom discussions and presents routes of explanation accessible also to mathematical non-experts. This is particularly true for the chapter on Fourier transforms, which is the least neuroscientific one