
Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems

Peter Dayan and L. F. Abbott. MIT Press, Cambridge. $50.00. ISBN: 0-262-04199-5. 460 pages.

Every field of science relies on having its trusted sources of knowledge, the books that unite investigators with a common language and provide them with the basic toolbox for approaching problems. Physics, for instance, has its "Landau and Lifschitz"; electrical engineers routinely turn to "Horowitz and Hill"; and many a neuroscientist was brought up on "Kandel and Schwartz." Now, at last, the field of computational neuroscience has one of its own with the recent publication of Dayan and Abbott's Theoretical Neuroscience: Computational and Mathematical Modeling of Neural Systems.

The emergence of this book represents more than the usual feat of textbook publication. It is a significant development for the field as a whole, because up to now there has been no single book that unites the basic methods and models of computational neuroscience in one place. Those who teach courses on computational models have mostly hobbled along by copying papers and chapters from assorted journals and books, or by writing their own elaborate lecture notes. While there exist several excellent texts on neural networks (such as Introduction to the Theory of Neural Computation by Hertz, Krogh, & Palmer; Neural Networks for Pattern Recognition by Bishop; and Neural Networks: A Comprehensive Foundation by Haykin), they are mostly written from the perspective of engineering, math, or physics, and so they do not make serious connections to neuroscience. Others that do address brain function are tilted either towards cognitive science, emphasizing higher-level aspects of brain function (such as Parallel Distributed Processing by McClelland & Rumelhart; An Introduction to Neural Networks by Anderson; and An Introduction to Natural Computation by Ballard), or else towards lower-level cellular models, as in The Biophysics of Computation: Information Processing in Single Neurons (Koch) and Spikes: Exploring the Neural Code (Rieke et al.).

What sets Dayan and Abbott's book apart is that it lands right smack in the center of computational neuroscience, spanning the entire range from models at the cellular level, such as ion channel kinetics, to those at the cognitive level, such as reinforcement learning. It also does a beautiful job explaining state-of-the-art techniques in neural coding, as well as recent advances in modeling. And it does all of these with a level of depth and thoroughness that is impressive.

There tend to be two camps in the field of computational neuroscience, and they are probably best exemplified by how they use the term "computation." In one, mathematical models are constructed primarily to describe or characterize existing data, and computation is used mainly as a means to simulate or analyze the data, in rather the same way as computational chemistry or computational fluid dynamics. In the other camp, computation is applied in a more theoretical manner, as a metaphor for what the brain is actually doing.

The first six chapters of the book fall more in the first category, using mathematical and computational techniques to characterize neural function. Some of these methods attempt to describe neural function in computational terms, but they are still primarily descriptive in nature. These include methods for characterizing spike statistics, reverse correlation techniques for measuring receptive field properties, methods for decoding information contained in neural spike trains, and information theoretic techniques for measuring the information capacity of neurons and coding efficiency. There are also two chapters covering detailed electrical models of neurons, including channel kinetics, synapses, and cable properties. All of these techniques are covered with the kind of nuts-and-bolts detail that will allow readers to begin implementing and experimenting with them in computer simulations.
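To give a flavor of the nuts-and-bolts level at which these chapters operate, here is a minimal sketch of one such technique: the spike-triggered average at the heart of reverse correlation. The code is illustrative only; the noise stimulus, the toy spiking rule, and the window length are made-up assumptions, not material from the book.

```python
import numpy as np

def spike_triggered_average(stimulus, spikes, window):
    """Estimate a receptive field by reverse correlation:
    average the stimulus segments preceding each spike.

    stimulus : 1-D array, stimulus value in each time bin
    spikes   : 1-D binary array, 1 where a spike occurred
    window   : number of time bins preceding the spike to average
    """
    spike_times = np.flatnonzero(spikes)
    spike_times = spike_times[spike_times >= window]  # need a full window
    segments = np.stack([stimulus[t - window:t] for t in spike_times])
    return segments.mean(axis=0)

# Toy example: a noise stimulus driving a cell that tends to fire
# after an upward swing in the stimulus (parameters are arbitrary).
rng = np.random.default_rng(0)
stim = rng.standard_normal(10_000)
drive = np.convolve(stim, np.ones(20) / 20, mode="same")
spikes = (drive > 0.3).astype(int)

sta = spike_triggered_average(stim, spikes, window=50)
print(sta.shape)  # (50,) -- the estimated linear filter
```

Averaged over enough spikes, the recovered filter approximates the smoothing kernel that generated the cell's drive, which is exactly the logic by which reverse correlation estimates receptive field properties.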
The last four chapters of the book fall into the second camp of computational neuroscience, presenting more abstract models that extrapolate beyond the available data. Experimentalists often recoil at the idea of entertaining such models, but they are every bit as essential as the descriptive models, because they provide a theoretical framework for interpreting data and motivating future experiments. These chapters discuss recurrent network models with attractor dynamics, models of learning and adaptation, and theories of representation based on probabilistic models. Many of these topics are also covered in the more traditional books on neural networks, but the advantage of the presentation here is that it makes more direct contact with neuroscientific data. Also, by using terminology and mathematical notation that is consistent throughout the book, the authors help to bring the two camps of computational neuroscience under one roof.
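As a hint of what such theory-driven models look like, the sketch below implements the simplest kind of attractor dynamics: a Hopfield-style recurrent network that falls from a corrupted cue into a stored pattern. This is a generic textbook construction under assumed toy parameters, not code taken from Dayan and Abbott.

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_patterns = 100, 3

# Store random +/-1 patterns with the Hebbian outer-product rule.
patterns = rng.choice([-1, 1], size=(n_patterns, n))
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0)  # no self-connections

# Cue the network with a corrupted version of pattern 0.
state = patterns[0].copy()
flip = rng.choice(n, size=20, replace=False)
state[flip] *= -1

# Iterate the dynamics; the state relaxes toward the nearest attractor.
for _ in range(10):
    state = np.sign(W @ state)
    state[state == 0] = 1

print(np.mean(state == patterns[0]))  # overlap with the stored memory, ~1.0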

Of course, many will ask, "Will I need to know a lot of math to understand this book?" Indeed, some pages are rife with equations and formulas, but this is unavoidable, since mathematics provides the most compact and precise language for discussing the workings of a complex system such as the brain. At the same time, though, the authors have gone out of their way to make the book accessible to a wide audience, and there is very little here that requires more than first-year calculus or simple linear algebra to understand. Admirably, the text does not use the equations as a crutch, but rather strives to explain in words what is meant or implied by them, without sacrificing mathematical rigor. The appendices also provide excellent tutorials and further background on topics such as linear algebra, statistics, and optimization.

In short, this is a book where students and grown scientists alike can turn to learn the main methods and ideas of computational neuroscience. It is a suitable text for graduate students in neuroscience and psychology, and it could probably be used for teaching advanced undergraduates as well. I have just finished using it for a graduate-level computational neuroscience course, and it went over quite well with students coming from diverse backgrounds. I predict that it will become the mainstay for courses on computational neuroscience for many years to come.

Reviewed by Bruno A. Olshausen
University of California–Davis
