Spiking Deep Neural Networks: Engineered and Biological Approaches to Object Recognition

by Eric Hunsberger

A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for the degree of Doctor of Philosophy in Systems Design Engineering

Waterloo, Ontario, Canada, 2017
© Eric Hunsberger 2017

Examining Committee Membership

The following served on the Examining Committee for this thesis. The decision of the Examining Committee is by majority vote.

External Examiner: Bruno Olshausen, Professor, Helen Wills Neuroscience Institute and School of Optometry, University of California, Berkeley
Supervisor: Chris Eliasmith, Professor, Department of Philosophy and Department of Systems Design Engineering, University of Waterloo
Supervisor: Jeff Orchard, Associate Professor, Cheriton School of Computer Science, University of Waterloo
Internal Member: Bryan Tripp, Associate Professor, Department of Systems Design Engineering, University of Waterloo
Internal Member: Alexander Wong, Associate Professor, Department of Systems Design Engineering, University of Waterloo
Internal-External Member: Steven Waslander, Associate Professor, Department of Mechanical and Mechatronics Engineering, University of Waterloo

Author's Declaration

I hereby declare that I am the sole author of this thesis. This is a true copy of the thesis, including any required final revisions, as accepted by my examiners. I understand that my thesis will be made electronically available to the public.

Abstract

Modern machine learning models are beginning to rival human performance on some realistic object recognition tasks, but we still lack a full understanding of how the human brain solves this same problem. This thesis combines knowledge from machine learning and computational neuroscience to create models of human object recognition that are increasingly realistic both in their treatment of low-level neural mechanisms and in their reproduction of high-level human behaviour.

First, I present extensions to the Neural Engineering Framework to make its preferred type of model, the "fixed-encoding" network, more accurate for object recognition tasks. These extensions include better distributions (such as Gabor filters) for the encoding weights, and better loss functions (namely weighted squared loss, softmax loss, and hinge loss) to solve for decoding weights.

Second, I introduce increased biological realism into deep convolutional neural networks trained with backpropagation, by training them to run using spiking leaky integrate-and-fire (LIF) neurons. Convolutional neural networks have been successful in machine learning, and I am able to convert them to spiking networks while retaining similar levels of performance. I present a novel method to smooth the LIF rate response function in order to avoid the common problems associated with differentiating spiking neurons in general and LIF neurons in particular. I also derive a number of novel characterizations of spiking variability, and use these to train spiking networks to be more robust to this variability.

Finally, to address the problems with implementing backpropagation in a biological system, I train spiking deep neural networks using the more biological Feedback Alignment algorithm. I examine this algorithm in depth, including many variations on the core algorithm, methods to train using non-differentiable spiking neurons, and some of the limitations of the algorithm. Using these findings, I construct a spiking model that learns online in a biologically realistic manner.

The models developed in this thesis help to explain both how spiking neurons in the brain work together to allow us to recognize complex objects, and how the brain may learn this behaviour. Their spiking nature allows them to be implemented on highly efficient neuromorphic hardware, opening the door to object recognition on energy-limited devices such as cell phones and mobile robots.
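As a brief illustration of the LIF-rate smoothing mentioned in the abstract, consider the following worked equation. This is a minimal sketch only, assuming a softplus-style substitution into the standard steady-state LIF rate curve; the exact formulation and its parameters are developed in the body of the thesis. For a constant input current j, with the firing threshold normalized to 1, the LIF rate is

r(j) = \left[ \tau_{\text{ref}} + \tau_{RC} \ln\!\left( 1 + \frac{1}{\rho(j - 1)} \right) \right]^{-1}, \qquad \rho(x) = \max(x, 0),

which is zero below threshold and has an unbounded derivative as j approaches threshold from above, making it poorly suited to gradient-based training. Substituting a softplus for the hard rectification,

\rho_{\gamma}(x) = \gamma \ln\!\left( 1 + e^{x/\gamma} \right),

yields a rate curve that is smooth and differentiable everywhere, and recovers the original LIF response as \gamma \to 0.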
Acknowledgements

First and foremost, I would like to thank my supervisors, Chris Eliasmith and Jeff Orchard. Their guidance, ideas, and keen questions have pushed me to look more deeply at some of the more pressing and interesting problems in the field. Chris has gone beyond his duties by being not only a great mentor, but also a friend. He has filled the CNRG lab at UW with amazing people, and I would like to thank everyone for her or his unique contribution to making the lab both a fertile ground for new and exciting ideas, and a fun and awesome place to be. I owe special gratitude to those who first welcomed me to the lab, and who greatly influenced my early growth as a researcher and software developer: Trevor Bekolay, Terry Stewart, Dan Rasmussen, Travis DeWolf, Xuan Choo, and James Bergstra. More recently, thanks to Peter Duggins, Mariah Martin Shein, and Peter Blouw for the numerous musical interludes that helped me survive the last months of writing.

Many of these past and present CNRG members also form the core development team for Nengo. Without this outstanding piece of software, I could not have done all the things I did in this thesis, so a special thanks to the whole Nengo team.

I was also fortunate to have such a great committee to help me through the whole process. In particular, I would like to acknowledge Bryan Tripp, who has for years provided insightful comments, feedback, and ideas across many areas of my research.

Finally, I could not have come this far without the continued support of my family and friends. Thank you for being genuinely interested in what I do, even when I am sure it sounds esoteric or downright dull. A special thanks to Sean Anderson, Matt McGill, Matt Wilson, and Nick Fischer, for over 75 combined years of being great friends to me. A very special thanks to my family: Brian, Jocelyn, Alex, and Marg. And the specialest thanks to Ruth, who has not borne the burden with me as long as some of the others, but who has helped to carry so much more of it.

Dedication

Halfway through this thesis project, which one may know is often the point when motivation begins to wane, I received the following fortune inside a cookie:

"If the brain were so simple we could understand it, we would be so simple we couldn't."

Since then, I have kept this fortune on my desk to motivate me. I dedicate this thesis to all of us who work to prove this fortune cookie wrong.

Table of Contents

Examining Committee Membership
Author's Declaration
Abstract
Acknowledgements
Dedication
Table of Contents
List of Tables
List of Figures

1 Introduction
  1.1 Outline

2 Computational Neuroscience
  2.1 Biology
    2.1.1 Neuron physiology
    2.1.2 Visual system anatomy
  2.2 Neuron models
    2.2.1 The integrate-and-fire neuron model
    2.2.2 The leaky integrate-and-fire (LIF) model
  2.3 Synapse models
  2.4 Rate codes, population codes, and timing codes
  2.5 The Neural Engineering Framework (NEF)
  2.6 Neuromorphic hardware

3 Machine Learning
  3.1 Machine learning basics
    3.1.1 Goal
    3.1.2 Learning paradigms
    3.1.3 Basic components
    3.1.4 Overfitting and underfitting
    3.1.5 Dataset usage
    3.1.6 Objective functions
    3.1.7 Datasets
  3.2 Backpropagation
  3.3 Stochastic gradient descent (SGD)
    3.3.1 Extensions
    3.3.2 Gradients and initialization
  3.4 Convolutional neural networks (CNNs)
    3.4.1 Convolution operation
    3.4.2 Convolutional layer
    3.4.3 Nonlinearities
    3.4.4 Pooling layer
    3.4.5 Local response normalization layer
    3.4.6 Dropout layer

4 Fixed-Encoding Networks
  4.1 Background
    4.1.1 The Neural Engineering Framework
    4.1.2 Extreme Learning Machines
  4.2 Encoding methods
    4.2.1 Independent-element random encoders
    4.2.2 Receptive fields
    4.2.3 Gabor filters
    4.2.4 Computed Input Weights (CIW)
    4.2.5 Constrained Difference weights (CD)
  4.3 Decoding methods
    4.3.1 Squared loss
    4.3.2 Weighted squared loss
    4.3.3 Softmax loss
    4.3.4 Hinge loss
    4.3.5 Weight norm regularization
  4.4 Spiking methods
  4.5 Results
    4.5.1 Encoding
    4.5.2 Decoding
    4.5.3 Spiking
    4.5.4 Computation time
  4.6 Discussion

5 Spiking Deep Networks
  5.1 Background
