Spiking Neuromorphic Architecture for Associative Learning
A dissertation submitted to the Graduate School of the University of Cincinnati in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Department of Electrical Engineering and Computer Science of the College of Engineering and Applied Science.

Author: Alexander Jones
M.S., University of Cincinnati, 2016
B.S., University of Cincinnati, 2015

September 29, 2020

Committee: Rashmi Jha (Chair), Marc Cahay, Manish Kumar, Cory Merkel, and Ali Minai

Abstract

This dissertation demonstrates the implementation of a specialized neural network for associative memory within novel neuromorphic hardware. The architecture is implemented using CMOS-based circuitry for information processing and memristive devices for the network's memory, following a non-von Neumann style of computer architecture called in-memory computing, in which information storage and processing reside in a single location. The CMOS circuitry within the architecture contains both digital and analog components for processing. The memristive devices used are a newer class of device that possesses a gate terminal used to potentiate or depress the device's state; these gated memristive devices allow for simpler hardware architectures for tasks such as reading from and writing to a device simultaneously. The architecture demonstrated here also exploits a property often seen in memristive devices: a semi-volatile state. Used in tandem with a spiking neuromorphic architecture, this semi-volatility enables distinct learning behaviors depending on the degree of volatility in the device. Once memories are programmed into the network, it can later recall them by observing partial information and performing pattern completion. The final portion of this dissertation studies how the network behaves when exposed to a larger dataset over time and analyzes how it performs recall on that data. An array of metrics is used to evaluate the network's performance during these tests, and potential expansions of network functionality are explored to enhance its capabilities in certain applications.
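To make the recall mechanism described above concrete, the minimal Python sketch below programs one memory into a semi-volatile (exponentially decaying) weight matrix and then recovers it from a partial cue by thresholded pattern completion. This is an illustrative sketch only: the exponential decay model, the function names, and all parameter values are assumptions chosen for demonstration, not the device models or hardware developed in this dissertation.

```python
import numpy as np

# Hypothetical parameters, not taken from the dissertation.
DECAY_TAU = 200.0   # time constant (arbitrary units) for semi-volatile decay
W_MAX = 1.0         # saturating conductance of a fully potentiated synapse

def decay(weights, dt):
    """Semi-volatile state: conductance relaxes toward zero between
    accesses, modeled here as a simple exponential decay."""
    return weights * np.exp(-dt / DECAY_TAU)

def associate(weights, pattern):
    """Store a binary feature pattern by potentiating every synapse
    between co-active features (Hebbian-style outer product)."""
    outer = np.outer(pattern, pattern)
    np.fill_diagonal(outer, 0)           # no self-connections
    return np.clip(weights + outer, 0.0, W_MAX)

def recall(weights, cue, threshold=0.5):
    """Pattern completion: drive the network with a partial cue and
    threshold the summed synaptic input to recover the full memory."""
    drive = weights @ cue
    completed = (drive >= threshold * drive.max()).astype(float)
    return np.maximum(completed, cue)    # cued features stay active

# Program one 6-feature memory, let the state partially decay, then
# recall it from a cue containing only two of its four features.
w = np.zeros((6, 6))
memory = np.array([1, 1, 0, 1, 0, 1], dtype=float)
w = associate(w, memory)
w = decay(w, dt=50.0)                    # partial volatility before recall
cue = np.array([1, 0, 0, 1, 0, 0], dtype=float)
print(recall(w, cue))                    # prints the full stored memory
```

Because the decay here scales all weights uniformly, the recall threshold scales with the surviving drive and the cue still completes to the full stored pattern; this loosely mirrors how, in the hardware, the outcome of learning and recall depends on the degree of volatility remaining in the devices.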
Thank you to all my friends, family, and research colleagues for all the love, support, and advice through the development of this work.

Funding Acknowledgement

The work in this dissertation was supported by the National Science Foundation under the following award numbers:
ECCS 1156294
SHF-1718428
ECCS 1926465

Table of Contents

I. Introduction
II. Background
   a. A Brief History/Overview of Neural Networks
   b. History of Neuromorphic Computing
   c. Approaches to Neuromorphic Computing
   d. Synaptic Devices
   e. Neuron Circuits and Devices
   f. Application Space of Neuromorphic Computing
III. The Neuron Circuit
   a. The Original Octopus Retina Circuit
   b. Expanding the Octopus Retina Circuit
   c. Circuit Profiles
IV. Synaptic Devices
   a. Introduction to Gated-Synaptic Devices
   b. Superiority of Gated-Synaptic Devices
   c. Initial Device Model
   d. Results of Initial Version
   e. Generic Synaptic Model
   f. Generic Model Results
   g. Using the Generic Model for a NbOx Gated-Synaptic Device
V. Architecture
   a. Associative Memory
   b. Recurrent/Hopfield Networks
   c. The Segmented Attractor Network
   d. Implementing Network in Hardware Using the Initial Device Model
   e. Using SAN for Navigation
   f. Using the Generic Model in a Segmented Attractor Network
   g. Results of the Generic Model Segmented Attractor Network
VI. Analysis of the Segmented Attractor Network
   a. Importance of Synaptic Values
   b. Recalling Different Amounts of Input
   c. Varying the Segmented Attractor Network's Dimensions
   d. Implementing a Dataset onto the Segmented Attractor Network
   e. The EHoS Dataset
   f. Demonstration of the EHoS Dataset on the Segmented Attractor Network
   g. Expanding the Segmented Attractor Network's Behavior
   h. Predicting Future Memories
   i. Erasing Common Information Observed
   j. Forgetting Memories Over Time
   k. Using Behavior Ensembles
VII. Conclusion

List of Figures

II. Background
   Figure 1: Diagrams of two-terminal, three-terminal, and four-terminal synaptic devices
III. The Neuron Circuit
   Figure 2: Schematic of the SR Octopus Retina neuron circuit
   Figure 3: Frequency/duty cycle profiles of the SR Octopus Retina neuron circuit (ASU node)
   Figure 4: Power consumption profile of SR Octopus Retina neuron circuit (ASU node)
   Figure 5: Spiking energy efficiency comparison for SR Octopus Retina neuron circuit
IV. Synaptic Devices
   Figure 6: Read/write comparison diagram for two- and three-terminal synaptic devices
   Figure 7: Potentiation/decay demonstration of double-gated synaptic device model
   Figure 8: Port diagram for the behavioral gated-synaptic device model
   Figure 9: Variation of gc during potentiation for gated-synaptic device model
   Figure 10: Variation of brev during a vin sweep for gated-synaptic device model
   Figure 11: Variable impact diagram for gated-synaptic device model
   Figure 12: Experiment replications using the gated-synaptic device model
   Figure 13: Pulse count fit of gated-synaptic device model to niobium oxide device
   Figure 14: Voltage sweep fit of gated-synaptic device model to niobium oxide device
   Figure 15: State decay fit of gated-synaptic device model to niobium oxide device
V. Architecture
   Figure 16: Diagram of an example segmented attractor network
   Figure 17: Two-memory demonstration of an attractor network using double-gated model
   Figure 18: Layout diagrams for navigational neuromorphic architecture
   Figure 19: Results of navigation test using the double-gated synaptic model
   Figure 20: Hardware diagram of a segmented attractor network using gated-synaptic devices
   Figure 21: Frequency/duty cycle profiles of SR Octopus Retina neuron circuit (TSMC node)
   Figure 22: Frequency response of segmented attractor network during association
   Figure 23: Synaptic heatmap of segmented attractor network during association
   Figure 24: Frequency response of segmented attractor network during recall
VI. Analysis of the Segmented Attractor Network
   Figure 25: Flowchart for generic segmented attractor network simulations
   Figure 26: Hit rate of segmented attractor network as its size is increased
   Figure 27: Hit rate of segmented attractor network as its size is increased (lower weight)
   Figure 28: Hit rate of segmented attractor network as synaptic weight is varied
   Figure 29: Hit rate of segmented attractor network while varying amount of hidden input
   Figure 30: Hit rate of segmented attractor network while varying set count
   Figure 31: Hit rate of segmented attractor network while varying features per set count
   Figure 32: Average Hamming distance of all memories in the EHoS dataset
   Figure 33: Scaled diagram of SAN for EHoS dataset
   Figure 34: Flowchart for the SAN simulations on the EHoS dataset
   Figure 35: Hit rate during a basic SAN EHoS dataset run
   Figure 36: Unique memory ratio during a basic SAN EHoS dataset run
   Figure 37: Memory occurrences during a basic SAN EHoS dataset run
   Figure 38: Hit rate of SAN EHoS dataset run with varied input
   Figure 39: Unique memory ratio of SAN EHoS dataset run with varied input
   Figure 40: Memory occurrences of SAN EHoS dataset run with varied input
   Figure 41: Expanded learning behavior diagrams for the segmented attractor network
   Figure 42: Additional circuitry required for predictive memory behavior
   Figure 43: Hit rate of SAN EHoS dataset run with predictive memory behavior
   Figure 44: Unique memory ratio of SAN EHoS dataset run with predictive memory behavior
   Figure 45: Memory occurrences of SAN EHoS dataset run with predictive behavior
   Figure 46: Hit rate of SAN EHoS dataset run with low predictive behavior
   Figure 47: Unique memory ratio of SAN EHoS dataset run with low predictive behavior
   Figure 48: Memory occurrences of SAN EHoS dataset run with low predictive behavior
   Figure 49: Hit rate of SAN EHoS dataset run with erase behavior
   Figure 50: Unique memory ratio of SAN EHoS dataset run with erase behavior
   Figure 51: Memory occurrences of SAN EHoS dataset run with erase behavior
   Figure 52: Hit rate of SAN EHoS dataset run with high erase behavior
   Figure 53: Unique memory ratio of SAN EHoS dataset run with high erase behavior
   Figure 54: Memory occurrences of SAN EHoS dataset run with high erase behavior
   Figure 55: Hit rate of SAN EHoS dataset run with forgetting behavior
   Figure 56: Unique memory ratio of SAN EHoS dataset run with forgetting behavior