Fundamentals of Computational Neuroscience 2E

Total Pages: 16

File Type: PDF, Size: 1020 KB

Fundamentals of Computational Neuroscience 2e, December 28, 2009

Chapter 4: Associators and synaptic plasticity

Types of plasticity
- Structural plasticity is the mechanism describing the generation of new connections, thereby redefining the topology of the network.
- Functional plasticity is the mechanism of changing the strength values of existing connections.

Hebbian plasticity
"When an axon of cell A is near enough to excite a cell B and repeatedly or persistently takes part in firing it, some growth process or metabolic change takes place in one or both cells such that A's efficiency, as one of the cells firing B, is increased."
Donald O. Hebb, The Organization of Behavior, 1949
See also Sigmund Freud, law of association by simultaneity, 1888

Association
Neuron model: in each time step the model neuron fires if sum_i w_i r_i^in > 1.5.
Learning rule: increase the strength of a synapse by Δw = 0.1 whenever a presynaptic firing is paired with a postsynaptic firing.
[Figure: input rates r^in and weights w of an associator with three odour and three visual input lines. A. Before learning, the odour cue alone (w = 1 on the odour lines) fires the neuron (r^out = 1). B. Before learning, the visual cue alone (w = 0 on the visual lines) does not (r^out = 0). C. After 1 learning step with both cues, the odour weights are 1.1 and the visual weights 0.1 (r^out = 1). D. After 6 learning steps, the visual weights have grown to 0.6, and the visual cue alone now fires the neuron (r^out = 1).]

Features of associators and Hebbian learning
- Pattern completion and generalization
- Prototypes and extraction of central tendencies
- Graceful degradation and fault tolerance

Classical LTP and LTD
[Figure: A. Long-term potentiation. Change in EFP amplitude [%] versus time [min] around the induction stimulus.]
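The associator described above can be simulated in a few lines. This is a minimal sketch (not the book's code), assuming three odour and three visual input lines; the threshold 1.5 and the increment Δw = 0.1 are taken from the slide:

```python
import numpy as np

THETA = 1.5   # firing threshold from the slide
DW = 0.1      # weight increment for a paired pre/post firing

def fires(w, r):
    """Binary threshold neuron: output 1 if the weighted input sum exceeds THETA."""
    return 1.0 if w @ r > THETA else 0.0

# Three odour lines (already trained, w = 1) and three visual lines (w = 0).
w = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
odour = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])
visual = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])

print(fires(w, odour))    # 1.0: the odour cue alone fires the neuron
print(fires(w, visual))   # 0.0: the visual cue alone does not (yet)

# Hebbian learning: present both cues together and increase every synapse
# whose presynaptic input is active while the postsynaptic neuron fires.
for _ in range(6):
    r = odour + visual
    if fires(w, r):
        w += DW * r

print(w)                  # odour weights ~1.6, visual weights ~0.6
print(fires(w, visual))   # 1.0: the visual cue alone now fires the neuron
```

After six paired presentations the weights match panel D of the figure: the visual inputs alone sum to 3 x 0.6 = 1.8 > 1.5, so the learned cue fires the neuron.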
[Figure: B. Long-term depression. Change in EFP amplitude [%] versus time [min] around the induction stimulus.]

Synaptic neurotransmitter release probability
[Figure: EPSP traces [mV] and fluorescence changes ΔF [%] over time [ms] following high-frequency stimulation; the release probability increases over the experiment (p = 0.4, p = 0.76, p = 0.96 at successive times [min]).]

Spike timing dependent plasticity
[Figure: A. Measured change in EPSC amplitude [%] as a function of the spike-timing difference t^post - t^pre [ms]: LTP for positive, LTD for negative differences. B-E. Schematic STDP windows Δw versus t^post - t^pre with differently shaped LTP and LTD lobes.]

The calcium hypothesis and modelling chemical pathways
[Figure: signalling pathways of the calcium hypothesis, including glutamate receptors (AMPA, NMDA with its Mg block, mGluR), voltage-dependent calcium channels (VDCC/VGCC), calcium stores (ER, IP3), G-protein cascades (Gq/PLC, Golf/AC/cAMP via D1R), and the kinase/phosphatase network (CaMKII, CaM, PKA, PDE, PP1, PP2A, PP2B, PP2C, CDK5, CK1, I-1, DARPP-32 with sites Thr-34, Thr-75, Ser-137). High [Ca2+] drives CaMKII and LTP through a positive feedback loop; low [Ca2+] favours the phosphatases and LTD.]

Mathematical formulation of Hebbian plasticity

w_ij(t + Δt) = w_ij(t) + Δw_ij(t_i^f, t_j^f, Δt, w_ij)

Δw_ij = ε_±(w) e^(∓(t^post − t^pre)/τ_±) Θ(±[t^post − t^pre])

Additive rule with hard (absorbing) boundaries:
ε_± = a_± for w_ij^min ≤ w_ij ≤ w_ij^max, 0 otherwise

Multiplicative rule (soft boundaries):
ε_+ = a_+ (w^max − w_ij)
ε_- = a_- (w_ij − w^min)    (1)

Hebbian learning in population and rate models

General: Δw_ij = ε(t, w) [f_post(r_i) f_pre(r_j) − f(r_i, r_j; w)]
Mnemonic equation (Caianiello): Δw_ij = ε(w) [r_i r_j − f(w)]
Basic Hebb: Δw_ij = ε r_i r_j
Covariance rule: Δw_ij = ε (r_i − ⟨r_i⟩)(r_j − ⟨r_j⟩)
BCM theory: Δw_ij = ε (f_BCM(r_i; θ_M) r_j − f(w))
ABS rule: Δw_ij = ε f_ABS(r_i; θ^−, θ^+) sign(r_j − θ^pre)

[Figure: the function f_BCM used in the BCM rule as a function of the rate r, with modification threshold θ_M; and the function f_ABS used in the basic ABS rule as a function of calcium concentration [Ca2+], with threshold θ^− for LTD and threshold θ^+ for LTP.]

Synaptic scaling and weight distributions
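The exponential STDP window with soft (multiplicative) boundaries above can be sketched as follows; the parameter values (τ_± = 20 ms, the amplitudes a_±, and the bounds) are illustrative assumptions, not values from the slides, and the depression sign is written out explicitly:

```python
import numpy as np

# Illustrative parameters (assumptions, not values from the slides).
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # window time constants tau_+/- [ms]
A_PLUS, A_MINUS = 0.010, 0.012     # learning-rate amplitudes a_+/-
W_MIN, W_MAX = 0.0, 1.0            # weight bounds w_min, w_max

def stdp_dw(w, dt):
    """Weight change for one pre/post spike pair with dt = t_post - t_pre [ms].

    Multiplicative (soft) bounds: the LTP amplitude shrinks as w approaches
    W_MAX and the LTD amplitude shrinks as w approaches W_MIN, so the weight
    can never leave [W_MIN, W_MAX].
    """
    if dt > 0:   # pre fires before post: potentiation (LTP)
        eps = A_PLUS * (W_MAX - w)
        return eps * np.exp(-dt / TAU_PLUS)
    else:        # post fires before (or with) pre: depression (LTD)
        eps = A_MINUS * (w - W_MIN)
        return -eps * np.exp(dt / TAU_MINUS)

# A weight halfway between the bounds is potentiated by a pre-before-post
# pair and depressed by a post-before-pre pair:
print(stdp_dw(0.5, 10.0))    # > 0 (LTP)
print(stdp_dw(0.5, -10.0))   # < 0 (LTD)
```

Summing this update over all spike pairs of a train gives the pair-based STDP model; the additive rule with hard boundaries would instead use constant amplitudes a_± and clip w to [w_min, w_max] after each update.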
[Figure: A. Learning phase: firing rate [Hz] and coefficient of variation CV over time [min]. B. Weight distribution after learning: histogram of the weight values. After Song, Miller and Abbott 2000.]

Hebbian rate rules on random patterns

w_ij = (1/N_p) Σ_μ (r_i^μ − ⟨r_i⟩)(r_j^μ − ⟨r_j⟩)

[Figure: resulting distribution of synaptic weights (probability versus weight).]

Synaptic scaling and PCA

Explicit normalization: w_ij ← w_ij / Σ_j w_ij
Basic decay: Δw_ij = ε r_i r_j − c w_ij
Willshaw rule: Δw_ij = ε (r_i − w_ij) r_j
Oja rule: Δw_ij = ε (r_i r_j − r_i^2 w_ij)

[Figure: two-dimensional input samples (r_1^pre, r_2^pre) and the learned weight vector (w_1, w_2), which aligns with the first principal component of the inputs.]

Further Readings
Laurence F. Abbott and Sacha B. Nelson (2000), Synaptic plasticity: taming the beast, Nature Neuroscience (suppl.) 3: 1178-1183.
Alain Artola and Wolf Singer (1993), Long-term depression of excitatory synaptic transmission and its relationship to long-term potentiation, Trends in Neurosciences 16: 480-487.
Mark C. W. van Rossum, Guo-qiang Bi, and Gina G. Turrigiano (2000), Stable Hebbian learning from spike timing-dependent plasticity, Journal of Neuroscience 20(23): 8812-8821.
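As a worked example of the Oja rule listed under "Synaptic scaling and PCA", the following sketch (my own; the input covariance, learning rate, and seed are illustrative assumptions) shows the weight vector of a single linear neuron converging to the first principal component of its inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean 2D inputs with one dominant direction (illustrative covariance).
C = np.array([[3.0, 1.0],
              [1.0, 1.0]])
samples = rng.multivariate_normal([0.0, 0.0], C, size=5000)

eps = 0.01                    # learning rate (illustrative)
w = rng.normal(size=2)        # random initial weight vector

for r_pre in samples:
    r_post = w @ r_pre                            # linear output neuron
    w += eps * (r_post * r_pre - r_post**2 * w)   # Oja rule

# The decay term keeps |w| near 1, and w aligns (up to sign) with the
# leading eigenvector of C, i.e. the first principal component.
pc1 = np.linalg.eigh(C)[1][:, -1]
print(np.linalg.norm(w), abs(w @ pc1))
```

Unlike the basic Hebb rule, which grows without bound, the quadratic decay term of the Oja rule implicitly normalizes the weight vector, which is why it performs principal component analysis.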
Recommended publications
  • This Document Is the Result of a Long Process Approved by the Examining Committee and Made Available to the Wider Academic Community
    NOTICE: This document is the result of a long process approved by the examining committee and made available to the wider academic community. It is subject to the intellectual property rights of its author. This implies an obligation to cite and reference the document when it is used. Furthermore, any counterfeiting, plagiarism, or illicit reproduction is liable to criminal prosecution. Contact: [email protected] LINKS: Code de la Propriété Intellectuelle, articles L 122.4 and L 335.2 to L 335.10. http://www.cfcopies.com/V2/leg/leg_droi.php http://www.culture.gouv.fr/culture/infos-pratiques/droits/protection.htm École doctorale IAEM Lorraine, Département de formation doctorale en informatique. "Apprentissage spatial de corrélations multimodales par des mécanismes d'inspiration corticale" (Spatial learning of multimodal correlations by cortically inspired mechanisms), a thesis submitted for the doctorate of the Université de Lorraine (specialty: computer science) by Mathieu Lefort. Jury: President: Jean-Christophe Buisson, Professor, ENSEEIHT. Reviewers: Arnaud Revel, Professor, Université de La Rochelle; Hugues Berry, CR HDR, INRIA Rhône-Alpes. Examiners: Benoît Girard, CR HDR, CNRS ISIR; Yann Boniface, MCF, Université de Lorraine. Thesis supervisor: Bernard Girau, Professor, Université de Lorraine. Laboratoire Lorrain de Recherche en Informatique et ses Applications, UMR 7503. To all those who have found, or will find, an interest in reading this manuscript or in spending time with me. Acknowledgements: I wish to thank my reviewers, Arnaud Revel and Hugues Berry, for agreeing to take the time to evaluate my work and for commenting on it constructively despite the relatively tight deadlines.
  • Enhancing Nervous System Recovery Through Neurobiologics, Neural Interface Training, and Neurorehabilitation
    UCLA Previously Published Works. Title: Enhancing Nervous System Recovery through Neurobiologics, Neural Interface Training, and Neurorehabilitation. Permalink: https://escholarship.org/uc/item/0rb4m424 Authors: Krucoff, Max O; Rahimpour, Shervin; Slutzky, Marc W; et al. Publication Date: 2016. DOI: 10.3389/fnins.2016.00584. Peer reviewed. eScholarship.org, Powered by the California Digital Library, University of California. REVIEW published 27 December 2016. Max O. Krucoff 1*, Shervin Rahimpour 1, Marc W. Slutzky 2,3, V. Reggie Edgerton 4 and Dennis A. Turner 1,5,6. 1 Department of Neurosurgery, Duke University Medical Center, Durham, NC, USA; 2 Department of Physiology, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA; 3 Department of Neurology, Feinberg School of Medicine, Northwestern University, Chicago, IL, USA; 4 Department of Integrative Biology and Physiology, University of California, Los Angeles, Los Angeles, CA, USA; 5 Department of Neurobiology, Duke University Medical Center, Durham, NC, USA; 6 Research and Surgery Services, Durham Veterans Affairs Medical Center, Durham, NC, USA. After an initial period of recovery, human neurological injury has long been thought to be static. In order to improve quality of life for those suffering from stroke, spinal cord injury, or traumatic brain injury, researchers have been working to restore the nervous system and reduce neurological deficits through a number of mechanisms.
For example, neurobiologists have been identifying and manipulating components of the intra- and extracellular milieu to alter the regenerative potential of neurons, neuro-engineers have been producing brain-machine and neural interfaces that circumvent lesions to restore functionality, and neurorehabilitation experts have been developing new ways to revitalize (Edited by: Ioan Opris, University of Miami, USA. Reviewed by: Yoshio Sakurai, Doshisha University, Japan; Brian R.
  • Beyond the “Conceptual Nervous System”: Can Computational Cognitive Neuroscience Transform Learning Theory?
    Beyond the “Conceptual Nervous System”: Can Computational Cognitive Neuroscience Transform Learning Theory? Fabian A. Soto, Department of Psychology, Florida International University, 11200 SW 8th St, AHC4 460, Miami, FL 33199. Abstract: In the last century, learning theory has been dominated by an approach assuming that associations between hypothetical representational nodes can support the acquisition of knowledge about the environment. The similarities between this approach and connectionism did not go unnoticed by learning theorists, with many of them explicitly adopting a neural network approach in the modeling of learning phenomena. Skinner famously criticized such use of hypothetical neural structures for the explanation of behavior (the “Conceptual Nervous System”), and one aspect of his criticism has proven to be correct: theory underdetermination is a pervasive problem in cognitive modeling in general, and in associationist and connectionist models in particular. That is, models implementing two very different cognitive processes often make the exact same behavioral predictions, meaning that important theoretical questions posed by contrasting the two models remain unanswered. We show through several examples that theory underdetermination is common in the learning theory literature, affecting the solvability of some of the most important theoretical problems that have been posed in the last decades. Computational cognitive neuroscience (CCN) offers a solution to this problem, by including neurobiological constraints in computational models of behavior and cognition. Rather than simply being inspired by neural computation, CCN models are built to reflect as much as possible about the actual neural structures thought to underlie a particular behavior. They go beyond the “Conceptual Nervous System” and offer a true integration of behavioral and neural levels of analysis.
  • LTP, STP, and Scaling: Electrophysiological, Biochemical, and Structural Mechanisms
    This position paper has not been peer reviewed or edited. It will be finalized, reviewed and edited after the Royal Society meeting on ‘Integrating Hebbian and homeostatic plasticity’ (April 2016). LTP, STP, and scaling: electrophysiological, biochemical, and structural mechanisms. John Lisman, Dept. Biology, Brandeis University, Waltham, MA. [email protected] ABSTRACT: Synapses are complex because they perform multiple functions, including at least six mechanistically different forms of plasticity (STP, early LTP, late LTP, LTD, distance-dependent scaling, and homeostatic scaling). The ultimate goal of neuroscience is to provide an electrophysiologically, biochemically, and structurally specific explanation of the underlying mechanisms. This review summarizes the still limited progress towards this goal. Several areas of particular progress will be highlighted: 1) STP, a Hebbian process that requires small amounts of synaptic input, appears to make strong contributions to some forms of working memory. 2) The rules for LTP induction in the stratum radiatum of the hippocampus have been clarified: induction does not depend obligatorily on backpropagating Na spikes but, rather, on dendritic branch-specific NMDA spikes. Thus, computational models based on STDP need to be modified. 3) Late LTP, a process that requires a dopamine signal (neoHebbian), is mediated by trans-synaptic growth of the synapse, a growth that occurs about an hour after LTP induction. 4) There is no firm evidence for cell-autonomous homeostatic synaptic scaling; rather, homeostasis is likely to depend on a) cell-autonomous processes that are not scaling, b) synaptic scaling that is not cell autonomous but instead depends on population activity, or c) metaplasticity processes that change the propensity of LTP vs LTD.
  • Simplified Calcium Signaling Cascade for Synaptic Plasticity Abstract
    Simplified calcium signaling cascade for synaptic plasticity. Vladimir Kornijcuk (a), Dohun Kim (b), Guhyun Kim (b), Doo Seok Jeong (a)*. (a) Division of Materials Science and Engineering, Hanyang University, Wangsimni-ro 222, Seongdong-gu, 04763 Seoul, Republic of Korea; (b) School of Materials Science and Engineering, Seoul National University, Gwanak-no 1, Gwanak-gu, 08826 Seoul, Republic of Korea. *Corresponding author (D.S.J.) E-mail: [email protected] Abstract: We propose a model for synaptic plasticity based on a calcium signaling cascade. The model simplifies the full signaling pathways from a calcium influx to the phosphorylation (potentiation) and dephosphorylation (depression) of glutamate receptors that are gated by fictive C1 and C2 catalysts, respectively. This model is based on tangible chemical reactions, including fictive catalysts, for long-term plasticity rather than the conceptual theories commonplace in various models, such as preset thresholds of calcium concentration. Our simplified model successfully reproduced the experimental synaptic plasticity induced by different protocols such as (i) a synchronous pairing protocol and (ii) correlated presynaptic and postsynaptic action potentials (APs). Further, the ocular dominance plasticity (or the experimental verification of the celebrated Bienenstock-Cooper-Munro theory) was reproduced by two model synapses that compete by means of back-propagating APs (bAPs). The key to this competition is synapse-specific bAPs with reference to bAP-boosting on the physiological grounds. Keywords: synaptic plasticity, calcium signaling cascade, back-propagating action potential boost, synaptic competition
Introduction: Synaptic plasticity is the basis of learning and endows the brain with high-level functionalities, such as perception, cognition, and memory (Hebb, 1949; Kandel, 1978). The importance of synaptic plasticity has fueled vigorous investigations that have effectively revealed its nature from different perspectives over different subjects.
  • Figure 3.2 the Effect of Theta Burst Stimulation of the Medial Geniculate Nucleus on the Amplitude
    PLASTICITY OF THE RAT THALAMOCORTICAL AUDITORY SYSTEM DURING DEVELOPMENT AND FOLLOWING WHITE NOISE EXPOSURE by Jennifer Lauren Hogsden Robinson. A thesis submitted to the Centre for Neuroscience Studies in conformity with the requirements for the degree of Doctor of Philosophy, Queen's University, Kingston, Ontario, Canada (January, 2011). Copyright © Jennifer Lauren Hogsden Robinson, 2011. Abstract: Synaptic plasticity reflects the capacity of synapses to undergo changes in synaptic strength and connectivity, and is highly regulated by age and sensory experience. This thesis focuses on the characterization of synaptic plasticity in the primary auditory cortex (A1) of rats throughout development and following sensory deprivation. Initial experiments revealed an age-dependent decline in plasticity, as indicated by reductions in long-term potentiation (LTP). The enhanced plasticity of juvenile rats appeared to be mediated by NR2B subunits of the N-methyl-d-aspartate receptor (NMDAR), as NR2B antagonist application reduced LTP to adult-like levels in juveniles, yet had no effect in adults. The importance of sensory experience in mediating plasticity was revealed in experiments using white noise exposure, which is a sensory deprivation technique known to arrest cortical development in A1. Notably, adult rats reared in continuous white noise maintained more juvenile-like levels of LTP, which normalized upon subsequent exposure to an unaltered acoustic environment. The white noise-induced LTP enhancements also appeared to be mediated by NR2B subunits, as NR2B antagonists reversed these LTP enhancements in white noise-reared rats. Given the strong influence that sensory experience exerts on plasticity, additional experiments examined the effect of shorter episodes of white noise exposure on LTP in adult rats.
  • VILNIUS UNIVERSITY Dalius Krunglevičius STDP LEARNING of SPATIAL and SPATIOTEMPORAL PATTERNS Doctoral Dissertation Physical
    VILNIUS UNIVERSITY Dalius Krunglevičius STDP LEARNING OF SPATIAL AND SPATIOTEMPORAL PATTERNS Doctoral Dissertation, Physical Sciences, Informatics (09P), Vilnius, 2016. Dissertation work was carried out at the Faculty of Mathematics and Informatics of Vilnius University from 2011 to 2015. Scientific Supervisor: Prof. Dr. habil. Šarūnas Raudys (Vilnius University, Physical Sciences, Informatics, 09P). [The title page is repeated in Lithuanian: VILNIAUS UNIVERSITETAS, Dalius Krunglevičius, "STDP mokymo taikymas erdvinėms bei erdvinėms-laikinėms struktūroms atpažinti" (Application of STDP learning to the recognition of spatial and spatiotemporal structures), doctoral dissertation, physical sciences, informatics (09P), Vilnius, 2016, with the same preparation and supervisor details.] Acknowledgements: I'm sincerely grateful to professor Šarūnas Raudys for his support, help and belief in the significance of my work. I would like to thank professor Adam Czajka for providing me with data-sets for my experiments and Geoff Vasil for helping with the editing. And my deepest gratitude goes to my wife, Gražina, without whose patience and support this dissertation would have never been written. Table of Contents: Notation; Introduction; Motivation and the Field of Research
  • UC San Diego UC San Diego Electronic Theses and Dissertations
    UC San Diego Electronic Theses and Dissertations. Title: Towards more biologically plausible computational models of the cerebellum with emphasis on the molecular layer interneurons. Permalink: https://escholarship.org/uc/item/9n45f5rz Author: Lennon, William Charles. Publication Date: 2015. Peer reviewed|Thesis/dissertation. eScholarship.org, Powered by the California Digital Library, University of California. UNIVERSITY OF CALIFORNIA, SAN DIEGO. Towards more biologically plausible computational models of the cerebellum with emphasis on the molecular layer interneurons. A dissertation submitted in partial satisfaction of the requirements for the degree of Doctor of Philosophy in Electrical Engineering (Intelligent Systems, Robotics and Control) by William Charles Lennon, Jr. Committee in charge: Professor Robert Hecht-Nielsen, Chair; Professor Clark Guest, Co-Chair; Professor Jeffrey Elman; Professor Lawrence Larson; Professor Laurence Milstein; Professor Douglas Nitz. Copyright William Charles Lennon, Jr., 2015. All rights reserved. The Dissertation of William Charles Lennon, Jr. is approved, and it is acceptable in quality and form for publication on microfilm and electronically. University of California, San Diego, 2015. Dedication: To my loving wife, Katie, and my parents, Bill and Cyndy, for shaping me. Epigraph: "The eye sees only what the mind is prepared to comprehend" -- Robertson Davies
  • Learning in Large Scale Spiking Neural Networks
    Learning in large-scale spiking neural networks by Trevor Bekolay. A thesis presented to the University of Waterloo in fulfillment of the thesis requirement for the degree of Master of Mathematics in Computer Science, Waterloo, Ontario, Canada, 2011. © Trevor Bekolay 2011. I hereby declare that I am the sole author of this thesis. This is a true copy of the thesis, including any required final revisions, as accepted by my examiners. I understand that my thesis may be made electronically available to the public. Abstract: Learning is central to the exploration of intelligence. Psychology and machine learning provide high-level explanations of how rational agents learn. Neuroscience provides low-level descriptions of how the brain changes as a result of learning. This thesis attempts to bridge the gap between these two levels of description by solving problems using machine learning ideas, implemented in biologically plausible spiking neural networks with experimentally supported learning rules. We present three novel neural models that contribute to the understanding of how the brain might solve the three main problems posed by machine learning: supervised learning, in which the rational agent has a fine-grained feedback signal, reinforcement learning, in which the agent gets sparse feedback, and unsupervised learning, in which the agent has no explicit environmental feedback. In supervised learning, we argue that previous models of supervised learning in spiking neural networks solve a problem that is less general than the supervised learning problem posed by machine learning. We use an existing learning rule to solve the general supervised learning problem with a spiking neural network.
  • Hebbian Plasticity Requires Compensatory Processes on Multiple Timescales
    Hebbian plasticity requires compensatory processes on multiple timescales. Friedemann Zenke (1)* and Wulfram Gerstner (2). January 16, 2017. 1) Dept. of Applied Physics, Stanford University, Stanford, CA 94305, USA, orcid.org/0000-0003-1883-644X. 2) Brain Mind Institute, School of Life Sciences and School of Computer and Communication Sciences, Ecole Polytechnique Fédérale de Lausanne, CH-1015 Lausanne EPFL, Switzerland, orcid.org/0000-0002-4344-2189. This article is published as: Zenke, F., and Gerstner, W. (2017). Hebbian plasticity requires compensatory processes on multiple timescales. Phil. Trans. R. Soc. B 372, 20160259. DOI: 10.1098/rstb.2016.0259 http://rstb.royalsocietypublishing.org/content/372/1715/20160259 Abstract: We review a body of theoretical and experimental research on Hebbian and homeostatic plasticity, starting from a puzzling observation: While homeostasis of synapses found in experiments is a slow compensatory process, most mathematical models of synaptic plasticity use rapid compensatory processes. Even worse, with the slow homeostatic plasticity reported in experiments, simulations of existing plasticity models cannot maintain network stability unless further control mechanisms are implemented. To solve this paradox, we suggest that in addition to slow forms of homeostatic plasticity there are rapid compensatory processes which stabilize synaptic plasticity on short timescales. These rapid processes may include heterosynaptic depression triggered by episodes of high postsynaptic firing rate. While slower forms of homeostatic plasticity are not sufficient to stabilize Hebbian plasticity, they are important for fine-tuning neural circuits. Taken together we suggest that learning and memory rely on an intricate interplay of diverse plasticity mechanisms on different timescales which jointly ensure stability and plasticity of neural circuits.
  • Mechanisms of Homeostatic Synaptic Plasticity in Vivo
    MINI REVIEW published: 03 December 2019, doi: 10.3389/fncel.2019.00520. Mechanisms of Homeostatic Synaptic Plasticity in vivo. Hey-Kyoung Lee 1,2* and Alfredo Kirkwood 1. 1 Department of Neuroscience, Mind/Brain Institute, Johns Hopkins University, Baltimore, MD, United States; 2 Kavli Neuroscience Discovery Institute, Johns Hopkins University, Baltimore, MD, United States. Edited by: Mathieu Letellier, UMR5297 Institut Interdisciplinaire de Neurosciences (IINS), France. Reviewed by: Tara Keck, University College London, United Kingdom. Synapses undergo rapid activity-dependent plasticity to store information, which when left uncompensated can lead to destabilization of neural function. It has been well documented that homeostatic changes, which operate at a slower time scale, are required to maintain stability of neural networks. While there are many mechanisms that can endow homeostatic control, sliding threshold and synaptic scaling are unique in that they operate by providing homeostatic control of synaptic strength. The former mechanism operates by adjusting the threshold for synaptic plasticity, while the latter mechanism directly alters the gain of synapses. Both modes of homeostatic synaptic plasticity have been studied across various preparations from reduced in vitro systems, such as neuronal cultures, to in vivo intact circuitry. While most of the cellular and molecular mechanisms of homeostatic synaptic plasticity have been worked out using reduced preparations, there are unique challenges present in intact circuitry in vivo, which deserve further consideration. For example, in an intact circuit, neurons receive distinct sets of inputs across their dendritic tree which carry unique information. Homeostatic synaptic plasticity in vivo needs to operate without compromising processing of these distinct sets of inputs to preserve information processing while maintaining network stability.
  • Incorporating Rapid Neocortical Learning of New Schema-Consistent Information Into Complementary Learning Systems Theory James L
    Journal of Experimental Psychology: General. Incorporating Rapid Neocortical Learning of New Schema-Consistent Information Into Complementary Learning Systems Theory. James L. McClelland, Stanford University. Online First Publication, August 26, 2013, doi: 10.1037/a0033812. © 2013 American Psychological Association, Vol. 142, No. 4, 0096-3445/13/$12.00. THEORETICAL REVIEW. The complementary learning systems theory of the roles of hippocampus and neocortex (McClelland, McNaughton, & O'Reilly, 1995) holds that the rapid integration of arbitrary new information into neocortical structures is avoided to prevent catastrophic interference with structured knowledge representations stored in synaptic connections among neocortical neurons. Recent studies (Tse et al., 2007, 2011) showed that neocortical circuits can rapidly acquire new associations that are consistent with prior knowledge. The findings challenge the complementary learning systems theory as previously presented. However, new simulations extending those reported in McClelland et al. (1995) show that new information that is consistent with knowledge previously acquired by a putatively cortexlike artificial neural network can be learned rapidly and without interfering with existing knowledge; it is when inconsistent new knowledge is acquired quickly that catastrophic interference ensues. Several important features of the findings of Tse et al.
(2007, 2011) are captured in these simulations, indicating that the neural network model used in McClelland et al.