combra.cs.rutgers.edu

Brain-morphism: Astrocytes as Units

Constantine Michmizos Computational Brain Lab – Rutgers University

6th Neuro Inspired Computational Elements Workshop – Intel, Hillsboro, OR, 2018

Computational Neural-Astrocytic Brain Networks

2013: MEG and Modeling in Children with Autism

ComBra Lab's goal (mimic – restore – understand – integrate)
ᴥ To understand biological intelligence and translate our knowledge to artificial intelligence
o by developing brain-morphic computational methods
• that integrate with the brain, from the macro (behavioral) to the micro (synaptic) scale

Figure: scales of brain organization – CNS 1 m, Systems 10 cm, Maps 1 cm, Networks 1 mm, Neurons 100 μm, Synapses 1 μm, Molecules 1 Å (adapted from Terry Sejnowski; Grillner et al. Nature 2016)

Computing for Brain Science: Understanding the function of the brain

Brain Science for Computing: Using computational principles of the brain for generic data analysis

Alan Turing, 1948

• A fascinating prelude to today’s AI

• Proposed connectionist models that would today be called neural networks

• Randomly connected networks of artificial neurons
• Training via reinforcing successful and useful links and cutting useless ones
• The proposed learning rule was inspired by the infant's brain

The Neuron as a Basic Information Processing Unit – McCulloch and Pitts (1943)

pigeons playing ping-pong - Skinner 1950

Artificial neural networks are compared to the brain since they both contain similar elements

Figure: deep neural network – input neurons, first hidden layer


Capsule NNs (Dec 2017) decreased the error by a whopping 45%

A capsule is a subset of neurons within a layer that outputs:
1. an instantiation parameter: is an entity present within a limited domain?
2. a vector of pose parameters: the pose of the entity relative to a canonical version
A capsule replaces max pooling


Computing for Brain Science: Understanding the function of the brain
Sejnowski et al., Putting big data to good use in neuroscience, Nature Neuroscience 2014

Brain Science for Computing: Using computational principles of the brain for generic data analysis


1932. Edgar Adrian, Nobel Prize – Single-Neuron Recordings
1971. David Cohen, MIT – Magnetoencephalography
2013. BrainGate – Motor neurons control a robotic arm for paraplegic patients

Information = f(electrical activity of neurons)

1997. Deep Brain Stimulation for alleviating Parkinson's disease
2013. TMS applied to the motor cortex induces hand movement

The other brain (glial cells): 10% neurons, 90% non-neuronal brain cells
Non-neuronal brain cells are electrically silent

2000s. Calcium imaging allows studies of calcium signaling in hundreds of neurons and glial cells within neuronal circuits.

Santiago Ramón y Cajal's drawing of an astrocyte

Ramón y Cajal S. Something about the physiological significance of neuroglia. Revista Trimestral Micrografía 1, 3–47, 1897

Astrocytes

A paradigm shift in Neuroscience
ᴥ Understanding astrocytes by calcium imaging
• The third part of the synapse introduces an orthogonal dimension to "neural" plasticity
• Astrocytes respond dynamically and on larger time scales than neurons, mapping neural activity into the slower behavioral scales

Neuro-morphic Computing
ᴥ Learning by weight updating
ᴥ Time in discrete instances

Computational Astrocytes
• Introducing the astrocyte, a new computational unit, into Neural Networks
• Embed astrocytic mechanisms into NANs
• Suggest functions for astrocytes at the network level and large time scales, where behavior and diseases emerge

Figure: tripartite synapse – Microglia, Astrocyte, Presynaptic Neuron, Postsynaptic Neuron, Blood Vessel

Colangelo et al. Neuroscience Letters 2014

The Neural Modeling Paradigm
Physical phenomenon (e.g., neural firing)

v(t), I(t)

Treat as emergent phenomenon:
• Biophysics-based models
• Understand the principles of the underlying phenomena
• Data drives details
• Typically higher model complexity

Treat as target function:
• Phenomenological models
• Occam's razor / KISS principle
• Level of detail is hypothesis/interest driven
• Typically lower model complexity

Adapted from Karlheinz Meier

Neural Network | Astrocytic Network | Tripartite Synapse

Neural Network: neurons encode information in spike rate

Goal: to capture the dynamics of the interaction between the neuronal and the astrocytic components of the network

Step 1. Input as Spike timing

$$V_j(t) = \sum_i \delta(t - t_i)$$

Step 2. Synaptic current response (mapping function between input and output):
$$I_j(t, t_i) = \begin{cases} w_j \left( e^{-\frac{t - t_i}{\tau}} - e^{-\frac{t - t_i}{4\tau}} \right), & \text{if } t > t_i \\ 0, & \text{if } t \le t_i \end{cases}$$

Step 3. Integration of synaptic current – Leaky Integrate-and-Fire (LIF) neuron:
$$\tau \frac{dv}{dt} = v_{reset} - v(t) + R\,I(t), \qquad I(t) = \sum_j I_j(t)$$

Step 4. Output spike: if $v(t) \ge V_{th}$ then $v(t+1) \leftarrow v_{reset}$, and $V_o(t) = \mathrm{LIF}(I(t))$

Neural Network | Astrocytic Network | Tripartite Synapse
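Below is a minimal, self-contained sketch of Steps 1–4 in Python. The parameter values, the Poisson input, and the sign convention chosen for the biexponential kernel are assumptions for illustration, not values from the talk.

```python
import numpy as np

# Sketch of Steps 1-4 (assumed parameter values, not from the talk).
rng = np.random.default_rng(0)
dt, T = 1e-4, 0.5                   # time step and duration (s)
tau_m, tau_s = 20e-3, 5e-3          # membrane and synaptic time constants (s)
R, v_reset, v_th, w = 1.0, 0.0, 1.0, 2.0

t = np.arange(0, T, dt)
spikes_in = (rng.random(t.size) < 100 * dt).astype(float)   # ~100 Hz input spike train

# Step 2: biexponential synaptic kernel (ordered here so it forms a positive pulse)
kt = np.arange(0, 40 * tau_s, dt)
kernel = w * (np.exp(-kt / (4 * tau_s)) - np.exp(-kt / tau_s))
I = np.convolve(spikes_in, kernel)[: t.size]

# Steps 3-4: integrate the LIF membrane equation and reset on threshold crossing
v, n_out = np.zeros(t.size), 0
for k in range(1, t.size):
    v[k] = v[k - 1] + dt * (v_reset - v[k - 1] + R * I[k - 1]) / tau_m
    if v[k] >= v_th:
        v[k] = v_reset
        n_out += 1

print("input spikes:", int(spikes_in.sum()), "-> output spikes:", n_out)
```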

$$a_i = a_0 \cdot n \cdot \exp\!\left(-\frac{i}{\tau_d}\right)$$

Intercellular Ca2+ wave propagation in space and time:
$$c_i(r_i, t) = a_i \cdot g_i(r_i, t)$$

$$g_i(r_i, t) = \exp\!\left(-\frac{r_i - \mu v t}{\tau_{decay}}\right) - \exp\!\left(-\frac{r_i - \mu v t}{\tau_{rise}}\right)$$

$r_i$: the distance of astrocyte $i$ from the origination site ($r_0 \equiv 0$)
$c_i(r_i, t)$: intracellular Ca2+ level in astrocyte $i$ at time $t$

• $\tau_{decay}$, $\tau_{rise}$ control the fall and rise times of the Ca2+ wave
• $\tau_d$ controls the magnitude of the amplitude fall-off between astrocytes
• $\mu$ captures the permeability of the cells through which the wave propagates with speed $v$, lumping together all the sub-mechanisms of gap-junctional and extracellular communication
• $n$ is the normalization constant of the biexponential function, which is a function of the rise and decay parameters
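A small sketch of the wave model above, with assumed values for the time constants, permeability, speed, and a normalization chosen so the biexponential peaks at 1; it only illustrates how the amplitude falls off from astrocyte to astrocyte as the wave travels.

```python
import numpy as np

# Sketch of the Ca2+ wave model above (assumed parameter values).
tau_rise, tau_decay = 1.0, 6.0      # rise/fall of the wave (s)
tau_d = 3.0                         # amplitude fall-off between astrocytes
mu, v, a0 = 0.8, 15.0, 1.0          # permeability factor, wave speed (um/s), base amplitude

# normalization n: makes the biexponential peak equal to 1
x_peak = np.log(tau_decay / tau_rise) * tau_rise * tau_decay / (tau_decay - tau_rise)
n = 1.0 / (np.exp(-x_peak / tau_decay) - np.exp(-x_peak / tau_rise))

def ca_level(i, r_i, t):
    """Intracellular Ca2+ level c_i(r_i, t) of astrocyte i at distance r_i from the origin."""
    a_i = a0 * n * np.exp(-i / tau_d)              # amplitude fall-off with astrocyte index
    phase = np.maximum(r_i - mu * v * t, 0.0)      # wave front has passed once phase hits 0
    return a_i * (np.exp(-phase / tau_decay) - np.exp(-phase / tau_rise))

t = np.linspace(0, 10, 1000)
for i in range(1, 4):                              # three astrocytes along the wave path
    r = 20.0 * i                                   # um from the origination site
    print(f"astrocyte {i} at {r:.0f} um: peak Ca2+ = {ca_level(i, r, t).max():.2f}")
```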

Neural Network | Astrocytic Network | Tripartite Synapse

Synaptic dynamics (M. Tsodyks, A. Uziel, and H. Markram, Synchrony generation in recurrent networks with frequency-dependent synapses, J. Neurosci. 2000):

$$\frac{dx}{dt} = \frac{z}{\tau_{rec}} - (1 - f)\, u\, x\, \delta(t - t_{sp})$$
$$\frac{dy}{dt} = -\frac{y}{\tau_{in}} + (1 - f)\, u\, x\, \delta(t - t_{sp})$$
$$\frac{dz}{dt} = \frac{y}{\tau_{in}} - \frac{z}{\tau_{rec}}$$
$$I_{syn} = A \cdot y(t)$$

Astrocytic gating of the synapse, modeling Ca2+-dependent presynaptic inhibition:
$$\frac{df}{dt} = -\frac{f}{\tau_{Ca}} + (1 - f)\, \kappa\, \Theta\!\left(c_i(r_i, t) - c_{thresh}\right)$$

• $x + y + z = 1$ are the fractions of synaptic resources in the recovered, active, and inactive states
• $u$ is the fraction of $x$ released when a spike arrives at the synapse at time $t_{sp}$
• $\tau_{in}$ is the characteristic decay time of postsynaptic currents (PSCs); $\tau_{rec}$ is the recovery time from synaptic depression
• $A$ is the synaptic strength; $I_{syn}$ is the synaptic current
• $f$ is the gating variable; $\tau_{Ca}$ is its decay constant; $\kappa$ controls its rise time
• $\Theta(x)$ is the Heaviside function; $c_{thresh}$ is the intracellular Ca2+ level needed to activate $f$
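A sketch of the gated synapse above, integrating x, y, z, and f with a forward-Euler step; all parameter values and the toy Ca2+ wave are assumptions, used only to show that a supra-threshold Ca2+ level depresses the released synaptic current.

```python
import numpy as np

# Sketch of the depressing synapse with astrocytic gating f (assumed values).
dt = 1e-4
tau_rec, tau_in, tau_ca = 0.8, 3e-3, 1.0
u, A, kappa, c_thresh = 0.5, 40.0, 2.0, 0.3

def simulate(spike_times, ca_level, T=1.0):
    """Integrate x, y, z, f and return the synaptic current trace I_syn(t)."""
    n = int(T / dt)
    spikes = np.zeros(n)
    spikes[(np.asarray(spike_times) / dt).astype(int)] = 1.0
    x, y, z, f = 1.0, 0.0, 0.0, 0.0
    I_syn = np.zeros(n)
    for k in range(n):
        release = (1.0 - f) * u * x * spikes[k]        # the delta(t - t_sp) term
        x += dt * (z / tau_rec) - release
        y += dt * (-y / tau_in) + release
        z += dt * (y / tau_in - z / tau_rec)
        # Ca2+-dependent presynaptic inhibition: Heaviside on the local Ca2+ wave
        f += dt * (-f / tau_ca + (1.0 - f) * kappa * float(ca_level(k * dt) > c_thresh))
        I_syn[k] = A * y
    return I_syn

spike_times = [0.1, 0.3, 0.5, 0.7]
quiet = simulate(spike_times, ca_level=lambda t: 0.0)                              # no wave
gated = simulate(spike_times, ca_level=lambda t: 1.0 if 0.2 < t < 0.6 else 0.0)    # wave present
print("total synaptic charge:", quiet.sum() * dt, "vs gated:", gated.sum() * dt)
```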

Neural Network | Astrocytic Network | Tripartite Synapse

Astrocytes inject into the post-synaptic neuron slow inward currents (SICs), which:
• are well fit by bi-exponential distributions
• have a rapid rise time (on the order of tens of ms)

• have a comparatively larger decay time (on the order of hundreds of ms)
• are correlated with Ca2+ wave peaks in both time and amplitude

Upon reaching a Ca2+ peak, an astrocyte releases gliotransmitters that lead to NMDAR-mediated SICs, with the SIC amplitude being logarithmically proportional to the Ca2+ wave amplitude:

$$I_{astro} = 2.11 \cdot \frac{\mu A}{cm^2} \cdot \ln(w) \cdot \Theta\!\left(\ln(w)\right), \qquad w = c_i(r_i, t)\,[\mathrm{nM}] - 196.69$$

$$I_{SIC}(t) = I_{astro}\big(c_i(r_i, t) = c_{peak}\big) \cdot \left[\exp\!\left(-\frac{t}{\tau^{SIC}_{decay}}\right) - \exp\!\left(-\frac{t}{\tau^{SIC}_{rise}}\right)\right]$$
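A short sketch of the SIC model above; the SIC rise and decay constants are assumed (tens vs hundreds of ms, as stated), and the threshold behavior follows from Θ(ln w).

```python
import numpy as np

# Sketch of the SIC model above (assumed SIC time constants).
tau_rise_sic, tau_decay_sic = 0.02, 0.2        # tens vs hundreds of ms, in seconds

def sic_current(t, ca_peak_nM):
    """Slow inward current injected into the postsynaptic neuron (uA/cm^2)."""
    w = ca_peak_nM - 196.69
    if w <= 1.0:                               # Theta(ln w): no SIC for w <= 1
        return np.zeros_like(t)
    i_astro = 2.11 * np.log(w)                 # amplitude, logarithmic in the Ca2+ peak
    return i_astro * (np.exp(-t / tau_decay_sic) - np.exp(-t / tau_rise_sic))

t = np.linspace(0, 1.0, 1000)
for peak in (150.0, 250.0, 400.0, 800.0):      # Ca2+ wave peaks, in nM
    print(f"Ca2+ peak {peak:.0f} nM -> SIC peak {sic_current(t, peak).max():.2f} uA/cm^2")
```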

Putting a Neural-Astrocytic Network to work

Probabilistic neural connections, with a Gaussian fall-off:

$$P(r_{ij}) = P_{max} \exp\!\left(-\frac{r_{ij}^2}{2\sigma^2}\right)$$
• $r_{ij}$ is the distance between neurons $i$ and $j$

• $P_{max}$ and $\sigma$ are the peak and width of the probability distribution
• No function imposed – just basal neural activity

Oscillations emerge as a network property
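A sketch of the connection rule above; the network size, the spatial layout of the neurons, and the values of P_max and σ are assumptions for illustration.

```python
import numpy as np

# Sketch of the Gaussian connection-probability rule (assumed layout and values).
rng = np.random.default_rng(0)
N, p_max, sigma = 200, 0.3, 50.0               # neurons, peak probability, width (um)

pos = rng.uniform(0, 500, size=(N, 2))         # neurons scattered on a 500x500 um sheet
d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)   # pairwise distances r_ij

p_connect = p_max * np.exp(-d**2 / (2 * sigma**2))
adjacency = rng.random((N, N)) < p_connect     # Bernoulli draw per ordered pair
np.fill_diagonal(adjacency, False)             # no self-connections

print("mean out-degree:", adjacency.sum(axis=1).mean())
```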

Local Sleep in Awake Rats, Nature 2011; Bryant et al. Nature Reviews, 2004

"This hypothesis was reinforced by a recent modelling study showing that intercellular Ca2+ signaling potentially can introduce slow oscillation in neurons [our Ref]. Our experimental data strongly supports this hypothesis by demonstrating that increasing astrocytic influence on neurons indeed drives them to join the oscillatory activity (Fig. 5C). In this context it is also important to note, that the ratio of astrocytes involved in the SWA was found to start decreasing right after virtually all neurons joined the simultaneous activity (Figs 4 and 5). This observation further supports the view that astrocytic activity corresponds to the generation or maintenance, rather than termination of SWA."

Brain oscillations are still lacking a mechanistic origin

• Rhythmic or repetitive electrical activity generated by neurons, spontaneously or in response to stimuli
• Studied for 100 years (EEG, MEG, fMRI, PET, MER)
• Encode and process brain information flow
• Communication through coherence: interregional communication is established when the oscillatory activity between neuronal pools is coherent (Fries 2005; 2009)
• Related to all aspects of human behavior: cognitive, sensory, and motor tasks
• All brain diseases are associated with oscillatory imbalances

Figure: publications on "Oscillations" per year, 1889–2009 (source: PubMed)

Introducing function into a Neural-Astrocytic Network: NAN Hopfield Nets
• The idea of memories as energy minima was proposed by

I.A. Richards in 1924 in “Principles of Literary Criticism”

• Hopfield (1982) proposed that memories could be energy

minima of a neural net
• With 19,000 citations, Hopfield nets are the precursors of Boltzmann Machines (BM), Restricted BMs, and Deep Belief Networks
• Using energy minima to represent memories gives a content-addressable memory
• Memories are stored in the synapses
• A Hopfield net is composed of binary threshold units with recurrent connections between them
• Recurrent nets of non-linear units are generally hard to analyze. They can: settle to a stable state | oscillate | follow chaotic trajectories that cannot be predicted far into the future

Figure: example binary (ON/OFF) configurations of the network units

Fully connected (not shown)
• Hopfield (and others) realized that if the connections are symmetric, there is a global energy function
• Each binary configuration of the whole network has an energy
• The binary threshold decision rule causes the network to settle to a minimum of this energy function

Goal: to generate a sequence of patterns
• by going through a predetermined (by the astrocytes) sequence, in a closed limit cycle
• Sequences of patterns occur widely in biology (e.g., walking, learning a task, rehab)

Network | Neuron

N neurons with values $x_i = \pm 1$, $i = 1, \dots, N$
Fully connected neurons, i.e., every neuron is connected to every other neuron
Connectivity weights $w_{ij} = w_{ji}$, $w_{ii} = 0$
$m$ memories $\xi^p = (x_1, \dots, x_i, \dots, x_N) \in \{-1, 1\}^N$, $p = 1, \dots, m$

Each neuron aligns with its local field $h_i$:
$$s_i(t+1) = \mathrm{sgn}\!\left(h_i(t)\right), \qquad h_i(t) = h_i^{neural}(t) + h_i^{astro}(t)$$

Memory-stabilizing symmetric matrix (short-time connections):
$$J_{ij} = \frac{1}{N} \sum_{\mu=1}^{m} (2\xi_i^{\mu} - 1)(2\xi_j^{\mu} - 1), \quad i \ne j, \qquad h_i^{neural}(t) = \sum_{j=1}^{N} J_{ij}\, s_j(t)$$

Matrix of amplitudes for the SICs (long-time connections):
$$T_{ij} = \frac{\lambda}{N} \sum_{\mu=1}^{q} (2\xi_i^{\mu+1} - 1)(2\xi_j^{\mu} - 1), \quad i \ne j, \qquad h_i^{astro}(t) = \sum_{j=1}^{N} T_{ij}\, SC_j(t)$$

$$SC_j(t) = e^{-\frac{t - \delta_{cal}}{\tau_{SC}}}$$

Astrocytic Dynamics | Astrocytic Learning
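A toy sketch of the update rule above. It replaces the per-astrocyte Ca2+/SIC dynamics with a single global SIC envelope and assumes λ > 1 so the astrocytic field can tip the state from one memory to the next; the sizes and random memories are assumptions. It only demonstrates that the T matrix steers the network through the stored sequence.

```python
import numpy as np

# Toy sketch of the neural-astrocytic Hopfield update (assumed sizes and values).
rng = np.random.default_rng(1)
N, m, lam, tau_sc, delta_cal = 500, 7, 2.0, 100.0, 0.0

xi = rng.integers(0, 2, size=(m, N))          # memories xi^mu in {0, 1}
bip = 2 * xi - 1                               # bipolar patterns (2*xi - 1)

J = (bip.T @ bip) / N                          # short-time, memory-stabilizing weights
np.fill_diagonal(J, 0.0)
T = lam * (bip[1:].T @ bip[:-1]) / N           # long-time SIC amplitudes: memory mu -> mu+1
np.fill_diagonal(T, 0.0)

s = bip[0].astype(float)                       # start the network in memory 0
for t in range(1, 11):
    sc = np.exp(-(t - delta_cal) / tau_sc)     # SIC envelope SC(t)
    h = J @ s + T @ (sc * s)                   # h_i = h_i^neural + h_i^astro
    s = np.where(h >= 0, 1.0, -1.0)
    overlaps = bip @ s / N                     # overlap with each stored memory
    print(f"t={t}: closest memory {overlaps.argmax()} (overlap {overlaps.max():.2f})")
```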


The dynamics of the two signals of interest in an astrocytic process: the local Ca2+ wave, which rises in response to presynaptic activity, and the related SIC injected into the postsynaptic neuron.

$\delta_{cal}$: the time at which the astrocytic Ca2+ reached $c_{thresh}$

Astrocytic Dynamics | Astrocytic Learning
Derive the time spent in each memory, τ

Define the linear operator $L$:

• $L P_t \equiv P_{t-1}$
• $L^2 P_t \equiv P_{t-2}$

Using the Taylor series of $\frac{1}{1-x}$, we can solve for $P_t$:
$$P_t = \frac{\beta s_t}{1 - \alpha L} = \beta \sum_{t'=0}^{\infty} (\alpha L)^{t'} s_t = \beta \sum_{t'=0}^{\infty} \alpha^{t'} s_{t-t'}$$

For $P_t = c_{thresh}$ in the continuous limit:
$$\frac{c_{thresh}}{\beta} = \int_0^{\tau} \alpha^{\tau - t'}\, dt'$$

with the general solution
$$\tau = \frac{\ln\!\left(\frac{c_{thresh}}{\beta}\ln\alpha + 1\right)}{\ln\alpha}$$

Astrocytic Dynamics | Astrocytic Learning
Derive the time spent in each memory, τ

$$\int_0^{\tau} \beta\, \alpha^{t}\, dt = 1 \quad\Rightarrow\quad \beta = \ln\frac{1}{\alpha}$$

$$\tau = \frac{\ln(1 - c_{thresh})}{\ln\alpha}$$

Biological consideration: the more the astrocyte depends on its own Ca2+ level in the previous time-step (α), the less it depends on the presynaptic neural activity (β).

Astrocytic Dynamics | Astrocytic Learning
Derive the time spent in each memory, τ

$$\tau = \frac{\ln(1 - c_{thresh})}{\ln\alpha}$$

For a fixed α, increasing $c_{thresh}$ increases the time it takes for an astrocytic process to release an SIC, and thus increases τ

Intuition | Astrocytic Dynamics | Astrocytic Learning
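A worked numeric check of the dwell-time formula, with assumed α and c_thresh values:

```python
import numpy as np

# Numeric check of tau = ln(1 - c_thresh) / ln(alpha), with assumed alpha and c_thresh.
def dwell_time(alpha, c_thresh):
    return np.log(1.0 - c_thresh) / np.log(alpha)

for c in (0.3, 0.6, 0.9):
    print(f"alpha=0.95, c_thresh={c}: tau = {dwell_time(0.95, c):.1f} time steps")
# Larger c_thresh -> the astrocyte needs longer to reach threshold -> larger tau
```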

• The NAN has an energy surface that has minima at each memory $\xi^{\nu}$, but which also tilts steadily while the system is in a particular state $\xi^{\nu}$, so that a downhill move from $\xi^{\nu}$ to $\xi^{\nu+1}$ eventually occurs
• But this picture is a little deceptive, because the dynamics cannot be represented simply as descent on an energy landscape unless the connections are symmetric

Figure: the overlap of the neural network state with the stored memories (N = 500, p = 7, q = 6)

Astrocytic Dynamics | Astrocytic Learning
Astrocytic Learning of Hebbian type

$$\Delta T_{ij} = \eta\, s_i(t_{switch})\, P_j(t_{switch}) = \eta\, s_i(t_{switch})\, s_j(0) = \eta\, \xi_i^{\mu+1}\, \xi_j^{\mu}, \qquad \text{for } t_{switch} \gg 0$$

• $\eta$ is the learning rate; $t_{switch}$ is the time between switching memories
• $s_i(t_{switch})$ is the postsynaptic state; $P_j(t_{switch})$ is the astrocytic state, which equals the presynaptic state $s_j(0)$
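A one-line sketch of the update above as an outer product; η and the random ±1 patterns are assumptions.

```python
import numpy as np

# Sketch of the astrocytic Hebbian update Delta T_ij = eta * xi_i^{mu+1} * xi_j^{mu};
# eta and the random +/-1 patterns are assumptions.
rng = np.random.default_rng(2)
N, eta = 500, 0.1

xi_mu  = 2 * rng.integers(0, 2, N) - 1     # memory mu: presynaptic state, held by the astrocyte
xi_mu1 = 2 * rng.integers(0, 2, N) - 1     # memory mu+1: postsynaptic state at t_switch

delta_T = eta * np.outer(xi_mu1, xi_mu)    # outer-product (Hebbian) update of the SIC matrix T
print(delta_T.shape, float(delta_T[0, 0]))
```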

• At $t = t_{switch}$, the astrocyte correlates its current state with the state of the post-synaptic neuron and adjusts the levels of future gliotransmitter release accordingly, changing future SIC release
• This mechanism requires retrograde signaling between the post-synaptic neuron and the astrocytic process, known to occur through endocannabinoid-mediated pathways (Fellin et al. Neuron 2004)

What did we discover?

Kevin T. Feigelis, Major, 2016 – now a PhD student at Stanford with Dan Yamins (NeuroAI Lab)
Leo Kozachkov, Physics/Math Major, 2016 – now a PhD student at MIT with Earl K. Miller (Brain & Cognitive Sciences Department)
Giannis Polykretis | Vladimir Ivanov

How astrocytes may encode information in calcium waves
Towards Mesochronous* Communication

* Thanks to Simon Knowles

Astrocytes generate brain oscillations | Astrocytes generate sequence learning

Neural-Astrocytic Networks

Figure: Model and Data – Artificial Intelligence and NANs; Biological Intelligence and NANs

Local sleep in awake rats, Nature 2011

NAN Applications: Neuro-mimetic Robotics | Neuro-morphic Computing

Thank You
combra.cs.rutgers.edu

Brain-morphism: Astrocytes as Memory Units

Constantine Michmizos Computational Brain Lab – Rutgers University

6th Neuro Inspired Computational Elements Workshop – Intel Hillsboro, OR