
The role of adaptation in neural coding

Alison I Weber 1,3 and Adrienne L Fairhall 2,3

The concept of 'neural coding' supposes that neural firing patterns in some sense represent some external correlate, whether sensory, motor, or structural knowledge about the world. While the implied existence of a one-to-one mapping between external referents and neural firing has been useful, the prevalence of adaptation challenges this. Adaptation provides neural responses with dynamics on timescales that range from milliseconds up to many seconds. These timescales are highly relevant for sensory experience in the natural world, in which local statistical properties of inputs change continuously, and are additionally altered by active sensing. Adaptation has a number of consequences for coding: it creates short-term history dependence; it engenders complex feature selectivity that is time-varying; and it can serve to enhance information representation in dynamic environments. Considering how to best incorporate adaptation into neural models exposes a fundamental dichotomy in approaches to the description of neural systems: ones that take an explicitly 'coding' perspective versus ones that describe the system's dynamics. Here we discuss the pros and cons of different approaches to the modeling of adaptive dynamics.

Addresses
1 Graduate Program in Neuroscience, University of Washington, United States
2 Department of Physiology and Biophysics and Computational Neuroscience Center, University of Washington, United States
3 UW Institute for Neuroengineering, United States

Corresponding author: Fairhall, Adrienne L ([email protected])

Current Opinion in Neurobiology 2019, 58:135–140

This review comes from a themed issue on Computational neuroscience

Edited by Brent Doiron and Máté Lengyel

For a complete overview see the Issue and the Editorial

Available online 27th September 2019

https://doi.org/10.1016/j.conb.2019.09.013

0959-4388/© 2019 Elsevier Ltd. All rights reserved.

The notion of a neural code pervades neuroscience. In its most direct interpretation, the firing of a neuron or a neuronal population is interpreted as signaling the presence and strength of an external stimulus or a motor output. It is of course evident that the idea of coding is an approximation [1]. A neural system is a highly interconnected nonlinear dynamical system that receives and computes upon multiple complex inputs. The coding paradigm rests on the idea that one can identify aspects of high-dimensional time-varying neural dynamics—such as the instantaneous firing rate of a neuron or the activity of a set of neurons—and map them onto inputs (or motor outputs). The power of this approach is that it can tell us which particular properties of the inputs or outputs are explicitly represented in neural activity. It can further allow one to evaluate how the relationship between the external world and activity changes, for example, as a function of learning. If a neuron or population exhibits a response to a particular stimulus, that response is said to adapt if, after time or upon subsequent presentations, the response decreases. Here, we will use a broader definition of the term 'adaptation', to describe a wide variety of phenomena which are characterized by history-dependence or context-dependence in neural response properties [2,3,4]; most generally, it is used to refer to a change in the entire suite of steady-state responses of a neuron after a change in stimulus or context. Thus, adaptation is defined within the neural coding concept but is also a sign of the complexity of the coding metaphor, implying that 'the neural code' depends on history and context.

Neurons and neural circuits are dynamic on a continuum of timescales, undergoing continuous changes in cellular properties such as excitability due to the dynamics of ion channels and synaptic properties such as vesicle availability, with timescales of history dependence ranging from milliseconds to seconds [5,6]. Response properties evaluated at the single neuron level are further influenced by network level dynamics. Typically, experiments exploring adaptation have probed the system in two or more different and distinct states. After abruptly changing the value of a stimulus, or the statistical ensemble from which stimulus samples are drawn, one can track the response of the system as it relaxes into its new state in order to characterize how firing rates change, or fit models for the coding properties of the system in its new 'adapted' state (Figure 1). Despite this typical experimental design, however, natural environments are continuously changing, and the underlying neural dynamics responsible for both temporal relaxation and steady-state characterizations of adaptation are always in play as the nervous system is continually driven by ongoing changes in the environment. In light of these ongoing processes, what does it mean for a neuron to adapt? How might one think about adaptation in the absence of a natural separation of timescales between 'response' and 'adaptation'? How does our understanding of adaptation depend on the models we employ? As the field moves beyond single neuron recordings and isolated stimulus presentations to situations in which population recordings are made during natural stimulation and behavior, it is worth revisiting adaptation in view of emerging model characterizations.


What is adaptation and what roles might it serve?
Classically, adaptation refers to a gradual reduction in a neuron's firing rate following stimulus onset or following a change in the stimulus, as in spike frequency adaptation. This can serve to allow a relatively larger response at the onset of a novel stimulus. Adaptation also refers to a change in the steady-state response properties of a neuron or network following a change in the stimulus ensemble. For example, the spike-triggering features or tuning curve of a neuron might change after a change in stimulus statistics or context. In response to a contrast change, early sensory areas have been shown to adjust both the temporal feature that drives responses and the associated nonlinear input-output function (Figure 1). This adjustment often occurs such that the dynamic range of the responses is matched to the range of inputs, a phenomenon known as gain scaling [3], regarded as a signature of efficient coding. This canonical computation of the nervous system occurs throughout early [4,7,8] and higher order [9] sensory areas, as well as in circuits that perform multisensory integration [10], motor control [11], and economic decision making [12].

Figure 1
Statistical models of neural responses: linear-nonlinear models, generalized linear models, subunit models, and deep neural networks. Linear-nonlinear cascade models consist of a linear filtering step followed by a static nonlinear transformation. LN models are commonly used to probe response properties of a system in two different states and often reveal stimulus-dependent changes in both the linear filter and nonlinearity that best characterize responses. Generalized linear models incorporate dependence on spike history into a linear-nonlinear framework and can capture several adaptive properties. Subunit models consist of multiple converging LN components and have successfully described response properties at many stages of the visual system. Deep neural networks comprise many layers of cascaded linear-nonlinear computations and are defined by a large number of parameters. Deep neural networks have the ability to capture a wide array of observed phenomena, but often lack interpretability in terms of mechanism.

The multiplicity of cellular dynamical timescales leads to multiple timescales of firing rate adaptation [13,14]. A signature of these multiple timescales is that adaptation to a step change in stimulus often does not occur with a fixed time constant, but with a time constant that scales with the frequency of change in the stimulus. When driven with continuously varying stimuli such as sine waves, these responses have been found to be compactly described by a rate response to the stimulus that has power law properties [14,15], or acts like fractional differentiation applied to the input [14,16]. In the case of cortical neurons, this transformation serves to decorrelate output, which may improve coding efficiency of the system [15].
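To make the fractional-differentiation description concrete, here is a minimal numpy sketch (ours, not code from [14,15]; the order α and the phase-estimation details are illustrative). A Grünwald-Letnikov fractional derivative of order α applied to sinusoids of different periods produces the signature discussed above: a phase lead of roughly α × 90° that is nearly independent of stimulus period.

```python
import numpy as np

def frac_deriv(x, alpha, dt):
    """Grünwald-Letnikov fractional derivative of order alpha on a uniform grid.
    The binomial weights obey w_k = w_{k-1} * (k - 1 - alpha) / k."""
    w = np.empty(len(x))
    w[0] = 1.0
    for k in range(1, len(x)):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return np.convolve(x, w)[:len(x)] / dt**alpha

dt, alpha = 0.01, 0.2                    # time step (s), fractional order
for period in (1.0, 4.0, 16.0):          # stimulus periods (s)
    t = np.arange(0.0, 10 * period, dt)
    omega = 2 * np.pi / period
    s = np.sin(omega * t)
    r = frac_deriv(s, alpha, dt)
    half = slice(len(t) // 2, None)      # drop the onset transient
    z_s = np.mean(s[half] * np.exp(-1j * omega * t[half]))
    z_r = np.mean(r[half] * np.exp(-1j * omega * t[half]))
    print(f"period {period:4.0f} s: phase lead "
          f"{np.degrees(np.angle(z_r / z_s)):5.1f} deg "
          f"(alpha * 90 = {alpha * 90:.0f} deg)")
```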

Using models to probe adaptation
How can one model adaptation? There are two main classes of models in use. One may try to develop dynamical models of a system, approximate dynamical equations for neural activity r(t) that may be driven by external variables s(t), generically:

dr/dt = F(r, s)   (1)

The function F represents a general nonlinear, possibly multidimensional, set of dynamical equations, such as the Hodgkin-Huxley equations driven by a current input. Alternatively, statistical models evaluate and express the correlative relationships between the firing of a neuron or population, r(t), at the current time t, and external variables s_τ(t) in a time window extending over a period τ relative to t,

r(t) = f(s_τ(t))   (2)

for some functional form f.
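The dichotomy can be illustrated with a toy example (our sketch; all parameters and functional forms are placeholders, not fitted models). The first block integrates a dynamical system in the spirit of Eq. (1), with an explicit slow adaptation variable; the second applies a statistical description in the spirit of Eq. (2), a fixed filter and static nonlinearity over a short stimulus window.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.001
t = np.arange(0.0, 20.0, dt)
s = rng.normal(0.0, 1.0, len(t))           # shared white-noise stimulus s(t)

# Eq. (1): dynamical model, dr/dt = F(r, s), with an explicit adaptation variable
r = np.zeros(len(t))                       # firing rate
a = np.zeros(len(t))                       # slow adaptation variable
tau_r, tau_a, g = 0.01, 1.0, 2.0
for i in range(len(t) - 1):
    drive = max(s[i] - g * a[i], 0.0)               # adaptation subtracts from drive
    r[i + 1] = r[i] + dt / tau_r * (drive - r[i])
    a[i + 1] = a[i] + dt / tau_a * (r[i] - a[i])    # a(t) tracks recent output

# Eq. (2): statistical model, r(t) = f(s_tau(t)), fixed filter plus nonlinearity
lags = np.arange(0.0, 0.1, dt)             # window of length tau = 100 ms
k = np.exp(-lags / 0.02) * dt              # linear filter over the window
f = lambda u: np.maximum(u, 0.0)           # static nonlinearity
r_stat = f(np.convolve(s, k)[:len(t)])

# the two descriptions can be compared on the same stimulus
print("correlation:", np.corrcoef(r[5000:], r_stat[5000:])[0, 1])
```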

While at some level equivalent, the first class of models aims to explicitly capture dynamical mechanisms while the second is often thought of as defining 'coding'. Examples of Eq. (2) include simple, phenomenological models such as linear-nonlinear (LN) models [17]. Typically, fitting such models under changing stimulus conditions reveals changes in a neuron's response properties (Figure 1) [18–21]. A general way to capture such adaptive effects is to extend Eq. (2) to incorporate parameters θ describing the stimulus-response mapping, such as the shape of a filter or the gain of a tuning curve, which may themselves be time-varying:

r(t) = f(s_τ(t); θ_T(t))   (3)

Here, the time variation of θ_T captures the adaptation of the system: it is a function of the longer-term history of the stimulus, s_T(t), where T is much longer than τ:

θ_T(t) = g(s_T(t))   (4)

Ideally a complete system characterization according to Eqs. (3–4) would include an evaluation of how the model parameters, θ, depend on longer term stimulus properties. Studies often identify slowly varying properties of stimuli such as mean and variance [22,3] or offset and envelope [23] as examples of relevant long timescale properties. For example, in the phenomenon of gain scaling discussed above, the system's gain curves exhibit a dynamic range that is modulated by the stimulus' time-varying standard deviation σ(t) [13]. In the fly neuron H1, at the same time, the mean slow-varying firing rate R behaves as a fractional derivative of σ(t), so in that case the full response model can be written as [3]

r(t) = R(σ(t)) f(s_τ(t)/σ(t))   (5)

This can be considered as a temporally multiplexed code [13,23,24], whereby detailed spike timing conveys information about short timescale stimulus variations, while the slow-varying rate conveys information about σ(t).
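A minimal simulation of Eq. (5) (our sketch; the filter, nonlinearity, and rate function R are placeholders rather than the fitted H1 model) makes the multiplexing explicit: the fast pathway sees only the contrast-normalized stimulus, so its statistics are invariant across contrast conditions, while the slow rate R alone carries σ(t).

```python
import numpy as np

rng = np.random.default_rng(1)
dt = 0.001
t = np.arange(0.0, 8.0, dt)
sigma = np.where((t % 4.0) < 2.0, 0.5, 2.0)     # sigma(t): contrast switches every 2 s
s = sigma * rng.normal(0.0, 1.0, len(t))        # stimulus with time-varying std

lags = np.arange(0.0, 0.1, dt)
k = np.exp(-lags / 0.02) * dt                   # short filter defining s_tau(t)
f = lambda u: np.maximum(u, 0.0)                # contrast-normalized fast pathway
R = 10.0 * sigma ** 0.5                         # placeholder slow rate R(sigma)

r = R * f(np.convolve(s / sigma, k)[:len(t)])   # Eq. (5): r = R(sigma) * f(s_tau / sigma)

# the fast pathway is contrast-invariant; only R carries sigma(t)
print("mean rate, low contrast:", r[sigma == 0.5].mean())
print("mean rate, high contrast:", r[sigma == 2.0].mean())
```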

Models and mechanisms
It is important to note that a system that does not have any explicit adaptive dynamics can demonstrate dependence on stimulus conditions when viewed through the lens of an LN model [25,26]. This effect has been demonstrated in model neurons without adaptive currents, including leaky integrate-and-fire neurons [27] and simple conductance-based models [25], and in computational instantiations of the Reichardt correlator [28]. In these models, fitted linear filters and nonlinearities show changes between stimulus conditions despite no slow temporal dynamics or parameter changes. Further, in some cases one can analytically predict the changes in filter time constants by analysis of the underlying system [29]. Thus the 'mechanism' for these changes need not be specific channel or synaptic dynamics, but rather an understanding of how the components of the LN model description arise from the underlying neural dynamics [29,30,31].

A key property of overt adaptation mechanisms is that response properties are modulated by the history of activity of the system. A coding model framework that takes this explicitly into account might therefore be expected to better fit the timecourse of adaptive responses. Recurrent linear-nonlinear models, or generalized linear models (GLMs), incorporate response history dependence into an LN model framework (Figure 1) [17]. In the GLM, the gain of the response to a stimulus feature is multiplicatively modulated by the history of firing via a 'spike history filter'. This explicit history dependence allows GLMs to straightforwardly capture simple adaptive phenomena, like spike frequency adaptation, by accumulating negative feedback with each subsequent spike [32].
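The following sketch simulates spikes from a GLM of this form (ours; the filters and parameters are illustrative). A filtered step stimulus and an accumulated spike-history term pass through an exponential nonlinearity to set a Poisson firing probability; because the history filter is negative and decays slowly, the response to the step is initially large and then adapts.

```python
import numpy as np

rng = np.random.default_rng(2)
dt = 0.001
t = np.arange(0.0, 3.0, dt)
s = (t > 0.5).astype(float)                    # step stimulus at t = 0.5 s

k_lags = np.arange(0.0, 0.05, dt)
k = 400.0 * np.exp(-k_lags / 0.01) * dt        # stimulus filter (integral ~ 4)
h_lags = np.arange(dt, 0.5, dt)
h = -4.0 * np.exp(-h_lags / 0.15)              # suppressive spike-history filter

ks = np.convolve(s, k)[:len(t)]                # filtered stimulus
spikes = np.zeros(len(t))
for i in range(len(t)):
    past = spikes[max(0, i - len(h)):i][::-1]  # most recent spike first
    hist = h[:len(past)] @ past                # spike-history term
    rate = np.exp(np.log(5.0) + ks[i] + hist)  # exponential nonlinearity (spikes/s)
    spikes[i] = float(rng.random() < rate * dt)  # Bernoulli approximation to Poisson

onset = spikes[(t > 0.5) & (t < 0.7)].mean() / dt
adapted = spikes[t > 2.0].mean() / dt
print(f"onset rate ~{onset:.0f} spikes/s -> adapted rate ~{adapted:.0f} spikes/s")
```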

GLMs can also capture more complex adaptive phenomena (Figure 2a). A GLM including selectivity to a single stimulus feature and a single spike history filter can demonstrate gain rescaling, such that all nonlinearities overlay when normalized by stimulus contrast, as observed experimentally [30]. These apparent adjustments in the nonlinearity occur without any corresponding adjustment or explicit adaptation of the nonlinearity in the GLM. Further, by allowing a sufficiently long history filter, GLMs are able to capture experimentally observed multiple-timescale dynamics. A GLM can exhibit signatures of fractional differentiation, such as near-constant phase lead, to the same extent as a Hodgkin-Huxley model with multiple afterhyperpolarization currents with a range of timescales (Figure 2b).

Figure 2
Simple models can capture adaptive phenomena.
(a) A GLM fit to responses of a Hodgkin-Huxley neuron that exhibits gain scaling, the adjustment of the dynamic range of the input/output curve to the range of the stimulus, also shows gain scaling. Left: Input-output functions for the GLM, plotted as a function of stimulus value for four different stimulus standard deviations (σ). Right: Input-output functions plotted by normalized stimulus values overlay, indicating gain rescaling.
(b) A GLM fit to responses of a Hodgkin-Huxley neuron with three different afterhyperpolarization currents also shows signatures of fractional differentiation, including near-constant phase lead across frequencies (top) and a power law relationship between gain and stimulus period (bottom).

While GLMs capture a wide array of adaptive phenomena, what insights, if any, do we gain into neural systems by fitting GLMs to data? One might hope that this formulation could allow one to separate out stimulus-driven and history-driven effects and determine, for example, a single characteristic spike-triggering stimulus feature that is invariant under different stimulus conditions, while the spike history filter might isolate activity-driven internal neural dynamics. However, generally this is not the case. Upon a change in mean input, for example, while the presence of the spike history filter captures the dynamics of the change in firing rate, in general both the form of the optimal spike history filter and of the stimulus feature depend on the stimulus condition [22,32]. Thus, there is no clear interpretation of these model components in terms of cellular mechanism.

Coding on multiple timescales

More complex models can provide a richer framework to fit multiple-timescale response properties in experimental data. For example, recent work has shown that cortical neurons exhibit a surprisingly complex form of temporal sequence sensitivity: they adapt to, apparently 'binding', repeated lengthy sequences of images, and respond strongly when the sequence is interrupted [33]. This behavior can be interpreted as a form of predictive coding, whereby a neural system responds with high amplitude to an input that violates a recently built expectation, here the missing arrival of an element of a sequence. These apparently sophisticated responses may also arise from multiple-timescale adaptive dynamics. Latimer et al. show that similar offset responses are observed in multiple cortical areas [34] (like those observed in the retina [35]), and can be predicted from an extension of an LN model that nonlinearly integrates contributions from temporal 'subunits'. This model combines 3–4 stimulus filters that span a wide range of timescales—up to many seconds. In contrast to Eqs. (3–4), this model describes the response, in a single step, as a function of the entire longer-term history of the stimulus, s_T(t),

r(t) = h(s_T(t))   (6)

albeit where h is a multidimensional model that can accommodate effects such as multiplicative modulation.

This model describes coding of multiple stimulus timescales on the same footing; long-term stimulus properties are not explicitly parsed out, as in Eq. (5). However, fitting such a model requires fitting to an increasingly high dimensional stimulus as T increases. It may be more efficient to fit models like Eq. (6) to inputs in which slow-varying statistical components have been extracted and specified at lower temporal resolution.
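A sketch in the spirit of Eq. (6) (ours; the filters and the combination rule are illustrative, not the subunit model fitted in [34]): several filters spanning fast to slow timescales are applied to the same stimulus history and combined through a single nonlinear map h, so that slow stimulus statistics enter the response through the same mapping as fast fluctuations rather than being parsed out as in Eq. (5).

```python
import numpy as np

rng = np.random.default_rng(3)
dt = 0.01
t = np.arange(0.0, 60.0, dt)
env = 1.0 + 0.75 * np.sign(np.sin(2 * np.pi * t / 20.0))  # slow contrast envelope
s = env * rng.normal(0.0, 1.0, len(t))

def expfilt(x, tau):
    """Unit-area exponential filter with time constant tau (seconds)."""
    lags = np.arange(0.0, 5 * tau, dt)
    k = np.exp(-lags / tau) * dt / tau
    return np.convolve(x, k)[:len(x)]

x_fast = expfilt(s, 0.05)        # fast stimulus feature (tens of ms)
v_mid = expfilt(s**2, 1.0)       # stimulus power on a ~1 s timescale
v_slow = expfilt(s**2, 10.0)     # ... and on a ~10 s timescale

# a single nonlinear map h over the full history, as in Eq. (6): the slow
# channels set the gain of the fast channel rather than being parsed out
r = np.maximum(x_fast, 0.0) / np.sqrt(0.1 + 0.5 * v_mid + 0.5 * v_slow)
print("rate std, low/high contrast:", r[env < 1].std(), r[env > 1].std())
```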

Network models
One can extend the statistical modeling approach to arbitrarily complex models. Recently, McIntosh et al. [36] used population data from simultaneously recorded retinal ganglion cells to fit a multilayer convolutional neural network (CNN), analogous to cascaded LN operations [35]. After fitting this model using natural movies, the model was able to predict responses to natural scenes and, perhaps surprisingly, to generalize responses to other stimulus categories. As a feed-forward model, the CNN model framework of [36] lacks dynamics beyond the 400 ms temporal filtering of the initial layer and so does not reproduce the full range of long-timescale adaptive dynamics that the retina exhibits. However, the inclusion of a long short-term memory (LSTM) unit, a standard and powerful method for incorporating long-timescale memory in networks, allows the network to capture firing rate adaptation following a change in stimulus contrast.

Such increasingly complex models are able to capture complex response properties, but what, if anything, is learned from them about the system? Multilayer network models of the retina can exhibit properties similar to retinal neurons without explicitly training the network to include these properties. For example, a two-layer

subunit model of the retina showed nodes in the first layer to be consistent with properties of bipolar cells, such as receptive field size and temporal structure [37], and a CNN appears to self-organize into multiple cell types resembling those seen in the retina [38]. It is possible that this approach will eventually provide new and testable predictions about network structure. However, models like CNNs are sufficiently large and complex that functional interpretation of the network becomes difficult or impossible without additional analysis. Even when a network model captures a system's complex response properties, it does not 'reveal' these computations: rather, the model must be probed, like the experimental system, with stimuli designed to test for the presence of these properties.

Dynamical models
An alternative approach to the modeling of adaptation is to construct dynamical models as in Eq. (1). Simplified dynamical models can capture fundamental dynamical mechanisms at the single neuron level [39]. Recurrent network models can produce adaptive dynamics without incorporating explicit neuronal adaptation mechanisms [40]. However, recent work suggests that artificial spiking neural networks can improve performance by incorporating neuronal adaptation, while at the same time reducing the total number of spikes produced, improving network efficiency [41].

Hybrid approaches combine the interpretability of coding-type models with dynamical mechanism by augmenting statistical models with a dynamical component that allows for time dependence. This could model a specific biophysical mechanism like synaptic replenishment, such as the linear-nonlinear-kinetic model of Ozuysal and Baccus [42] or nonlinear threshold dynamics [43]. Such models can provide mechanistic predictions that can be tested against data [44]. One might also incorporate a postulated computational principle such as sensitivity of the system to multiple dimensions [26,28] or divisive suppression [45,46]. Such models can act as a bridge between low-level mechanism and identification of algorithm. As for the statistical models above, one must still probe the system with simplified stimuli in order to identify algorithmic computational motifs.
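As a concrete example of such a hybrid, the sketch below appends a first-order kinetic stage, loosely in the spirit of the linear-nonlinear-kinetic model [42] but much simplified, to an LN cascade: a depletable resource x(t) is consumed in proportion to the LN drive and recovers with a fixed time constant, so the output adapts to a contrast step through an explicit, mechanistically interpretable state variable.

```python
import numpy as np

rng = np.random.default_rng(4)
dt = 0.001
t = np.arange(0.0, 12.0, dt)
sigma = np.where(t < 6.0, 0.3, 1.5)              # contrast step at t = 6 s
s = sigma * rng.normal(0.0, 1.0, len(t))

lags = np.arange(0.0, 0.1, dt)
k = 200.0 * np.exp(-lags / 0.02) * dt            # linear stage
u = np.maximum(np.convolve(s, k)[:len(t)], 0.0)  # static nonlinearity (rectification)

# kinetic stage: a resource x is consumed in proportion to the drive u and
# recovers with time constant tau_rec, as in a vesicle-depletion scheme
x = np.ones(len(t))
tau_rec, gain = 2.0, 5.0
r = np.zeros(len(t))
for i in range(len(t) - 1):
    r[i] = gain * u[i] * x[i]                    # output is drive scaled by resource
    x[i + 1] = x[i] + dt * ((1.0 - x[i]) / tau_rec - u[i] * x[i])
r[-1] = gain * u[-1] * x[-1]

print(f"rate after step: {r[(t > 6.0) & (t < 6.3)].mean():.2f}, "
      f"adapted rate: {r[t > 10.0].mean():.2f}")
```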

Outlook
We have discussed here a number of model frameworks that allow one to capture how adaptive dynamics shape responses to naturalistic inputs, including time-dependent coding properties that are modulated by activity and stimulus properties. More generally, these models account for encoding of multiple stimulus timescales. In moving from explicit coding models to more complex statistical or dynamical models, one may gain predictive accuracy but lose interpretability, in particular in terms of theories of efficient or predictive coding.

All of the characterizations of adaptation we have discussed have primarily been applied to single neurons, but current recording strategies enable a deeper consideration of the meaning and role of adaptation at the network level. Some studies examine how the cumulative effects of single neuron adaptation affect population level coding [47] or arise from circuit mechanisms [48,49,50], but there has been little work to date on exploring adaptive population codes, where coding elements are defined at a multineuronal level [51]. Understanding the properties of population codes on these key intermediate timescales, so critical for natural environments, is an interesting avenue for future work.

Conflict of interest statement
Nothing declared.

Acknowledgements
We thank Kenneth Latimer for useful discussions and for contributing results on the ability of GLMs to capture adaptive properties (Figure 2), and Máté Lengyel for valuable comments on the manuscript. ALF was supported by the Washington Research Foundation UW Institute for Neuroengineering, the Human Frontier Science Program and the Simons Foundation's Collaboration on the Global Brain.

References and recommended reading
Papers of particular interest, published within the period of review, have been highlighted as:
• of special interest
•• of outstanding interest

1. Brette R: Is coding a relevant metaphor for the brain? Behav Brain Sci 2019:1-44. http://dx.doi.org/10.1017/S0140525X19000049.
This provocative and thoughtful article challenges the usefulness of the concept of coding in neural systems.

2. Wark B, Lundstrom BN, Fairhall AL: Sensory adaptation. Curr Opin Neurobiol 2007, 17:423-429.

3. Fairhall AL: Adaptation and natural stimulus statistics. The Cognitive Neurosciences. edn 5. MIT Press; 2014:424-449.

4. Weber AI, Krishnamurthy K, Fairhall AL: Coding principles in adaptation. Annu Rev Vis Sci 2019, 5:427-449. http://dx.doi.org/10.1146/annurev-vision-091718-014818.
This review summarizes and links the phenomenology and various theoretical interpretations of adaptation.

5. Reinartz S, Biro I, Gal A, Giugliano M, Marom S: Synaptic dynamics contribute to long-term single neuron response fluctuations. Front Neural Circuits 2014, 8.

6. Zenke F, Poole B, Ganguli S: Continual learning through synaptic intelligence. Proc 34th Int Conf Mach Learn 2017, 70.

7. Clemens J, Ozeri-Engelhard N, Murthy M: Fast intensity adaptation enhances the encoding of sound in Drosophila. Nat Commun 2018, 9:1-15.

8. Cooke JE, King AJ, Willmore BDB, Schnupp JWH: Contrast gain control in mouse auditory cortex. J Neurophysiol 2018, 120:1872-1884. http://dx.doi.org/10.1152/jn.00847.2017.

9. Liu B, Macellaio MV, Osborne LC: Efficient sensory cortical coding optimizes pursuit eye movements. Nat Commun 2016, 7:12759.

10. Gepner R, Wolk J, Wadekar DS, Dvali S, Gershow M: Variance adaptation in navigational decision making. eLife 2018, 7:e37945.

11. Rasmussen RG, Schwartz A, Chase SM: Dynamic range adaptation in primary motor cortical populations. eLife 2017, 6:1-20.

12. Rustichini A, Conen KE, Cai X, Padoa-Schioppa C: Optimal coding and neuronal adaptation in economic decisions. Nat Commun 2017, 8:1208.

13. Marom S: Slow changes in the availability of voltage-gated ion channels: effects on the dynamics of excitable membranes. J Membr Biol 1998, 161:105-113.

14. Lundstrom BN, Higgs MH, Spain WJ, Fairhall AL: Fractional differentiation by neocortical pyramidal neurons. Nat Neurosci 2008, 11:1335-1342.

15. Pozzorini C, Naud R, Mensi S, Gerstner W: Temporal whitening by power-law adaptation in neocortical neurons. Nat Neurosci 2013, 16:942-948.

16. Lundstrom BN, Fairhall AL, Maravall M: Multiple timescale encoding of slowly varying whisker stimulus envelope in cortical and thalamic neurons in vivo. J Neurosci 2010, 30:5071-5077.

17. Aljadeff J, Lansdell BJ, Fairhall AL, Kleinfeld D: Analysis of neuronal spike trains, deconstructed. Neuron 2016, 91:221-259.
This primer compares different models used to characterize neural spike trains, focusing on variants of the linear-nonlinear cascade framework, including models that incorporate multiple stimulus features (such as through spike-triggered covariance methods) and models that incorporate history dependence (such as generalized linear models).

18. Smirnakis S, Berry MJ, Warland DK, Bialek W, Meister M: Adaptation of retinal processing to image contrast and spatial scale. Nature 1997, 386:69-73.

19. Fairhall AL, Lewen G, Bialek W, de Ruyter van Steveninck R: Efficiency and ambiguity in an adaptive neural code. Nature 2001, 412:787-792.

20. Baccus SA, Meister M: Fast and slow contrast adaptation in retinal circuitry. Neuron 2002, 36:909-919.

21. Bair W, Movshon JA: Adaptive temporal integration of motion in direction-selective neurons in macaque visual cortex. J Neurosci 2004, 24:7305-7323.

22. Mease RA, Lee S, Moritz AT, Powers RK, Binder MD, Fairhall AL: Context-dependent coding in single neurons. J Comput Neurosci 2014, 37:459-480.

23. Hill DN, Curtis JC, Moore JD, Kleinfeld D: Primary motor cortex reports efferent control of vibrissa position on multiple time scales. Neuron 2011, 72:344-356.

24. Panzeri S, Brunel N, Logothetis NK, Kayser C: Sensory neural codes using multiplexed temporal scales. Trends Neurosci 2010, 33:111-120.

25. Gaudry KS, Reinagel P: Contrast adaptation in a nonadapting LGN model. J Neurophysiol 2007, 98:1287-1296.

26. Hong S, Agüera y Arcas B, Fairhall AL: Single neuron computation: from dynamical system to feature detector. Neural Comput 2007, 19:3133-3172.

27. Pillow JW, Simoncelli EP: Biases in white noise analysis due to non-Poisson spike generation. Neurocomputing 2003, 52–54:109-115.

28. Borst A, Flanagin VL, Sompolinsky H: Adaptation without parameter change: dynamic gain control in motion detection. Proc Natl Acad Sci U S A 2005, 102:6172-6176.

29. Famulare M, Fairhall AL: Feature selection in simple neurons: how coding depends on spiking dynamics. Neural Comput 2010, 22:581-598.

30. Mease RA, Famulare M, Gjorgjieva J, Moody WJ, Fairhall AL: Emergence of adaptive computation by single neurons in the developing cortex. J Neurosci 2013, 33:12154-12170.

31. Ostojic S, Brunel N: From spiking neuron models to linear-nonlinear models. PLoS Comput Biol 2011, 7:e1001056. http://dx.doi.org/10.1371/journal.pcbi.1001056.

32. Weber AI, Pillow JW: Capturing the dynamical repertoire of single neurons with generalized linear models. Neural Comput 2017, 29:3260-3289.

33. Homann J, Koay SA, Glidden AM, Tank DW, Berry MJ: Predictive coding of novel versus familiar stimuli in the primary visual cortex. bioRxiv 2017. http://dx.doi.org/10.1101/197608.
Neurons in mouse primary visual cortex show adaptation to repeated presentations of the same sequence of visual stimuli and elevated responses to novel or misplaced stimuli in the sequence.

34. Latimer KW, Barbera D, Sokoletsky M, Awwad B, Katz Y, Nelken I, Lampl I, Fairhall AL, Priebe NJ: Multiple timescales account for adaptive responses across sensory cortices. bioRxiv 2019. http://dx.doi.org/10.1101/700062.

35. Schwartz G, Harris R, Shrom D, Berry MJ: Detection and prediction of periodic patterns by the retina. Nat Neurosci 2007, 10:552-554.

36. McIntosh LT, Maheswaranathan N, Nayebi A, Ganguli S, Baccus SA: Deep learning models of the retinal response to natural scenes. Adv Neural Inf Process Syst 2016, 29:1361-1369.
A multi-layer convolutional neural network is shown to capture responses of retinal ganglion cells to natural movies better than other classes of models, including LN models and GLMs. Inclusion of a recurrent layer is necessary to capture contrast adaptation effects.

37. Maheswaranathan N, Kastner DB, Baccus SA, Ganguli S: Inferring hidden structure in multilayered neural circuits. PLoS Comput Biol 2018, 14:1-30.

38. Ocko SA, Lindsey J, Ganguli S, Deny S: The emergence of multiple retinal cell types through efficient coding of natural movies. Adv Neural Inf Process Syst 2018. http://dx.doi.org/10.1101/458737.

39. Benda J, Herz AVM: A universal model for spike-frequency adaptation. Neural Comput 2003, 15:2523-2564.

40. del Mar Quiroga M, Morris AP, Krekelberg B: Adaptation without plasticity. Cell Rep 2016, 17:58-68.
Upon exposure to a particular stimulus orientation, a recurrent neural network model of orientation tuning shows tuning curve shifts similar to those observed in primary visual cortex. These tuning curve shifts occur without any change in network properties or explicit adaptive mechanisms.

41. Zambrano D, Nusselder R, Scholte HS, Bohte S: Sparse computation in adaptive artificial spiking neural networks. Front Neurosci 2019. http://dx.doi.org/10.3389/fnins.2018.00987.

42. Ozuysal Y, Baccus SA: Linking the computational structure of variance adaptation to biophysical mechanisms. Neuron 2012, 73:1002-1015.

43. Mensi S, Hagens O, Gerstner W, Pozzorini C: Enhanced sensitivity to rapid input fluctuations by nonlinear threshold dynamics in neocortical pyramidal neurons. PLoS Comput Biol 2016, 12:e1004761.

44. Ozuysal Y, Kastner DB, Baccus SA: Adaptive feature detection from differential processing in parallel retinal pathways. PLoS Comput Biol 2018, 14:e1006560.
The authors use a novel model formulation to identify different mechanisms in On and Off pathways of the retina that contribute to contrast adaptation.

45. Schwartz O, Simoncelli EP: Natural sound statistics and divisive normalization in the auditory system. Adv Neural Inf Process Syst 2000, 13:27-30.

46. Cui Y, Wang YV, Park SJH, Demb JB, Butts DA: Divisive suppression explains high precision firing and contrast adaptation in retinal ganglion cells. eLife 2016, 5:1-25.
A single model that incorporates a divisive interaction between excitatory and suppressive LN components can account for ganglion cell responses at high and low contrast.

47. Harper N, McAlpine D: Optimal neural population coding of an auditory spatial cue. Nature 2004, 430:682-686.

48. Solomon SG, Kohn A: Moving sensory adaptation beyond suppressive effects in single neurons. Curr Biol 2014, 24:R1012-R1022.

49. Urdapilleta E, Samengo I: Effects of spike-triggered negative feedback on receptive-field properties. J Comput Neurosci 2015, 38:405-425.

50. Whitmire CJ, Stanley GB: Rapid sensory adaptation redux: a circuit perspective. Neuron 2016, 92:298-315.
This review discusses mechanisms that give rise to adaptive phenomena, with emphasis on how mechanisms operate at, and potentially interact on, multiple spatial scales.

51. Snow M, Coen-Cagli R, Schwartz O: Adaptation in the visual cortex: a case for probing neuronal populations with natural stimuli. F1000Research 2017, 6:1246.