Thermodynamic Formalism in Neuronal Dynamics and Spike Train Statistics
Rodrigo Cofré 1,*, Cesar Maldonado 2 and Bruno Cessac 3

1 CIMFAV-Ingemat, Facultad de Ingeniería, Universidad de Valparaíso, Valparaíso 2340000, Chile
2 IPICYT/División de Matemáticas Aplicadas, San Luis Potosí 78216, Mexico; [email protected]
3 Inria Biovision Team and Neuromod Institute, Université Côte d’Azur, 06901 CEDEX Inria, France; [email protected]
* Correspondence: [email protected]

Received: 9 October 2020; Accepted: 15 November 2020; Published: 23 November 2020

Entropy 2020, 22, 1330; doi:10.3390/e22111330

Abstract: The Thermodynamic Formalism provides a rigorous mathematical framework for studying quantitative and qualitative aspects of dynamical systems. At its core, there is a variational principle that corresponds, in its simplest form, to the Maximum Entropy principle. It is used as a statistical inference procedure to represent, by specific probability measures (Gibbs measures), the collective behaviour of complex systems. This framework has found applications in different domains of science. In particular, it has been fruitful and influential in the neurosciences. In this article, we review how the Thermodynamic Formalism can be exploited in the field of theoretical neuroscience, as a conceptual and operational tool, in order to link the dynamics of interacting neurons and the statistics of action potentials from either experimental data or mathematical models. We comment on perspectives and open problems in theoretical neuroscience that could be addressed within this formalism.

Keywords: Thermodynamic Formalism; neuronal networks dynamics; maximum entropy principle; free energy and pressure; linear response; large deviations; ergodic theory

1. Introduction

Initiated by Boltzmann [1,2], the goal of statistical physics was to establish a link between the microscopic mechanical description of interacting particles in a gas or a fluid, and the macroscopic description provided by thermodynamics [3,4].
Although this program is, even nowadays, far from being completed [1,5], the work of Boltzmann and his successors opened new avenues of research, not only in physics but also in mathematics. In particular, the term “ergodic”, coined by Boltzmann [1], inaugurated an important branch of mathematics that provides a rigorous link between the description of dynamical systems in terms of their trajectories and the description in terms of statistics of orbits and, more generally, between dynamical systems theory and probability theory. At the core of ergodic theory, there is a set of “natural” dynamically invariant probability measures in the phase space, somewhat generalising the Liouville distribution for conservative systems, with strong analogies with Gibbs distributions in statistical physics [6,7]. This strong connection, in particular, gave birth to the so-called Thermodynamic Formalism.

Thermodynamic Formalism was introduced in the 1970s, primarily by Yakov Sinai, David Ruelle, and Rufus Bowen [8–10]. Its development initially served to derive rigorous criteria characterising the existence and uniqueness of Gibbs states in the infinite volume limit. Although Gibbs states and equilibrium states (see Section 2.1.1) are naturally defined in finite volume systems, the extension to infinite volume (the “thermodynamic limit”) is far from straightforward. Indeed, it does not follow from the Carathéodory or Kolmogorov extension theorems [11,12] that the equilibrium states of the infinite volume define a measure, as there is no way to express the marginals associated with an infinite-volume Gibbs measure without making explicit reference to the measure itself [12].
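To fix ideas, the variational principle mentioned in the Abstract can be stated, in its simplest ergodic-theoretic form, as follows (a standard formula of the theory, recalled here for orientation; $h(\mu)$ denotes the Kolmogorov–Sinai entropy of an invariant measure $\mu$, and $\varphi$ is a potential):

```latex
P(\varphi) \;=\; \sup_{\mu \in \mathcal{M}_{\mathrm{inv}}} \left\{\, h(\mu) + \int \varphi \, d\mu \,\right\},
```

where $P(\varphi)$ is the topological pressure and $\mathcal{M}_{\mathrm{inv}}$ is the set of invariant probability measures. The measures attaining the supremum are the equilibrium states, and for $\varphi \equiv 0$ the principle reduces to maximising the entropy.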
By considering conditional probabilities rather than marginals, Dobrushin, Lanford, and Ruelle were led to a different consistency condition that allows the construction of infinite-volume Gibbs measures [7]. In the context of dynamical systems, Sinai, Ruelle, and Bowen were able to connect the theory of hyperbolic (Anosov) dynamical systems to results in statistical mechanics. Indeed, Sinai found an unexpected link between the equilibrium statistical mechanics of spin systems and the ergodic theory of Anosov systems through a codification using Markov partitions (see Section 2.1.1 for details). This idea was later extended to a much more general class of hyperbolic systems [10,13,14]. While Thermodynamic Formalism started as a branch of rigorous statistical mechanics, it is nowadays viewed by different communities as a branch of dynamical systems or of stochastic processes.

There have been a few attempts to use Thermodynamic Formalism in ways other than as a natural mathematical foundation of statistical mechanics, for example, to study population dynamics [15,16], self-organised criticality [17], the relative abundance of amino acids across diverse proteomes [18], the difference between introns and exons in genetic sequences [19,20], coding-sequence density estimation in genomes [21], and the statistics of spike trains in neural systems [22–28], which is the main topic of this review.

Neuronal networks are biological systems whose components, such as neurons, synapses, ionic channels, ..., are ruled by the laws of physics and are described by differential equations; hence, they are dynamical systems. On the other hand, because of their large dimensionality, it is natural to attempt to characterise neuronal networks using methods inspired by statistical physics, for example, mean-field methods [29–32], density methods [33], or Fokker–Planck equations [34].
Most neurons produce short electrical pulses, called action potentials or spikes, and it is widely believed that the collective spike trains emitted by neuronal networks encode information about the underlying dynamics and the response to stimuli [35–37]. Thus, researchers have devoted a great deal of effort to understanding the correlation structure in the statistics of spike trains [38–45]. Because spikes can be represented as binary variables, it is natural to adapt methods and concepts from statistical physics, and more specifically from the statistical physics of spin systems, to analyse spike train statistics. There have been many successful attempts in this direction; all of the approaches we know of are based on variational principles.

The most direct connection from statistical physics to spike train statistics is made via the Maximum Entropy principle, which has attracted a lot of attention in the past years [38,39,43,46,47] (see [47] for a physicist-oriented review). Unfortunately, most of these articles are limited to the original form of an Ising spin-glass potential (pairwise interactions with random couplings) or variants of it with higher-order interactions [40–42], where successive times are independent, thereby neglecting the time correlations and causality one may expect from a network of interacting neurons (exceptions can be found in [48–50]). We focus on this approach, and on its extension to causal networks, in the present review, as it links naturally with the Thermodynamic Formalism.

Another approach, which actually appeared earlier in mathematical neuroscience, is dynamic mean-field theory, which takes dynamics and time correlations into account. In this approach, which originated in quantum field theory and the Martin–Siggia–Rose formalism (statistical dynamics of classical systems [51]), the variational principle is expressed via the minimisation of an effective action containing the equations of the dynamics.
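To illustrate the static pairwise Maximum Entropy (Ising-like) approach discussed above, the following sketch fits a model $P(s) \propto \exp(h \cdot s + \frac{1}{2} s^\top J s)$ to binary spike patterns by moment matching. This is a minimal illustration under our own assumptions (the function name and the plain gradient-ascent scheme are ours, not the authors'; exact enumeration of the $2^n$ patterns is tractable only for a handful of neurons):

```python
import itertools

import numpy as np

def fit_pairwise_maxent(spikes, lr=0.1, steps=5000):
    """Fit a pairwise maximum entropy (Ising-like) model
    P(s) ~ exp(h.s + (1/2) s^T J s) to binary spike patterns
    by matching first and second moments (exact enumeration)."""
    n_samples, n = spikes.shape
    mean_emp = spikes.mean(axis=0)            # empirical firing rates
    corr_emp = spikes.T @ spikes / n_samples  # empirical pairwise correlations
    h = np.zeros(n)
    J = np.zeros((n, n))                      # symmetric couplings, zero diagonal
    states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
    for _ in range(steps):
        # Gibbs weights over all 2^n binary patterns (numerically stabilised)
        energies = states @ h + 0.5 * np.einsum("ki,ij,kj->k", states, J, states)
        p = np.exp(energies - energies.max())
        p /= p.sum()
        mean_mod = p @ states                        # model firing rates
        corr_mod = states.T @ (p[:, None] * states)  # model correlations
        # Gradient ascent on the log-likelihood: parameters move until
        # the model moments match the empirical ones.
        h += lr * (mean_emp - mean_mod)
        dJ = lr * (corr_emp - corr_mod)
        np.fill_diagonal(dJ, 0.0)                    # diagonal handled by h
        J += dJ
    return h, J, p, states
```

At convergence the model reproduces the empirical firing rates and pairwise correlations, which is precisely the Maximum Entropy solution under those constraints; note that, as discussed above, this static model treats successive time bins as independent.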
It was introduced in the field of theoretical neuroscience by Sompolinsky, who initially applied it to the study of spin-glass dynamics [52–54] before analysing neuronal network dynamics [29]. Here, the effective action can be summarised, in the limit where the number of neurons tends to infinity, by dynamic mean-field equations depending on the neurons' firing rates and pairwise time-correlations. Thus, here, the information about spike statistics is contained in the first two moments (Gaussian limit). This approach has inspired many important works where temporal correlations are taken into account; see, for example, [55–57], or the recent work by M. Helias and collaborators (i.e., the review on dynamic mean-field theory in [58] and the link from dynamic mean-field theory to the large deviations of Ben-Arous and Guionnet [59] and [60]).

Another trend attempts to relate neuronal dynamics to spike statistics via a maximum-likelihood approach [61] (we apologise for any approaches we may have overlooked). In our view, the most accomplished work in this direction is Amari's information geometry [62], which makes a beautiful link between probability measures (e.g., exponential ones, like Gibbs measures) and differential geometry.

In this review, we show how Thermodynamic Formalism can be used as an alternative way to study the link between neuronal dynamics and spike statistics, not only from experimental data, but also from mathematical models of neuronal networks, properly handling causal and memory-dependent interactions between neurons and their spikes. It is also based on a variational principle (and, actually, the four methods discussed in this introduction certainly have a common “hat”, large deviations theory [63]), although we depart from this principle at some point where we discuss extensions of