Nonlinear Characteristics of Neural Signals

Z. Zh. Zhanabaev (a,b), T. Yu. Grevtseva (a,b), Y. T. Kozhagulov (c,*)

a National Nanotechnology Open Laboratory, Al-Farabi Kazakh National University, al-Farabi Avenue 71, Almaty 050038, Kazakhstan, [email protected]
b Scientific-Research Institute of Experimental and Theoretical Physics, Al-Farabi Kazakh National University, al-Farabi Avenue 71, Almaty 050038, Kazakhstan, [email protected]
c Department of Solid State Physics and Nonlinear Physics, Faculty of Physics and Technology, Al-Farabi Kazakh National University, al-Farabi Avenue 71, Almaty 050038, Kazakhstan, [email protected]

ABSTRACT

The study is devoted to the definition of generalized metrical and topological (informational-entropic) characteristics of neural signals via their well-known theoretical models. We have shown that the time dependence of neural action potentials is scale invariant. Information and entropy of neural signals have constant values in the case of self-similarity and self-affinity.

Keywords: Neural networks, information, entropy, scale invariance, metrical, topological characteristics.

1. Introduction

The study of artificial neural networks is one of the most urgent problems in information technology. Generally, neural networks should have the basic properties of an ensemble of biological neurons: associativity, resistance to noise, distributed storage and processing of information, and adaptability of the relationships between elements. This has been reported in many reviews and articles [1-4]. Results of recent research point to the possibility of using neural oscillations for the routing of information [5-7].

However, the use of a biological model (for example, a functioning model of a neuron) involves the analysis of a great number of systems of differential equations containing a large number of parameters of the same order [8-10]. Thus, it is necessary to find ways to establish universal, maximally common and simple laws of neuron dynamics. Applications of the methods of nonlinear dynamics to the description of neural networks by use of systems of differential equations are presented in [11,12].

There are certain theoretical models [13-16] describing the following regularities. A neuron moves from the stable state to an excited state because of an external stimulus (external field). This process is characterized by the appearance of spikes and bursts. A neuron can also generate quasi-periodic, chaotic, and noise-like oscillations [16], along with the bursting oscillations.

Nonlinear dynamics [17] describes chaotic phenomena on the basis of fractal and informational-entropic laws. Informational entropy and the fractal dimensions of sets of physical quantities are used as quantitative characteristics of chaos [18,19]. A possibility to describe different types of behavior of neuronal action potentials via fractal measures is described in [16]. So, we can formulate a problem: do any entropic laws describing neuron dynamics exist? Obviously, in the case of self-similarity (the similarity coefficients for the different variables are equal to each other) and self-affinity (the similarity coefficients are different) of fractal measures, we expect to find the existence of intervals with constant entropy. The formulation of two additional laws (fractal and entropic) would help us to build more efficient neural networks for classification, recognition, identification, and process control.

The purpose of the present work is to establish informational-entropic laws for the description of neural oscillations.

2. Equations for action potentials of neurons

There are various models of neural oscillations [13-16]. The FitzHugh-Nagumo model [13] consists of two equations, one governing a voltage-like variable v having a cubic nonlinearity and the other a slower recovery variable w. It can be written as

    dv/dt = v - v^3/3 - w + I_ext,
    tau * dw/dt = v + a - b*w.     (1)

The parameter I_ext models the input current the neuron receives; the parameters a, b > 0 and tau > 0 describe the kinetics of the recovery variable w.
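For illustration, Eq. (1) is easy to integrate numerically. The Python sketch below uses an explicit Euler scheme with the FitzHugh-Nagumo parameter values quoted in the caption of Fig. 2; the step size, run length, and initial conditions are our own illustrative choices, not part of the original model description.

    import numpy as np

    def fitzhugh_nagumo(v0=-1.0, w0=1.0, a=1.5, b=1.0, tau=12.5,
                        i_ext=1.0, dt=0.01, n_steps=200_000):
        """Explicit Euler integration of the FitzHugh-Nagumo equations (1)."""
        v = np.empty(n_steps)
        w = np.empty(n_steps)
        v[0], w[0] = v0, w0
        for n in range(n_steps - 1):
            dv = v[n] - v[n]**3 / 3.0 - w[n] + i_ext   # fast, voltage-like variable
            dw = (v[n] + a - b * w[n]) / tau           # slow recovery variable
            v[n + 1] = v[n] + dt * dv
            w[n + 1] = w[n] + dt * dw
        return v, w

    v, w = fitzhugh_nagumo()   # v contains the model action potential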

2 M as y c  d  x  y (2) z r  s  x  x  z , M      R   M  1  (5) M , MM where: I mimics the membrane input current for biological neurons; b allows one to switch between bursting and spiking M   M behaviors and to control the spiking frequency;  controls the M  1   , (6) speed of variation of the slow variable z, s governs adaptation:  a unitary value of s determines spiking behavior without where indexes and correspond to the norms , accommodation and subthreshold adaptation, xR sets the resting potential of the system. taken as MM*  ,  . According to (5) and (6) we can Modeling of spiking-bursting neural behavior is possible rewrite the formula (4) as via two-dimensional map described in [15]:   x f(,) x y MM1, i1 i i M 0  (7) , M y y  ( x  1)     i1 i i   (1x )  y ; x  0 M

 MM 0 1. f( x , y ) y ; 0  x   y (3)  (8)   1;xy   , At we have M , it corresponds to the   M MM 0 where xn is the fast and yn is the slow dynamical variable. meaning of M 0 . At   0 we have M M MM0 ,0 . It Slow time evolution of is due to small values of the means that the fractal measure defined by its own norm exists in a case when external influence characterized by parameter parameter   0.001. The term  n can be used as the control  is absent. parameter to select the regime of individual behavior.  is a Let us apply the formula (7) and (8) for the description of control parameter of the map. the action potential of neurons. We use the simplified In the work [16] action potentials of neurons were taken in notations of M M MV, M 00 V ,   Ft(), M   VF . the form of fractal measures. We consider a fractal measure as From (7), (8) we obtain the expressions for the potential of a physical quantity characterized by an additive and a a neuron as measurable set. As opposite to the well-known theories of fractals we choose value of scale of measurement not  randomly, but as a relative difference between the desired and Ft() V f V0, V , f V 0 , V  V 0  1  , (9) the external (control) parameter. Hence, the fractal measure is V a nonlinear function depending on the process (object).  The traditional definition of fractal measure M can be V F written as V f V0,FFFFF, V , f V 0, , V  V 0,  1  , (10) Ft()  M M0*  M M ,  D  d ,  0 , (4) where VV, are the threshold excitation potentials. Neurons 0 0,F have inherent properties of quasi-particles, they can’t exist where M is a regular (non-fractal) measure, M is a scale 0 without movement and outside of medium, and can be considered as fluctuation of the medium. Neurons of measurements, M * is norm of M , D is fractal communicate by sending action potentials. Therefore it is dimension of the set of values of M , d is topological natural to assume that the action potentials has modulation - dimension of norm carrier. M is independent on M , periodic nature, and in equations (9), (10) we can take the therefore, measure defining by (1) can be tentatively called the external field in the form linear value. The dependence of M from A assumes the existence of certain conditions in the form of external F(t)  A(1 Bsin(t)) , (11) disturbance, generally it is a order parameter. where is amplitude, coefficient (deeps of modulation), where Sx() is absolute information entropy of an event x frequency modulation of neural oscillations. and S(/) x y is conditional entropy of an event x when For the system consisting of N neurons, we can write the another event is to have occurred. Equation (15) can be following equations in iterative form as y  used for solving of technical problems such as for estimation N k ()()()()k k k k V V1 F ( t ) V , (12) of transmission capacity (in communication channels). ii10 k 1 Informational entropy which is Shannon entropy Sx() can be defined as mean value of information as  N k ()()()()k k k k V V1 V F ( t ) , (13) ii10 S() x Pi ()() x I i x   P i ()ln() x P i x . (16) k 1  ii where k is number of neurons, Vi is action potential of Here, i is number of a cell after segmentation of x . So, let neurons, V is threshold excitation potentials, Ft() – 0 us use (14) as a main definition of information. 
3. Informational-entropic characteristics of neural signals

The conception of information is widely used in cybernetics, genetics, sociology, and so on. The development of open-system physics stimulates the formulation of a universal definition of information which can be used in different branches of science. The definition of an open system contains the conception of information: an open system is a system in which energy, matter, and information are exchanged with its environment.

As usual, the definition of a complex object includes a list of its main properties. Information I(x) for a statistical realization of a physical value x is greater than zero and can be defined for a non-equilibrium state (I(x) != I(x_0) if x != x_0). Let us consider that P(x) is the probability of realization of the variable x. So, the quantity of information can be described as

    I(x) = -ln P(x).     (14)

Reiteration and the non-equilibrium character of a process can be taken into account by the condition 0 < P(x) < 1. A lot of definitions of information have been suggested in different branches of science, but (14) corresponds to all of them.

Information can also be defined as

    I(x/y) = S(x) - S(x/y),     (15)

where S(x) is the absolute information entropy of an event x and S(x/y) is the conditional entropy of an event x when another event y is known to have occurred. Equation (15) can be used for solving technical problems, such as the estimation of transmission capacity (in communication channels).

Informational entropy, which is the Shannon entropy S(x), can be defined as the mean value of information:

    S(x) = sum_i P_i(x) I_i(x) = -sum_i P_i(x) ln P_i(x).     (16)

Here, i is the number of a cell after segmentation of x. So, let us use (14) as the main definition of information.
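In practice, the probabilities P_i(x) in (16) are estimated by segmenting the range of x into cells and counting relative frequencies. A short Python sketch of this uniform-cell version follows; the number of cells and the test signals are arbitrary illustrative choices.

    import numpy as np

    def shannon_entropy(x, n_cells=100):
        """Shannon entropy, Eq. (16): S(x) = -sum_i P_i ln P_i."""
        counts, _ = np.histogram(x, bins=n_cells)    # segmentation of x into cells
        p = counts[counts > 0] / counts.sum()        # empirical probabilities P_i(x)
        return -np.sum(p * np.log(p))

    t = np.linspace(0.0, 100.0, 10_000)
    print(shannon_entropy(np.sin(t)))                          # regular signal
    rng = np.random.default_rng(0)
    print(shannon_entropy(rng.normal(size=10_000)))            # noise-like signal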

According to (16), the value of entropy calculated via a probability density tends to infinity if x is a continuous value. Let us try to define scale-invariant regularities. So, we have to use a new approach to the description of information phenomena: we can use information itself as an independent defining variable. Statistical characteristics of a process can then be described via information, and we can try to find new properties of information independent of the scale of measurement.

Therefore, according to (14), we shall describe the probability of realization of information P(I) as

    P(I) = e^(-I).     (17)

The probability density function f(I) can be defined via the following relations:

    0 <= P(I) <= 1,  0 <= I < infinity;  integral_0^infinity f(I) dI = 1,
    P(I) = integral_I^infinity f(I') dI',  f(I) = -P'(I) = e^(-I).     (18)

The probability function P(I) thus equals the probability density function f(I). Information defined via (14) is characterized by the property of scale invariance: an object and its parts have the same law of distribution.

The informational entropy S(I) of the distribution of information can be defined as the mean value of information:

    S(I) = integral_I^infinity I' f(I') dI' = (1 + I) e^(-I).     (19)

Values of entropy can be normalized to unity, so for 0 <= I < infinity we have 0 <= S <= 1. It is well known that the entropy of a continuous set tends to infinity at jumping values of the variables. Therefore, we must calculate the integral by use of the Lebesgue measure; describing information as that measure, we obtain (19).

Let us note that a scale-invariant function g(x) satisfies the well-known functional equation

    g(x) = lambda * g(g(x/lambda)),     (20)

where lambda is a scaling factor. All continuous functions satisfy (20) at their fixed points. We use f(I) and S(I) as characteristic functions. The fixed points of these functions are [20]:

    f(I) = I:  e^(-I) = I,  I = I_1 = 0.567,     (21)
    S(I) = I:  (1 + I) e^(-I) = I,  I = I_2 = 0.806.     (22)

The fixed points are the limits of the following infinite maps:

    I_{i+1} = f(I_i) = exp(-I_i),  lim_{i->infinity} I_i = I_1,     (23)
    I_{i+1} = S(I_i) = exp(ln(1 + I_i) - I_i),  lim_{i->infinity} I_i = I_2,     (24)

at any initial value I_0, as shown in Fig. 1; the number of nested brackets equals i + 1.

Fig. 1. Establishment of fixed values of information and entropy.

At I < 1, from the same equation we have e^(-I) = I, I = I_1. Therefore, we can use only (22) for the description of regularities of self-affinity, dynamical equilibrium, and self-similarity of dynamical systems.
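The convergence of the maps (23) and (24) to I_1 and I_2 can be checked numerically in a few lines of Python (the starting value and tolerance below are arbitrary):

    import numpy as np

    def fixed_point(func, i0=0.3, tol=1e-12, max_iter=1000):
        """Iterate I_{i+1} = func(I_i), Eqs. (23)-(24), until convergence."""
        value = i0
        for _ in range(max_iter):
            nxt = func(value)
            if abs(nxt - value) < tol:
                return nxt
            value = nxt
        return value

    f = lambda i: np.exp(-i)                 # information map, Eq. (21)
    s = lambda i: (1.0 + i) * np.exp(-i)     # entropy map, Eq. (22)

    print(fixed_point(f))   # 0.5671... = I_1
    print(fixed_point(s))   # 0.8064... = I_2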

Interpretations of the physical meaning of the numbers I_1 = 0.567 and I_2 = 0.806 can be different. A probability density is a local (instantaneous) characteristic; therefore, it can be different for different variables. So, I_1 can be used as a criterion of self-affinity. Entropy is an averaged characteristic; therefore, I_2 can be considered a criterion of self-similarity.

On the other hand, the numbers I_1 and I_2 can be considered analogs of the Fibonacci number I_20 = 0.618 (the "golden section" of a dynamical measure) for statistically self-affine and self-similar systems, correspondingly. From (22) at I^2 << 1 we have

    (1 + I) e^(-I) ~ (1 + I)(1 - I) = I,  I^2 + I - 1 = 0,  I_20 = 0.618.     (25)

4. Generalized metrical characteristic of neural signals

In Section 3 of the present work we have described informational-entropic characteristics of fractal (non-Euclidean) signals. Let us now consider a generalization of a metrical characteristic for the description of fractal (neural) signals. The generalized metrical characteristic follows from the well-known Hölder inequality for different functions [21, 22]:

    [(1/T) integral_0^T x_i^p(t) dt]^(1/p) * [(1/T) integral_0^T x_j^q(t) dt]^(1/q)
        = K^{p,q}_{x_i,x_j} * (1/T) integral_0^T x_i(t) x_j(t) dt,  1/p + 1/q = 1,     (26)

where K^{p,q}_{x_i,x_j} is a coefficient (at a constant value of this coefficient we have equality in (26)); x_i(t), x_j(t) are physical values depending on time t; T is a characteristic time sufficient for the establishment of statistical regularities; and p, q are parameters of the system. At p = q = 2 the value of K^{2,2}_{x_i,x_j} characterizes the Euclidean metric of the set of values of the functions x_i(t), x_j(t).

At x_j(t) = 1 we have the form factor of the signal x(t) = x_i(t), given as

    K_{x(t)} = <x^2(t)>^(1/2) / <x(t)>.     (27)

Here the angle brackets denote values averaged over the system, <y> = (1/T) integral_0^T y(t) dt. The parameter K_{x(t)} is used in radio engineering.

A corresponding choice of the values of p and q allows us to use K^{p,q}_{x_i,x_j} for the description of fractal signals. Let us designate by D the fractal dimension of the curve x(t). So, we can set p = D, q = D/(D - 1). Using x_i = x(t), x_j = t, we have

    K^{D,q}_{x,t} = <x^D(t)>^(1/D) * <t^q>^(1/q) / <x(t)*t>,  q = D/(D - 1).     (28)

So, the values of K^{D,q}_{x,t} for signals with different similarity coefficients lambda_x, lambda_t can be used for distinguishing various self-affine neural oscillations.

5. Metric-topological diagram of neural signals

The dependence of the normalized informational entropy S(x) of neural signals (calculated via (16)) on the corresponding values of the metrical characteristic K^{D,q}_{x,t} (calculated via (28)) is shown in Fig. 2. The entropy has been defined via the probabilities of the values x_i (the action potentials of neurons) falling into the ranges x_i +/- Delta*x_i, i = 1, 2, ...; the number of counts is j = 1, ..., N; N = 10^3; Delta = 10^-4 to 10^-2. Let us consider impulses with different shapes. The entropy of an impulse with the shape of an isosceles triangle is maximal because the distribution x(t) is uniform (linear). So, we have chosen the entropy of the triangular impulse as the norm of entropy. S(Delta) increases with decreasing Delta, but the normalized value of S(Delta) in the selected range is practically constant.

The characteristic K^{D,q}_{x,t} contains the fractal dimension D of the curve x(t). The value of the fractal dimension can be defined via the following expression [17]:

    D = lim_{epsilon->0} ln C(epsilon) / ln epsilon,
    C(epsilon) = lim_{N->infinity} (1/N^2) sum_{i=1..N} sum_{j=1..N} theta(epsilon - |x_i - x_j|),     (29)

where theta is the Heaviside step function.

For clarity, each region of the diagram S(K) contains examples of neural signals according to the corresponding models (1 - FitzHugh-Nagumo model [13], 2 - Hindmarsh-Rose model [14], 3 - modeling of spiking-bursting neural behavior using a two-dimensional map [15], 4 - the general scale-invariant model of neural networks [16]).
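Equation (29) is the correlation-integral estimate of the fractal dimension [17]. A direct O(N^2) Python sketch for a scalar record follows; the sample size and the epsilon range are illustrative choices.

    import numpy as np

    def correlation_integral(x, eps):
        """C(eps) from Eq. (29): fraction of pairs with |x_i - x_j| < eps."""
        diffs = np.abs(x[:, None] - x[None, :])   # all pairwise distances, O(N^2)
        return np.sum(diffs < eps) / x.size**2

    def correlation_dimension(x, eps_values):
        """Slope of ln C(eps) versus ln eps approximates D in Eq. (29)."""
        c = np.array([correlation_integral(x, e) for e in eps_values])
        slope, _ = np.polyfit(np.log(eps_values), np.log(c), 1)
        return slope

    rng = np.random.default_rng(0)
    record = rng.uniform(size=2_000)              # uniform noise: D should be near 1
    print(correlation_dimension(record, np.logspace(-3.0, -1.0, 10)))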

Fig. 2. Dependence of the normalized entropy on the generalized metrical characteristic for models of neural signals: 1 - FitzHugh-Nagumo model (1), a = 1.5, b = 1, tau = 12.5, I_ext = 1 to 2; 2 - Hindmarsh-Rose model (2), a = 1, b = 3, c = 1, d = 5, x_R = -1.6, r = 0.005, s = 4, I = 1.3 to 3; 3 - modeling of spiking-bursting neural behavior using the two-dimensional map (3), mu = 0.001, sigma = 0.14, alpha = 4 to 6; 4 - the general scale-invariant model of neural networks (12), A = 0.8, B = 0.8, V_0 = 0.1, gamma = 0.433 to 1.

As follows from the diagram S(K), the signals described via the fractal model are exactly self-affine signals (with large values of K), and their normalized entropy has large values in the range I_1 < S < I_2 between the fixed points I_1 and I_2. At K -> 1 the oscillations are close to regular ones, and their entropy is S ~ I_2.

In the case of sufficient resolution (Delta, N), the dependence S(K) can be used for unambiguous identification of neural signals and as a decision criterion for the output layer of a neural network.

6. Conclusions

In this work we have shown the possibility of classifying neural signals according to their nonlinear (entropic and generalized-metric) characteristics. We have defined stable, scale-invariant fixed points of neural signals which can be considered as criteria of self-affinity and self-similarity. Theoretical analysis of experimental data shows that neural signals are characterized by self-affine regularities.

Acknowledgments

The authors declare that there is no conflict of interest regarding the publication of this article.

References

[1] E. M. Izhikevich, "Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting", The MIT Press, Cambridge, Massachusetts, (2010).
[2] A. Gupta, L. N. Long, "Character recognition using spiking neural networks", IEEE International Joint Conference on Neural Networks, (2007) 53-58, doi:10.1109/IJCNN.2007.4370930.
[3] M. N. Shadlen, W. T. Newsome, "The variable discharge of cortical neurons: implications for connectivity, computation, and information coding", The Journal of Neuroscience, vol. 18(10), (1998) 3870-3896.
[4] T. M. Reese, A. Brzoska, D. T. Yott, D. J. Kelleher, "Analyzing self-similar and fractal properties of the C. elegans neural network", PLoS ONE, vol. 7(10), (2012), doi:10.1371.
[5] S. Carrillo, J. Harkin, L. McDaid, S. Pande, S. Cawley, B. McGinley, F. Morgan, "Advancing interconnect density for spiking neural network hardware implementations using traffic-aware adaptive network-on-chip routers", Neural Networks, vol. 33, (2012) 42-57.
[6] S. Pierre, H. Said, W. G. Probst, "An artificial neural network approach for routing in distributed computer networks", Engineering Applications of Artificial Intelligence, vol. 14(1), (2001) 51-60.
[7] G. F. de Arruda, E. Cozzo, Y. Moreno, "On degree-degree correlations in multilayer networks", Physica D, vol. 323-324, (2016) 5-11.
[8] E. De Schutter, "Why are computational neuroscience and systems biology so separate?", PLoS Computational Biology, vol. 4, (2008) e1000078.
[9] E. De Schutter, J. D. Angstadt, R. L. Calabrese, "A model of graded synaptic transmission for use in dynamic network simulations", Journal of Neurophysiology, vol. 69, (1993) 1225-1235.
[10] R. M. Eichler West, E. De Schutter, G. L. Wilcox, "Using evolutionary algorithms to search for control parameters in a nonlinear partial differential equation", Evolutionary Algorithms, vol. 111, (1999) 33-64.
[11] M. Forti, M. Grazzini, P. Nistri, L. Pancioni, "Generalized Lyapunov approach for convergence of neural networks with discontinuous or non-Lipschitz activations", Physica D, vol. 214, (2006) 88-99.
[12] T. Ivancevic, L. Jain, J. Pattison, A. Haris, "Nonlinear dynamics and chaos methods in neurodynamics and complex data analysis", Nonlinear Dynamics, vol. 56, (2009) 23-44.
[13] J. Baladron, D. Fasoli, O. Faugeras, J. Touboul, "Mean-field description and propagation of chaos in networks of Hodgkin-Huxley and FitzHugh-Nagumo neurons", The Journal of Mathematical Neuroscience, vol. 2(1), (2012).
[14] M. Storace, D. Linaro, E. de Lange, "The Hindmarsh-Rose neuron model: bifurcation analysis and piecewise-linear approximations", Chaos: An Interdisciplinary Journal of Nonlinear Science, vol. 18(3), (2008).
[15] N. F. Rulkov, "Modeling of spiking-bursting neural behavior using two-dimensional map", Physical Review E, vol. 65, (2002).
[16] Z. Zh. Zhanabaev, Y. T. Kozhagulov, "A generic model for scale-invariant neural networks", Journal of Neuroscience and Neuroengineering, vol. 2(3), (2013) 267-271.
[17] H. G. Schuster, W. Just, "Deterministic Chaos: An Introduction", Fourth edition, WILEY-VCH Verlag GmbH & Co. KGaA, (2005) 299 p.
[18] M. Prokopenko, C. Gershenson, "Entropy methods in guided self-organisation", Entropy, vol. 16, (2014) 5232-5241, doi:10.3390/e16105232.
[19] W. Slomczynski, J. Kwapien, K. Zyczkowski, "Entropy computing via integration over fractal measures", Chaos, vol. 10(1), (2000) 180-188.
[20] Z. Zh. Zhanabaev, "Information properties of self-organizing systems", Reports of the National Academy of Science RK, vol. 5, (1996) 14-19.
[21] Z. Zh. Zhanabaev, "Obobshchennaya metricheskaya kharakteristika dinamicheskogo khaosa" [Generalized metrical characteristic of dynamical chaos], Materialy VIII Mezhdunarodnoi shkoly "Khaoticheskie avtokolebaniya i obrazovanie struktur", Saratov, (2007) 67-68.
[22] Z. Zh. Zhanabaev, S. N. Akhtanov, "New method for investigating of bifurcation regimes by use of realization of a dynamical system", Eurasian Physical Technical Journal, vol. 12, no. 2(24), (2015) 10-16.