Nonlinear Characteristics of Neural Signals
Z.Zh. Zhanabaev a,b, T.Yu. Grevtseva a,b, Y.T. Kozhagulov c,*

a National Nanotechnology Open Laboratory, Al-Farabi Kazakh National University, al-Farabi Avenue, 71, Almaty 050038, Kazakhstan, [email protected]
b Scientific-Research Institute of Experimental and Theoretical Physics, Al-Farabi Kazakh National University, al-Farabi Avenue, 71, Almaty 050038, Kazakhstan, [email protected]
c Department of Solid State Physics and Nonlinear Physics, Faculty of Physics and Technology, Al-Farabi Kazakh National University, al-Farabi Avenue, 71, Almaty 050038, Kazakhstan, [email protected]

ABSTRACT

The study is devoted to the definition of generalized metrical and topological (informational-entropic) characteristics of neural signals via their well-known theoretical models. We show that the time dependence of the action potential of neurons is scale invariant. Information and entropy of neural signals have constant values in the cases of self-similarity and self-affinity.

Keywords: neural networks, information, entropy, scale invariance, metrical, topological characteristics.

1. Introduction

The study of artificial neural networks is one of the most urgent problems in information technology.
Generally, neural networks should have the basic properties of an ensemble of biological neurons: associativity, resistance to noise, distributed storage and processing of information, and adaptability of the relationships between elements. This has been reported in many reviews and articles [1-4]. Results of recent research indicate the possibility of using neural oscillations for routing of information [5-7].

However, the use of a biological neuron model (for example, a model of the functioning of the ion channels of a neuron) involves the analysis of a great number of systems of differential equations containing many parameters of the same order [8-10]. Thus, it is necessary to find ways to establish universal, maximally common and simple laws of neuron dynamics. Applications of the methods of nonlinear dynamics to the description of neural networks by systems of differential equations are presented in [11,12].

There are theoretical models [13-16] describing the following regularities. A neuron moves from the stable state to an excited state because of an external stimulus (external field). This process is characterized by the appearance of spikes and bursts. A neuron can also generate quasi-periodic, chaotic, and noise-like oscillations [16], along with the bursting oscillations.

Nonlinear dynamics [17] describes chaotic phenomena on the basis of fractal and informational-entropic laws. Informational entropy and the fractal dimensions of sets of physical quantities are used as quantitative characteristics of chaos [18,19]. A possibility to describe different types of behavior of neuronal action potentials via fractal measures is described in [16]. So, we can formulate a problem: do any entropic laws describing neuron dynamics exist? Obviously, in the cases of self-similarity (similarity coefficients for the different variables are equal to each other) and self-affinity (similarity coefficients are different) of fractal measures, we expect to find intervals with constant entropy. The formulation of two additional laws (fractal and entropic) would help us to build more efficient neural networks for classification, recognition, identification, and process control.

The purpose of the present work is to establish informational-entropic laws for the description of neural oscillations.

2. Equations for action potentials of neurons

There are various models of neural oscillations [13-16]. The FitzHugh-Nagumo model [13] consists of two equations, one governing a voltage-like variable v with a cubic nonlinearity, the other a slower recovery variable w. It can be written as

    \dot{v} = v - v^3/3 - w + I_{ext},
    \tau \dot{w} = v + a - b w,                                        (1)

where the parameter I_{ext} models the input current the neuron receives, and the parameters a, b > 0 and \tau > 0 describe the kinetics of the recovery variable w.
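As a quick illustration, the FitzHugh-Nagumo system (1) can be integrated with a simple forward-Euler scheme. The parameter values below (a = 0.7, b = 0.8, tau = 12.5, I_ext = 0.5) are common textbook choices that place the model on a limit cycle; they are assumptions for illustration, not values taken from the paper.

```python
import numpy as np

def fitzhugh_nagumo(I_ext=0.5, a=0.7, b=0.8, tau=12.5, dt=0.01, steps=50000):
    """Forward-Euler integration of the FitzHugh-Nagumo equations (1):
       dv/dt = v - v**3/3 - w + I_ext,   tau * dw/dt = v + a - b*w."""
    v, w = -1.0, 1.0            # arbitrary initial state
    vs = np.empty(steps)
    ws = np.empty(steps)
    for i in range(steps):
        dv = v - v**3 / 3.0 - w + I_ext
        dw = (v + a - b * w) / tau
        v += dt * dv
        w += dt * dw
        vs[i] = v
        ws[i] = w
    return vs, ws

v, w = fitzhugh_nagumo()
# For these parameters the resting state is unstable, so v spikes periodically.
print(v.min(), v.max())
```

With this choice of I_ext the fixed point is unstable (the trace of the Jacobian is positive), so the trajectory settles onto relaxation oscillations, i.e. periodic spiking.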
The Hindmarsh-Rose model [14] can be written as

    \dot{x} = y - a x^3 + b x^2 - z + I,
    \dot{y} = c - d x^2 - y,                                           (2)
    \dot{z} = r (s (x - x_R) - z),

where I mimics the membrane input current for biological neurons; b allows one to switch between bursting and spiking behaviors and to control the spiking frequency; r controls the speed of variation of the slow variable z; s governs adaptation (a unitary value of s determines spiking behavior without accommodation and subthreshold adaptation); and x_R sets the resting potential of the system.

Modeling of spiking-bursting neural behavior is possible via the two-dimensional map described in [15]:

    x_{i+1} = f(x_i, y_i),
    y_{i+1} = y_i - \mu (x_i + 1) + \mu \sigma,                        (3)

    f(x, y) = \alpha/(1 - x) + y   for  x \le 0,
    f(x, y) = \alpha + y           for  0 < x < \alpha + y,
    f(x, y) = -1                   for  x \ge \alpha + y,

where x_i is the fast and y_i is the slow dynamical variable. The slow time evolution of y_i is due to the small value of the parameter \mu = 0.001. The term \sigma can be used as the control parameter to select the regime of individual behavior; \alpha is a control parameter of the map.

In the work [16] action potentials of neurons were taken in the form of fractal measures. We consider a fractal measure as a physical quantity characterized by an additive and measurable set. In contrast to the well-known theories of fractals, we choose the value of the scale of measurement not arbitrarily, but as a relative difference between the desired quantity and the external (control) parameter. Hence, the fractal measure is a nonlinear function depending on the process (object) itself.

The traditional definition of a fractal measure M can be written as

    M = M_0 (\delta M / M^*)^{-\gamma},   \gamma = D - d,   \gamma > 0,   (4)

where M_0 is a regular (non-fractal) measure, \delta M is a scale of measurement, M^* is the norm of M, D is the fractal dimension of the set of values of M, and d is the topological dimension of the norm carrier. Here \delta M is independent of M; therefore, the measure defined by (4) can be tentatively called linear. A dependence of \delta M on M assumes the existence of certain conditions in the form of an external disturbance R; generally, R is an order parameter.

Let us consider R as an order parameter. Then we can choose the relative scale of measurement as

    \varepsilon_{\mu} = |1 - R/M|,                                     (5)

    \varepsilon_{\tilde{\mu}} = |1 - M/R|,                             (6)

where the indexes \mu and \tilde{\mu} correspond to the norms M^* taken as M and R, respectively. According to (5) and (6) we can rewrite formula (4) as

    M = M_0 \varepsilon_{\mu}^{\gamma} = M_0 |1 - R/M|^{\gamma},   M \ne 0,   (7)

    M = M_0 \varepsilon_{\tilde{\mu}}^{\gamma} = M_0 |1 - M/R|^{\gamma},   R \ne 0.   (8)

At \gamma \to 0 we have M \to M_0, which corresponds to the meaning of M_0 as a regular measure. At R \to 0 equation (7) gives M = M_0, while (8) loses its meaning. It means that the fractal measure defined by its own norm exists even when the external influence characterized by the parameter R is absent.
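Since (7) defines the measure implicitly (M appears on both sides), it can be solved numerically by fixed-point iteration. The functional form M = M0*|1 - R/M|**gamma used below is a reconstruction from the garbled source, and all parameter values are illustrative assumptions.

```python
def solve_measure(M0=1.0, R=0.3, gamma=0.7, n_iter=200):
    """Solve the implicit relation (7), M = M0*|1 - R/M|**gamma,
    by direct fixed-point iteration (assumed reconstructed form)."""
    M = M0                        # start from the regular (non-fractal) measure
    for _ in range(n_iter):
        # Division fails only if M hits R exactly; not the case for these values.
        M = M0 * abs(1.0 - R / M) ** gamma
    return M

print(solve_measure())            # external influence present: M < M0
print(solve_measure(R=0.0))       # R -> 0 recovers M = M0 (own-norm measure survives)
print(solve_measure(gamma=0.0))   # gamma -> 0 recovers the regular measure M0
```

The two limiting cases mirror the discussion after (8): the regular measure is recovered as gamma vanishes, and the measure defined through its own norm persists when the external parameter R is absent.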
Let us apply formulas (7) and (8) to the description of the action potential of neurons. We use the simplified notations M \equiv V, M_0 \equiv V_0 (respectively V_{0,F}), and take the order parameter to be the external stimulus F(t). From (7) and (8) we obtain the expressions for the potential of a neuron:

    V = f(V_0, \varepsilon_V),   f(V_0, \varepsilon_V) = V_0 |1 - F(t)/V|^{\gamma_V},   (9)

    V = f(V_{0,F}, \varepsilon_{V,F}),   f(V_{0,F}, \varepsilon_{V,F}) = V_{0,F} |1 - V/F(t)|^{\gamma_F},   (10)

where V_0 and V_{0,F} are the threshold excitation potentials. Neurons have inherent properties of quasi-particles: they cannot exist without movement and outside of a medium, and they can be considered as fluctuations of the medium. Neurons communicate by sending action potentials. Therefore, it is natural to assume that the action potentials have a modulation-periodic nature, and in equations (9), (10) we can take the external field in the form

    F(t) = A (1 + B \sin(\omega t)),                                   (11)

where A is the amplitude, B is the modulation coefficient (depth of modulation), and \omega is the modulation frequency of the neural oscillations.

For a system consisting of N neurons, we can write the following equations in iterative form:

    V_{i+1}^{(k)} = V_0^{(k)} |1 - F(t) / \sum_{k=1}^{N} V_i^{(k)}|^{\gamma_k},   (12)
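A small ensemble can be iterated directly under this reading of the network equation, driven by the modulated stimulus (11). Both the reconstructed functional form and every parameter value here (N = 3, the thresholds, the exponents, and the stimulus parameters) are assumptions made for illustration only.

```python
import numpy as np

def simulate_network(N=3, V0=(1.0, 0.9, 1.1), gamma=(0.6, 0.7, 0.8),
                     A=0.8, B=0.3, omega=0.05, steps=3000):
    """Iterate V[i+1,k] = V0[k]*|1 - F(t_i)/sum_k V[i,k]|**gamma[k]
    with the modulated stimulus F(t) = A*(1 + B*sin(omega*t)) of (11)."""
    V = np.empty((steps, N))
    V[0] = V0                                  # start each neuron at its threshold
    for i in range(steps - 1):
        F = A * (1.0 + B * np.sin(omega * i))  # stimulus (11) at step i
        total = V[i].sum()                      # ensemble sum over the N neurons
        if abs(total) < 1e-12:
            total = 1e-12                       # guard against division by zero
        for k in range(N):
            V[i + 1, k] = V0[k] * abs(1.0 - F / total) ** gamma[k]
    return V

V = simulate_network()
print(V.min(), V.max())
```

Because each update is a positive threshold value times a non-negative factor, the trajectories stay non-negative; varying A, B, and the exponents gamma_k changes the character of the resulting oscillations.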
    V_{i+1}^{(k)} = V_0^{(k)} |1 - \sum_{k=1}^{N} V_i^{(k)} / F(t)|^{\gamma_k},   (13)

where k is the number of a neuron, V_i^{(k)} is the action potential of a neuron, V_0^{(k)} is its threshold excitation potential, F(t) is the modulated stimulus of a single neuron, and \gamma_k is the difference between the fractal and topological dimensions of the set of values of V_i^{(k)}. Equation (12) takes into account the possibility of a neuron's own sub-threshold oscillations at F(t) \to 0, while (13) is used only to describe the presence of a stimulus, F(t) \ne 0. F(t) can be caused by the action potentials of neighboring neurons.

Equations (12) and (13) describe the experimentally observed variety of spikes, chaotic oscillations, and phase synchronization after the bursts [16].

3. Informational-entropic characteristics of neural signals

The conception of information is widely used in cybernetics, genetics, sociology, and so on. The absolute information entropy S(x) of an event x and the conditional entropy S(x/y) of an event x when another event y has occurred enter equation (15), which can be used for solving technical problems such as the estimation of the transmission capacity of communication channels.

Informational entropy, i.e. the Shannon entropy S(x), can be defined as the mean value of information:

    S(x) = \sum_i P_i(x) I_i(x) = -\sum_i P_i(x) \ln P_i(x).           (16)

Here i is the number of a cell after segmentation of x. So, let us use (14) as the main definition of information.

According to (16), the value of the entropy calculated via a probability density tends to infinity if x is a continuous variable. Let us try to define scale-invariant regularities; to do so, we have to use a new approach to the description of informational phenomena. Because of this fact we can use information as an independent defining variable. Statistical characteristics of a process can be described via information, so we can try to find new properties of information that are independent of the scale of measurement.

Therefore, according to (14), we shall describe the probability of realization of information P(I) as

    P(I) = e^{-I}.                                                     (17)

The probability function f(I) can be defined via the following relations:
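The definitions (16) and (17) can be checked numerically. The sketch below assumes the pointwise information I_i(x) = -ln P_i(x), which is consistent with the second equality in (16); the sample distribution is arbitrary.

```python
import numpy as np

# Numerical check of the entropy definition (16) and the probability of
# realization of information (17), assuming I_i = -ln P_i.
P = np.array([0.5, 0.25, 0.125, 0.125])   # sample distribution over 4 cells
I = -np.log(P)                             # information of each cell
S = float(np.sum(P * I))                   # Shannon entropy (16), in nats
print(S)                                   # 1.2130... nats (= 1.75 bits)

# Eq. (17): P(I) = exp(-I) maps each information value back to its probability.
print(np.exp(-I))                          # recovers [0.5, 0.25, 0.125, 0.125]
```

The round trip through (17) is exact here because the information of each cell was defined as the negative logarithm of its probability.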