Papers in Physics, vol. 7, art. 070006 (2015)
www.papersinphysics.org

Received: 20 November 2014, Accepted: 1 April 2015
Edited by: C. A. Condat, G. J. Sibona
Licence: Creative Commons Attribution 3.0
DOI: http://dx.doi.org/10.4279/PIP.070006
ISSN 1852-4249

Noise versus chaos in a causal Fisher-Shannon plane

Osvaldo A. Rosso,1,2* Felipe Olivares,3 Angelo Plastino4

We revisit the Fisher-Shannon representation plane H × F, evaluated using the Bandt and Pompe recipe to assign a probability distribution to a time series. Several stochastic dynamical processes (noises with f^{-k}, k ≥ 0, power spectrum) and chaotic processes (27 chaotic maps) are analyzed so as to illustrate the approach. Our main achievement is uncovering the informational properties of the planar location.

*Email: [email protected]
1 Instituto Tecnológico de Buenos Aires, Av. Eduardo Madero 399, C1106ACD Ciudad Autónoma de Buenos Aires, Argentina.
2 Instituto de Física, Universidade Federal de Alagoas, Maceió, Alagoas, Brazil.
3 Departamento de Física, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, La Plata, Argentina.
4 Instituto de Física, IFLP-CCT, Universidad Nacional de La Plata, La Plata, Argentina.

I. Introduction

Temporal sequences of measurements (or observations), that is, time-series (TS), are the basic elements for investigating natural phenomena. From TS, one should judiciously extract information about the underlying dynamical system. TS arising from chaotic systems share with those generated by stochastic processes several properties that make them look very similar: (1) a wide-band power spectrum (PS), (2) a delta-like autocorrelation function, (3) irregular behavior of the measured signals, etc. Since irregular and apparently unpredictable behavior is often observed in natural TS, it becomes important to establish whether the underlying dynamical process is deterministic or stochastic, in order to i) model the associated phenomenon and ii) determine which are the relevant quantifiers.

Chaotic systems display "sensitivity to initial conditions" and lead to non-periodic motion (chaotic time series). Long-term unpredictability arises despite the deterministic character of the trajectories: two neighboring points in phase space move apart exponentially fast. Let x_1(t) and x_2(t) be two such points, located within a ball of radius R at time t, and assume that they cannot be resolved within the ball due to poor instrumental resolution. At some later time t', the distance between the points will typically have grown to |x_1(t') - x_2(t')| ≈ |x_1(t) - x_2(t)| exp(λ|t' - t|), with λ > 0 for a chaotic dynamics, λ being the largest Lyapunov exponent. When this distance exceeds R at time t', the points become experimentally distinguishable. This implies that instability reveals some information about the phase-space population that was not available at earlier times [1]. One can then think of chaos as an information source, whose rate of generated information can be cast in precise fashion via the Kolmogorov-Sinai entropy [2, 3].

One question often emerges: is the system chaotic (low-dimensional deterministic) or stochastic? If one is able to show that the system is dominated by low-dimensional deterministic chaos, then only a few (nonlinear and collective) modes are required to describe the pertinent dynamics [4]. If not, the complex behavior could be modeled by a system dominated by a very large number of excited modes, which are in general better described by stochastic or statistical approaches.
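The exponential-divergence picture can be made concrete numerically. The sketch below is illustrative only (the fully chaotic logistic map x_{n+1} = 4 x_n (1 - x_n), the initial separation and the iteration count are assumptions, not taken from the paper): it iterates two nearby initial conditions and estimates λ from the growth of their separation.

```python
import numpy as np

def logistic(x):
    # Fully chaotic logistic map (r = 4); an illustrative choice of dynamics.
    return 4.0 * x * (1.0 - x)

def largest_lyapunov_estimate(x0=0.3, delta0=1e-10, steps=30):
    """Crude estimate of lambda from |x1(t) - x2(t)| ~ delta0 * exp(lambda * t)."""
    x1, x2 = x0, x0 + delta0
    log_sep = []
    for _ in range(steps):
        x1, x2 = logistic(x1), logistic(x2)
        log_sep.append(np.log(abs(x1 - x2)))
    # Slope of the log-separation versus time approximates the largest Lyapunov exponent.
    return np.polyfit(np.arange(1, steps + 1), log_sep, 1)[0]

print(largest_lyapunov_estimate())  # should come out near ln 2 ~ 0.69 for r = 4
```

A clearly positive slope is the signature of the exponential instability described above.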
Several methodologies for the evaluation of Lyapunov exponents and Kolmogorov-Sinai entropies from time series have been proposed (see Ref. [5]), but their applicability involves constraints (stationarity, time-series length, the choice of parameter values for the methodology, etc.) which, in general, make the ensuing results inconclusive. Thus, one wishes for new tools able to distinguish chaos (determinism) from noise (stochasticity), and this leads to our present interest in quantifiers based on Information Theory, for instance, "entropy", "statistical complexity", "Fisher information", etc.

These quantifiers can be used to detect determinism in time series [6-11]. Different Information Theory based measures (normalized Shannon entropy, statistical complexity, Fisher information) allow for a better distinction between deterministic chaotic and stochastic dynamics whenever "causal" information is incorporated via the Bandt and Pompe (BP) methodology [12]. For a review of BP's methodology and its applications to physics, biomedical and econophysics signals, see [13].

Here we revisit, for the purposes detailed above, the so-called causality Fisher-Shannon entropy plane, H × F [14], which allows one to quantify the global versus local characteristics of the time series generated by the dynamical process under study. The two functionals H and F are evaluated using the Bandt and Pompe permutation approach. Several stochastic dynamics (noises with f^{-k}, k ≥ 0, power spectrum) and chaotic processes (27 chaotic maps) are analyzed so as to illustrate the methodology. We will see that significant information is provided by the planar location.

II. Shannon entropy and Fisher information measure

Given a continuous probability distribution function (PDF) f(x), with x ∈ Δ ⊂ R and \int_{\Delta} f(x)\, dx = 1, its associated Shannon entropy S [15] is

S[f] = - \int_{\Delta} f \ln(f)\, dx ,    (1)

a measure of "global character" that is not too sensitive to strong changes in the distribution taking place in a small-sized region. Such is not the case with Fisher's Information Measure (FIM) F [16, 17], which constitutes a measure of the gradient content of the distribution f(x) and is thus quite sensitive even to tiny localized perturbations. It reads

F[f] = \int_{\Delta} \frac{1}{f(x)} \left[ \frac{df(x)}{dx} \right]^2 dx = 4 \int_{\Delta} \left[ \frac{d\psi(x)}{dx} \right]^2 dx .    (2)

FIM can be variously interpreted as a measure of the ability to estimate a parameter, as the amount of information that can be extracted from a set of measurements, and also as a measure of the state of disorder of a system or phenomenon [17]. In the definition of FIM above (Eq. (2)), the division by f(x) is not convenient if f(x) → 0 at certain x-values. We avoid this by working with real probability amplitudes f(x) = ψ²(x) [16, 17], which yields a simpler form (no divisors) and shows that F simply measures the gradient content of ψ(x). The gradient operator significantly influences the contribution of minute local f-variations to FIM's value. Accordingly, this quantifier is called a "local" one [17].

Let now P = {p_i; i = 1, ..., N} be a discrete probability distribution, with N the number of possible states of the system under study. The concomitant problem of information loss due to discretization has been thoroughly studied and, in particular, it entails the loss of FIM's shift-invariance, which is of no importance for our present purposes [10, 11]. In the discrete case, we define a "normalized" Shannon entropy as

H[P] = \frac{S[P]}{S_{max}} = \frac{1}{S_{max}} \left\{ - \sum_{i=1}^{N} p_i \ln(p_i) \right\} ,    (3)

where the denominator S_{max} = S[P_e] = \ln N is the value attained by the uniform probability distribution P_e = {p_i = 1/N, ∀ i = 1, ..., N}. For the FIM, we take the expression in terms of real probability amplitudes as a starting point; a discrete normalized FIM convenient for our present purposes is then given by

F[P] = F_0 \sum_{i=1}^{N-1} \left[ (p_{i+1})^{1/2} - (p_i)^{1/2} \right]^2 .    (4)

It has been extensively discussed that this discretization is the best behaved in a discrete environment [18]. Here, the normalization constant F_0 reads

F_0 = \begin{cases} 1 & \text{if } p_{i^*} = 1 \text{ for } i^* = 1 \text{ or } i^* = N \text{ and } p_i = 0 \ \forall i \neq i^* , \\ 1/2 & \text{otherwise.} \end{cases}    (5)

If our system lies in a very ordered state, which occurs when almost all the p_i-values are zeros, we have a normalized Shannon entropy H ∼ 0 and a normalized Fisher's Information Measure F ∼ 1. On the other hand, when the system under study is represented by a very disordered state, that is, when all the p_i-values oscillate around the same value, we obtain H ∼ 1 while F ∼ 0. One can state that the general behavior of the present discrete FIM (Eq. (4)) is opposite to that of the Shannon entropy, except for periodic motions [10, 11]. The local sensitivity of FIM for discrete PDFs is reflected in the fact that the specific "i-ordering" of the discrete values p_i must be seriously taken into account when evaluating the sum in Eq. (4).
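Both quantifiers are evaluated over the Bandt and Pompe ordinal-pattern distribution of the series. The following minimal sketch is not code from the paper; the embedding dimension D = 4, the delay τ = 1, the series lengths and the plain lexicographic pattern ordering used as the "i-ordering" are illustrative assumptions. It builds the BP distribution and computes the planar coordinates H and F of Eqs. (3)-(5):

```python
import itertools
import numpy as np

def bandt_pompe_pdf(x, D=4, tau=1):
    """Bandt-Pompe ordinal-pattern probability distribution of a 1-D series.
    D, tau and the lexicographic pattern ordering are illustrative choices."""
    x = np.asarray(x)
    patterns = list(itertools.permutations(range(D)))   # the D! possible patterns
    index = {p: i for i, p in enumerate(patterns)}      # fixed "i-ordering"
    counts = np.zeros(len(patterns))
    for t in range(len(x) - (D - 1) * tau):
        window = x[t : t + D * tau : tau]
        pattern = tuple(np.argsort(window).tolist())    # permutation that sorts the window
        counts[index[pattern]] += 1
    return counts / counts.sum()

def shannon_H(p):
    """Normalized Shannon entropy, Eq. (3): H = S[P] / ln(N)."""
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz)) / np.log(len(p))

def fisher_F(p):
    """Discrete normalized Fisher information measure, Eqs. (4) and (5)."""
    # F0 = 1 only when all probability sits on the first or last state; otherwise 1/2.
    F0 = 1.0 if (p[0] == 1.0 or p[-1] == 1.0) else 0.5
    return F0 * np.sum((np.sqrt(p[1:]) - np.sqrt(p[:-1])) ** 2)

# Example: place a chaotic and a stochastic series in the H x F plane.
rng = np.random.default_rng(0)
x_map = np.empty(10_000)
x_map[0] = 0.4
for n in range(1, x_map.size):
    x_map[n] = 4.0 * x_map[n - 1] * (1.0 - x_map[n - 1])   # logistic map, r = 4
for name, series in [("logistic map", x_map), ("white noise", rng.normal(size=10_000))]:
    p = bandt_pompe_pdf(series, D=4)
    print(f"{name}: H = {shannon_H(p):.3f}, F = {fisher_F(p):.3f}")
```

For an uncorrelated stochastic signal one expects H close to 1 and F close to 0, while a chaotic map typically yields a lower H together with a larger F, which illustrates how the planar location carries the information discussed above.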
i. Chaotic maps

a) Noninvertible maps: (1) Logistic map; (2) Sine map; (3) Tent map; (4) Linear congruential generator; (5) Cubic map; (6) Ricker's population model; (7) Gauss map; (8) Cusp map; (9) Pinchers map; (10) Spence map; (11) Sine-circle map;

b) Dissipative maps: (12) Hénon map; (13) Lozi map; (14) Delayed logistic map; (15) Tinkerbell map; (16) Burgers' map; (17) Holmes cubic map; (18) Dissipative standard map; (19) Ikeda map; (20) Sinai map; (21) Discrete predator-prey map;

c) Conservative maps: (22) Chirikov standard map; (23) Hénon area-preserving quadratic map; (24) Arnold's cat map; (25) Gingerbreadman map; (26) Chaotic web map; (27) Lorenz three-dimensional chaotic map.

Even though the present list of chaotic maps is not exhaustive, it can be taken as representative of common chaotic systems [19].

ii. Noises with f^{-k} power spectrum

The corresponding time series are generated as follows:
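A common recipe for such noises, given here as an illustrative sketch rather than a verbatim reproduction of the authors' procedure, is to rescale the Fourier amplitudes of Gaussian white noise by f^{-k/2}, so that the resulting power spectrum decays as f^{-k} (k = 0 recovers white noise):

```python
import numpy as np

def f_power_noise(n, k, rng=None):
    """Gaussian noise with an approximately f^{-k} power spectrum.
    Sketch: filter white noise in Fourier space; k = 0 gives white noise."""
    rng = np.random.default_rng() if rng is None else rng
    white = rng.normal(size=n)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n)
    freqs[0] = freqs[1]                     # avoid division by zero at f = 0
    spectrum *= freqs ** (-k / 2.0)         # power ~ |amplitude|^2 ~ f^{-k}
    series = np.fft.irfft(spectrum, n)
    return (series - series.mean()) / series.std()   # zero mean, unit variance

# k = 0, 1, 2 give white, 1/f and 1/f^2 (Brownian-like) noise, respectively.
pink = f_power_noise(2**14, k=1.0, rng=np.random.default_rng(1))
```

Series generated this way for different k, processed with the routines sketched earlier, provide stochastic reference locations in the H × F plane against which the chaotic maps can be compared.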