DISCRETE AND CONTINUOUS DYNAMICAL SYSTEMS
Volume 7, Number 3, July 2001, pp. 477–486
Website: http://math.smsu.edu/journal

ORBIT COMPLEXITY AND DATA COMPRESSION

Stefano Galatolo
Dipartimento di Matematica, Università di Pisa
Via Buonarroti, 2, 56127 Pisa, Italy

(Communicated by Konstantin Mischaikow)

Abstract. We consider data compression algorithms as a tool to get an approximate measure of the quantity of information contained in a string. In this way it is possible to give a notion of orbit complexity for topological dynamical systems. In compact ergodic dynamical systems, entropy is almost everywhere equal to orbit complexity. The use of compression algorithms allows a direct estimation of the information content of the orbits.

1991 Mathematics Subject Classification. 28D20, 58F13, 68P30.
Key words and phrases. Topological dynamical systems, information, coding, entropy, orbit complexity.

1. Introduction. In [6] Brudno gives a definition of orbit complexity for a topological dynamical system. The complexity of an orbit is a measure of the amount of information that is necessary to describe the orbit. In the literature many relations are proved between orbit complexity and other measures of the complexity and of the chaotic behavior of a dynamical system. One of the most important ([6]) states that in an ergodic, compact dynamical system the complexity of almost every orbit is equal to the metric entropy of the system. Brudno's construction translates the orbit into a set of strings whose complexity defines the complexity of the orbit. The complexity of those strings is defined by the tools of algorithmic information theory (Kolmogorov complexity); see e.g. [8], [18]. Unfortunately, the Kolmogorov complexity of a string is not a computable function, and for this reason Brudno's orbit complexity is not in general directly computable or useful in concrete applications. One could think, for example, of estimating the entropy of an unknown system through an estimation of the complexity of some orbit (whose behavior is given by an experimental observation or a computer simulation); this is encouraged by the above relation between orbit complexity and entropy, but it is made impossible by the uncomputability of Kolmogorov complexity.

In this paper we use data compression algorithms to give a definition of orbit complexity which has the same relation with entropy as Brudno's and which is computable. We consider a universal coding procedure E and we define the E-orbit complexity by replacing the Kolmogorov complexity of a string with the length of the string after it has been compressed by E, which we consider as an approximate estimation of the information contained in the string. The E-complexity of the orbit of a point is invariant under isomorphisms of dynamical systems and is defined independently of the choice of an invariant measure or of the knowledge of other global features of the system under consideration. For this reason we think that our definition of orbit complexity can give an estimation of the complexity of a dynamical system from its behavior. Moreover, orbit complexity can give information about the complexity of the dynamics even in cases when other traditional measures of complexity (entropy, Lyapunov exponents, ...) are not defined or are trivial (see e.g. [3] or [11]).
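As a purely illustrative sketch (not part of the paper), the idea of estimating the information content of a string by the length of its compressed form can be made concrete with the Lempel-Ziv 1978 parsing recalled in Section 3: the string is split into distinct phrases, and the number of phrases c(s) roughly determines the LZ78 code length, about c(s)(log2 c(s) + 1) bits for a binary alphabet. The function names below are ours, the code length formula is only a rough proxy, and none of this is the formal notion of ideal coding scheme defined later in the paper.

```python
import math
import random


def lz78_parse(s):
    """Incremental (LZ78) parsing: split s into the shortest phrases
    not seen before.  Returns the list of phrases."""
    seen = {""}          # the empty phrase is always "known"
    phrases = []
    w = ""
    for ch in s:
        if w + ch in seen:
            w += ch      # extend the current phrase
        else:
            seen.add(w + ch)
            phrases.append(w + ch)
            w = ""       # start a new phrase
    if w:                # possibly repeated final phrase
        phrases.append(w)
    return phrases


def lz78_code_length(s):
    """Rough LZ78 code length in bits for a binary string: each phrase
    is coded by a dictionary index plus one new symbol."""
    c = len(lz78_parse(s))
    return c * (math.log2(c) + 1)


if __name__ == "__main__":
    periodic = "01" * 5000
    noisy = "".join(random.choice("01") for _ in range(10000))
    for name, s in [("periodic", periodic), ("random", noisy)]:
        print(name, round(lz78_code_length(s) / len(s), 3), "bits per symbol")
```

On the periodic string the estimated information content per symbol is small, while on the random string it is close to 1 bit per symbol, in line with the intuition that compressed length measures information content.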
The main result we prove is the analogue of Brudno's relation between entropy and orbit complexity: in an ergodic, compact dynamical system the E-complexity of almost every orbit is equal to the metric entropy of the system.

In Section 2 we recall some notions from algorithmic information theory and the definition of orbit complexity given by Brudno [6], with related results.

In Section 3 we present some essential information about universal coding algorithms. A universal coding algorithm is an algorithm for which the coding of a string depends only on the string itself and not on the statistical properties of the information source which outputs the string. We give the definition of ideal coding schemes: a class of coding procedures which includes the classical Lempel-Ziv 1978 (LZ78) coding scheme [13]. Each ideal coding scheme gives a definition of orbit complexity for which the main theorems hold.

In Section 4 we give the definition of E-orbit complexity for dynamical systems. In Section 5 we prove the main results, linking orbit complexity to entropy. Theorem 21 is analogous to the main theorem of [6] (Theorem 3.1).

2. Algorithmic Information Theory and orbit complexity. We give the basic definitions of the concepts coming from algorithmic information theory that will be used in the following sections; a detailed exposition can be found in [8] or [18].

Let Σ = {0, 1}∗ be the set of finite (possibly empty) binary strings. Let us consider a Turing machine C. By the notation C(p) = s we mean that C, starting with input p (the program), stops with output s (in this way C defines a function C : Σ → Σ). If an input gives a never-ending computation, the corresponding output is not defined. If some input gives a never-ending computation (so that the function is not defined on all of Σ), we say that C defines a partial recursive function C : Σ → Σ. If the computation performed by C stops for each input, then we say that C defines a total recursive function C : Σ → Σ.

The following definition, due to Kolmogorov and Chaitin, is the basis of algorithmic information theory.

Definition 1. The Kolmogorov complexity, or algorithmic information content, of a string s given C is the length of the smallest program p giving s as the output:
$$K_C(s) = \min_{p\,:\,C(p)=s} |p|;$$
if s is not a possible output for the computer C then $K_C(s) = \infty$.

Definition 2. A Turing machine U is said to be asymptotically optimal if for each Turing machine F and each binary string s we have
$$K_U(s) \le K_F(s) + c_{U,F},$$
where the constant $c_{U,F}$ depends on U and F but not on s.

It can be proved that an asymptotically optimal Turing machine exists. If we choose an asymptotically optimal Turing machine, the complexity of a string is defined independently of the given Turing machine up to a constant. In the next sections we consider the value of ratios like $\frac{K_U(s)}{|s|}$; when the length of s goes to infinity, the value of the constant becomes irrelevant. For the rest of the paper we suppose that an asymptotically optimal Turing machine U is chosen once and for all.
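As a hedged illustrative remark (not taken from the paper): for a highly regular string such as $s_n = (01)^n$, a program only needs to encode the repetition count n together with a fixed routine, so
$$K_U(s_n) \le \log_2 n + c, \qquad |s_n| = 2n, \qquad \text{hence} \qquad \frac{K_U(s_n)}{|s_n|} \xrightarrow[n \to \infty]{} 0,$$
while a simple counting argument shows that most strings of length m have complexity close to m. In both cases the additive constant $c_{U,F}$ of Definition 2 is divided by $|s|$ and disappears in the limit, which is why the particular choice of the asymptotically optimal machine U does not affect the quantities considered below.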
2.1. Brudno's definition of orbit complexity. Here we sketch Brudno's definition of orbit complexity. For a more detailed introduction see [6] or [15]. As stated in the introduction, the orbit is translated into a set of strings. Let us consider a topological dynamical system (X, T): X is a metric space and T is a continuous onto mapping X → X.

Let us consider a finite cover $\beta = \{B_0, B_1, \ldots, B_{N-1}\}$ of X. The sets $B_i$ are measurable sets whose union is X, and they may have non-empty intersections. We use this cover to code the orbits of (X, T) into a set of infinite strings. If x ∈ X, let us define the set of symbolic orbits of x with respect to β as
$$\varphi_\beta(x) = \{\omega \in \{0, 1, \ldots, N-1\}^{\mathbb{N}} : \forall n \in \mathbb{N},\ T^n(x) \in B_{\omega(n)}\}.$$
The set $\varphi_\beta(x)$ is the set of all possible codings of the orbit of x relative to the cover β. Many codings are indeed possible, since the sets may have non-empty intersections. The complexity of the orbit of x ∈ X relative to β is defined as
$$K(x, T|\beta) = \limsup_{n \to \infty}\ \min_{\omega \in \varphi_\beta(x)} \frac{K_U(\omega^n)}{n},$$
where $\omega^n$ is the string containing the first n digits of ω. We remark that $\omega^n$ is not a binary string; it is easy to imagine how the definition of Kolmogorov complexity can be extended to strings made of digits coming from a finite alphabet.

This is the definition of orbit complexity with respect to a general measurable cover; it includes the two interesting cases of measurable covers: measurable partitions and open covers. Open covers are important to give a meaningful definition of orbit complexity which is independent of the choice of a given cover. Taking the supremum over the set of all finite open covers β of the metric space X, it is possible to define the complexity of the orbit of x:
$$K(x, T) = \sup_{\beta} K(x, T|\beta).$$
This definition associates to each point of X a real number which is a measure of the complexity of the orbit of x. For example, if a point is periodic or its orbit converges to some fixed point, then its orbit complexity is 0. We remark that it is important to suppose that the sets in the covers are open; if we allow non-open covers, there are dynamical systems with points having high orbit complexity while the orbit converges to a point (lying on the boundary of more than one set of the cover).

Orbit complexity is invariant under topological conjugation:

Theorem 3. ([6]) If the dynamical systems (X, T) and (Y, S) are topologically conjugate, π : X → Y is the conjugating homeomorphism, and π(x) = y, then K(x, T) = K(y, S).

In the literature (see e.g. [10], [6], [15], [4], [11]) many relations have been proved between orbit complexity and other forms of complexity of a dynamical system (Kolmogorov entropy, topological entropy and others) and with other problems concerning orbits of a dynamical system.
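To connect these definitions with the compression-based approach of this paper, the following Python sketch (purely illustrative, not taken from the paper) codes a floating-point pseudo-orbit of the logistic map x ↦ 4x(1−x) with respect to the two-set partition {[0, 1/2), [1/2, 1]} and estimates the information content per symbol by the compressed length of the symbolic string, using the standard zlib compressor as a stand-in for the coding procedures E studied later; symbolic_orbit is an illustrative helper name of ours. Since this map has metric entropy log 2 with respect to its natural invariant measure, one expects a result close to 1 bit per symbol, whereas a periodic orbit would give a value close to 0.

```python
import zlib


def symbolic_orbit(x, n):
    """First n symbols of the (floating-point) orbit of x under the
    logistic map x -> 4x(1-x), coded with the partition {[0,1/2), [1/2,1]}."""
    symbols = []
    for _ in range(n):
        symbols.append("0" if x < 0.5 else "1")
        x = 4.0 * x * (1.0 - x)
        x = min(max(x, 0.0), 1.0)   # guard against rounding outside [0, 1]
    return "".join(symbols)


if __name__ == "__main__":
    n = 200_000
    s = symbolic_orbit(0.345678, n).encode("ascii")
    compressed_bits = 8 * len(zlib.compress(s, 9))
    # Expected to be slightly above 1 bit per symbol (entropy log 2 = 1 bit),
    # the excess being coder overhead on a finite string.
    print(compressed_bits / n, "bits per symbol")
```

Such an experiment only gives a finite-length estimate; the results of Section 5 make the relation between compressed lengths of symbolic orbits and the entropy precise in the limit.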
