Yakov G. Sinai, Laureate 2014

THE ENTROPY OF A DYNAMICAL SYSTEM

Towards the end of the 1950s, the Russian mathematician Andrey Kolmogorov held a seminar series on dynamical systems at Moscow University. A question often raised in the seminar concerned the possibility of deciding structural similarity between different dynamical systems. A young seminar participant, Yakov Sinai, presented an affirmative answer, introducing the concept of entropy of a dynamical system.

Let us start by going one decade back in time. In 1948 the American mathematician Claude E. Shannon published an article entitled "A Mathematical Theory of Communication". His idea was to use the formalism of mathematics to describe communication as a phenomenon. The purpose of all communication is to convey a message, but how this is done is the messenger's choice. Some will express themselves using numerous words or characters; others prefer to be more brief. The content of the information is the same, but the information density may vary. An example is the so-called SMS language. When sending an SMS message it is common to try to minimize the number of characters. The sentence "I love You" consists of 10 characters, while "I <3 U" consists of only 6, but the content of the two messages is the same.

Shannon introduced the notion of entropy to measure the density of information. To what extent does the next character in the message provide us with more information? High Shannon entropy means that each new character provides new information; low Shannon entropy indicates that the next character just confirms something we already know.
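In formulas, the Shannon entropy of a source that emits characters c with probabilities p(c) is H = -\sum_c p(c) \log_2 p(c), measured in bits per character. As a toy illustration (a sketch added here, not part of the original article; Shannon's entropy properly concerns the statistics of the source, not of one short string), the following Python snippet applies the formula to the character frequencies of the two example messages:

    import math
    from collections import Counter

    def shannon_entropy(message):
        # Empirical Shannon entropy in bits per character:
        # H = -sum_c p(c) * log2(p(c)), with p(c) the relative
        # frequency of character c in the message.
        n = len(message)
        return -sum(k / n * math.log2(k / n)
                    for k in Counter(message).values())

    for msg in ["I love You", "I <3 U"]:
        print(f"{msg!r}: {len(msg)} characters, "
              f"{shannon_entropy(msg):.2f} bits/character")

On strings this short the numbers are only suggestive, but the moral of the SMS example survives: when the same content is packed into fewer characters, each character carries more information.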
A dynamical system is a description of a physical system and its evolution over time. The system has many states, and all states are represented in the state space of the system. A path in the state space describes the dynamics of the dynamical system. A dynamical system may be deterministic: in a deterministic system no randomness is involved in the development of future states. A swinging pendulum is a deterministic system; fixing the position and the speed, the laws of physics will determine the motion of the pendulum. Throwing a die is the other extreme, a stochastic system: the future is completely uncertain, and the last toss of the die has no influence on the next.

In general, we can get a good overview of what happens in a dynamical system in the short term. However, when analyzed in the long term, dynamical systems are difficult to understand and predict. The problem of weather forecasting illustrates this phenomenon: the weather condition, described by air pressure, temperature, wind, humidity, etc., is a state of a dynamical system. A weather forecast for the next ten minutes is much more reliable than a weather forecast for the next ten days.

Yakov Sinai was the first to come up with a mathematical foundation for quantifying the complexity of a given dynamical system. Inspired by Shannon's entropy in information theory, and in the framework of Kolmogorov's Moscow seminar, Sinai introduced the concept of entropy for so-called measure-preserving dynamical systems, today known as Kolmogorov-Sinai entropy. This entropy turned out to be a strong and far-reaching invariant of dynamical systems.

The Kolmogorov-Sinai entropy provides a rich generalization of Shannon entropy. In information theory a message is an infinite sequence of symbols, corresponding to a state in the framework of dynamical systems. The shift operator, shifting the sequence one step, gives the dynamics of the system. Entropy measures to what extent we are able to predict the next step in the sequence.

Another example concerns a container filled with gas. The state space of this physical system represents states of the gas, i.e. the position and the momentum of every single gas molecule, and the laws of nature determine the dynamics. Again, the degree of complexity and chaotic behaviour of the gas molecules will be the ingredients in the concept of entropy.

Summing up, the Kolmogorov-Sinai entropy measures the unpredictability of a dynamical system: the higher the unpredictability, the higher the entropy. This fits nicely with Shannon entropy, where unpredictability of the next character is equivalent to new information. It also fits with the concept of entropy in thermodynamics, where disorder increases the entropy, and disorder and unpredictability are closely related.

Kolmogorov-Sinai entropy has strongly influenced our understanding of the complexity of dynamical systems. Even though the formal definition is not that complicated, the concept has shown its strength through the highly adequate answers it provides to central problems in the classification of dynamical systems.

Mathematical formalism

Consider a dynamical system (X, B, µ, T), and let Q = {Q_1, ..., Q_k} be a partition of X into k measurable pairwise disjoint pieces. We define the T-pullback of Q as

    T^{-1}Q = \{T^{-1}Q_1, \dots, T^{-1}Q_k\}

For two given partitions Q = {Q_1, ..., Q_k} and R = {R_1, ..., R_m}, we define their refinement as

    Q \vee R = \{Q_i \cap R_j \mid \mu(Q_i \cap R_j) > 0\}

Combining these two constructions, we get the refinement of an iterated pullback:

    \bigvee_{n=0}^{N} T^{-n}Q = \{Q_{i_0} \cap T^{-1}Q_{i_1} \cap \dots \cap T^{-N}Q_{i_N}\}

The entropy of a partition Q is defined as

    H(Q) = -\sum_{m=1}^{k} \mu(Q_m) \log \mu(Q_m)

and we put

    h_\mu(T, Q) = \lim_{N \to \infty} \frac{1}{N} H\Bigl(\bigvee_{n=0}^{N} T^{-n}Q\Bigr)

The Kolmogorov-Sinai entropy of (X, B, µ, T) is defined as

    h_\mu(T) = \sup_Q h_\mu(T, Q)

where the supremum is taken over all finite measurable partitions.
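As a worked instance of these definitions (an illustration added here; this standard example is not part of the original text), take X = [0, 1) with Lebesgue measure µ, the doubling map T(x) = 2x mod 1, and the two-piece partition Q = {[0, 1/2), [1/2, 1)}. Membership in Q records the first binary digit of x, so the iterated pullback partitions X into dyadic intervals:

    \bigvee_{n=0}^{N} T^{-n}Q = \bigl\{ \bigl[\, j \cdot 2^{-(N+1)},\ (j+1) \cdot 2^{-(N+1)} \bigr) \bigm| j = 0, \dots, 2^{N+1} - 1 \bigr\}

Each of the 2^{N+1} pieces has measure 2^{-(N+1)}, so

    H\Bigl(\bigvee_{n=0}^{N} T^{-n}Q\Bigr) = -\sum_{j=1}^{2^{N+1}} 2^{-(N+1)} \log 2^{-(N+1)} = (N+1) \log 2

and therefore

    h_\mu(T, Q) = \lim_{N \to \infty} \frac{N+1}{N} \log 2 = \log 2

Since the binary digits of x determine x, the partition Q is generating, and one can show that the supremum is attained at Q, giving h_µ(T) = log 2: every iteration of the doubling map produces one new bit of information.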
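The limit above can also be checked numerically. The sketch below (my code, written for this note; the function name entropy_rate_estimate and its parameters are invented for illustration) estimates H(\bigvee_{n=0}^{N-1} T^{-n}Q)/N by sampling orbit itineraries, which agrees with the definition above in the limit of large N:

    import math
    import random
    from collections import Counter

    def entropy_rate_estimate(T, piece, depth=10, n_samples=100_000):
        # Estimate (1/depth) * H(join of T^{-n}Q, n = 0..depth-1):
        # sample points from the invariant measure, record which piece
        # of Q each of the first `depth` iterates visits, and take the
        # Shannon entropy of the resulting itinerary distribution.
        counts = Counter()
        for _ in range(n_samples):
            x = random.random()          # Lebesgue measure, T-invariant here
            word = []
            for _ in range(depth):
                word.append(piece(x))
                x = T(x)
            counts[tuple(word)] += 1
        return -sum(c / n_samples * math.log(c / n_samples)
                    for c in counts.values()) / depth

    T = lambda x: (2.0 * x) % 1.0        # doubling map on [0, 1)
    Q = lambda x: 0 if x < 0.5 else 1    # partition {[0, 1/2), [1/2, 1)}

    print(entropy_rate_estimate(T, Q))   # hovers near log 2 ≈ 0.693

The estimate is biased slightly downward for finite samples, and the supremum over all partitions is of course not computable by brute force; the sketch only illustrates how h_µ(T, Q) is built from itinerary statistics.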