Complex Social Systems: a guided exploration of concepts and methods
Information Theory
Martin Hilbert (Dr., PhD)

Today's questions
I. What are the formal notions of data, information & knowledge?
II. What is the role of information theory in complex systems? …the 2nd law and life…
III. What is the relation between information and growth?

Complex systems handle information
"Although they [complex systems] differ widely in their physical attributes, they resemble one another in the way they handle information. That common feature is perhaps the best starting point for exploring how they operate." (Gell-Mann, 1995, p. 21)
Source: Gell-Mann, M. (1995). The Quark and the Jaguar: Adventures in the Simple and the Complex. New York: St. Martin's Griffin.

Complex systems science as computer science
"What we are witnessing now is … actually the beginning of an amazing convergence of theoretical physics and theoretical computer science"
Source: Chaitin, G. J. (2002). Meta-Mathematics and the Foundations of Mathematics. EATCS Bulletin, 77(June), 167–179.

Society computes
o Numerical: agent-based models
o Analytical: information theory

Why now?
Thermodynamics? Aerodynamics? Hydrodynamics? Electricity? etc.
[Figure: paired images for each field]

Claude Shannon, Alan Turing, John von Neumann

Classics on the information society:
o Bell, D. (1973). The Coming of Post-Industrial Society: A Venture in Social Forecasting. New York, NY: Basic Books. (15,000 citations)
  "industrial organization of goods" => "industrial organization of information"
o Beniger, J. (1986). The Control Revolution: Technological and Economic Origins of the Information Society. Cambridge, MA: Harvard University Press. (3,000 citations)
  "exploitation of information" => "control of social & economic processes"
o Castells, M. (1999). The Information Age, Volumes 1–3: Economy, Society and Culture. Cambridge (Mass.); Oxford: Wiley-Blackwell. (35,000 citations)
  "society lives in networks" => "information as well"

Shannon, C. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27, 379–423, 623–656.
Shannon, C. E., & Weaver, W. (1949). The Mathematical Theory of Communication. University of Illinois Press.
U.S. Congress, Office of Technology Assessment. (1995). Global Communications: Opportunities for Trade and Aid (OTA-ITC-642). Washington, DC: U.S. Government Printing Office.

From the Source Coding Theorem (information) to the Channel Coding Theorem (communication): Claude Shannon (1948)

Information Theory Primer: information counts differences
Syntactic and semantic information
o In information theory and computer science, the definition of information consists in its quantification, just as the definitions of "heat" (°C; °F), "speed" (mph; km/h), or "water" (g; l) are independent of whether it is good, bad, or useless for you.
o To assign a value to the meaning of information, we first require its quantity: [value / unit of information], [US$ / unit of information], [fitness increase / unit of information], etc.
Ralph Hartley (1888–1970)

Ralph Hartley & "the difference"
o Information consists in the differentiation among alternative possibilities: no information without differences!
o Information only exists if there are different choices.
o Information can be understood as the opposite of uncertainty: the more uncertainty we have regarding the different possibilities, the less information we have; the more information we have, the less uncertainty.

What's the most basic difference?
o Binary: [black, white]; [up, down]; [left, right]; [head, tail]; [42, non-42]; [no-current, current]; [0, 1] (!)
o How much information is revealed with a binary decision (coin flip)?
o To make sense, each additional unit should provide as much information as the previous one: an additive measure, in which each additional symbol multiplies the number of distinguishable choices at that additional step.
Source: Hartley, R. V. L. (1928). Transmission of Information. Bell System Technical Journal (International Congress of Telegraphy and Telephony, Lake Como, Italy, 1927), 535–563.

Information Theory Primer: finding the right measure
How many binary symbols (coin flips) are needed to describe 8 differences?
[Figure: a binary tree; the 1st coin flip splits into [h], [t], the 2nd into [h,h], [h,t], [t,h], [t,t], the 3rd into the 8 codewords [111] … [000], a 4th coin flip…?]
Answer: through the "2-ary uncertainty" revealed at each step:
o 2 choices = 1 symbol (coin flip): $2^1 = 2$ choices
o 4 choices = 2 symbols: $2^2 = 4$ choices
o 8 choices = 3 symbols: $2^3 = 8$ choices, since $2 \cdot 2 \cdot 2 = 2^3 = 8$ and $\log_2(2 \cdot 2 \cdot 2) = \log_2 8 = 3$
o 16 choices = 4 symbols: $2^4 = 16$ choices
Source: Hartley (1928)

Information Theory Primer: base of log defines metric system
How many senary (6-ary) symbols are needed to describe 36 choices?
[Figure: a 6-ary tree; the 1st roll of a die splits into 6 branches, the 2nd roll into 36 outcomes such as [1,3], [2,4], [4,4]]
The base of the logarithm defines the metric system, just like feet vs. meters or liters vs. gallons.
6-ary code:
o 6 choices = 1 symbol (die roll): $6^1 = 6$ choices
o 36 choices = 2 symbols: $6^2 = 36$ choices
o 216 choices = 3 symbols: $6^3 = 216$ choices, since $6 \cdot 6 \cdot 6 = 6^3 = 216$ and $\log_6(6 \cdot 6 \cdot 6) = \log_6 216 = 3$
Source: Hartley (1928)
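Hartley's counting argument above fits in one line of code. A minimal Python sketch (the function name and the printed tables are illustrative additions, not from the slides): distinguishing n equally likely alternatives takes $\log_b n$ b-ary symbols, whether b = 2 (coin flips) or b = 6 (dice rolls).

```python
import math

def symbols_needed(n_choices: int, base: int) -> float:
    """Hartley's measure: how many base-ary symbols distinguish n_choices alternatives."""
    return math.log(n_choices, base)

# Binary symbols (coin flips): 2, 4, 8, 16 choices -> 1, 2, 3, 4 flips
for n in (2, 4, 8, 16):
    print(f"{n:2d} choices = {symbols_needed(n, 2):.0f} coin flips")

# Senary symbols (dice rolls): 6, 36, 216 choices -> 1, 2, 3 rolls
for n in (6, 36, 216):
    print(f"{n:3d} choices = {symbols_needed(n, 6):.0f} dice rolls")
```

Changing the base of the logarithm only changes the unit of measurement, $\log_6 n = \log_2 n / \log_2 6$, exactly like converting feet to meters.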
Information Theory Primer: Shannon's (1948) idea of the bit
Shannon's game of twenty questions: $2^{20} = 1{,}048{,}576$ => …down to 5 square miles!
Claude Shannon (1916–2001)
Source: Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27, 379–423, 623–656.

Information Theory Primer: coding
[Figure: a yes/no decision tree over a 32-letter alphabet (A–Z plus Ñ, Á, É, Í, Ó, Ú); each of 5 successive yes/no questions contributes one bit (bits 1 2 3 4 5).]
Transmit "g" (from the word "genius"):
=> "g" = yes-yes-no-no-yes = 11001

Information Theory Primer: coding + probability distribution
[Figure: the same 32-letter yes/no tree, now drawn together with the probability distribution of the letters.]
Transmit "i":
=> "i" = yes-no-yes-yes-yes = 10111
=> consider REDUNDANCY

Source Coding Theorem: what's the purest form of information? ! ENTROPY !
=> consider REDUNDANCY => COMPRESSION
Transmit less data, so you can transmit more information (…with the same amount of data symbols…):
"I do not ne_d to comm_______ ev_ry det_il, _u kn_w what I mean, and _u rec_ive all the info____ anywa_s, right?"

Source Coding Theorem: what's the purest form of information? ! ENTROPY !

Probability of letters            Probability of words
in the English alphabet:          in the English language:
  E: 13%                            "the": 10%
  T: 10%                            "of": 5.1%
  A: 8%                             "in": 2.7%
  O: 7%                             …
  …                                 "with": 0.66%
  Q: 0.121%                         "from": 0.64%
  Z: 0.077%                         …

COMPRESSION: transmit less, so you can transmit more!
Source: Purandare, A. (2008). JPEG vs JPEG2000 Comparison.
P.S. There is also "lossy compression", but that is simply the elimination of some information (reducing quality / detail / information). Real data compression is "lossless".
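The alphabet trees above implement a fixed-length 5-bit code: $\log_2 32 = 5$ yes/no questions single out one of 32 letters. A Python sketch, assuming each question asks "is the letter in the first half of the remaining range?" with yes written as 1 (that reading of the figure is an assumption, but it reproduces both codewords from the slides):

```python
# 32-letter alphabet from the slides: A-Z plus Ñ, Á, É, Í, Ó, Ú
ALPHABET = list("ABCDEFGHIJKLMNOPQRSTUVWXYZÑÁÉÍÓÚ")   # len = 32 -> log2(32) = 5 bits

def encode(letter: str) -> str:
    """Fixed-length code: 5 yes/no questions, each halving the remaining range."""
    lo, hi = 0, len(ALPHABET)
    idx = ALPHABET.index(letter)
    bits = ""
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if idx < mid:      # "yes" = first half -> bit 1
            bits += "1"
            hi = mid
        else:              # "no" = second half -> bit 0
            bits += "0"
            lo = mid
    return bits

print(encode("G"))   # 11001 = yes-yes-no-no-yes, as on the slide
print(encode("I"))   # 10111 = yes-no-yes-yes-yes, as on the slide
```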
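With skewed letter probabilities like those in the table above, the unavoidable average cost per letter is the Shannon entropy $H = -\sum_i p_i \log_2 p_i$, which falls below the 5 bits of the fixed code; the gap is the redundancy that lossless compression removes. A sketch (the full frequency list is made of approximate published values, added here for illustration and consistent in spirit with the slide's figures):

```python
import math

def entropy_bits(weights):
    """Shannon entropy H = -sum(p * log2 p), in bits per symbol (weights are normalized)."""
    total = sum(weights)
    return -sum((w / total) * math.log2(w / total) for w in weights if w > 0)

# Uniform code over 32 letters: every letter costs log2(32) = 5 bits.
print(entropy_bits([1 / 32] * 32))     # 5.0

# Approximate English letter frequencies in percent (rounded published
# values -- treat as illustrative), ordered E,T,A,O,I,N,S,H,R,D,L,C,U,
# M,W,F,G,Y,P,B,V,K,J,X,Q,Z:
english = [12.7, 9.1, 8.2, 7.5, 7.0, 6.7, 6.3, 6.1, 6.0, 4.3, 4.0,
           2.8, 2.8, 2.4, 2.4, 2.2, 2.0, 2.0, 1.9, 1.5, 1.0, 0.8,
           0.15, 0.15, 0.10, 0.07]
print(entropy_bits(english))           # roughly 4.2 bits < 5 bits
```

Since $H \approx 4.2 < 5$, roughly one bit in six of the fixed 5-bit code is redundant, before even counting the much larger redundancy between letters (word frequencies, grammar).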
Source Coding Theorem: what's the purest form of information? ! ENTROPY !
1993!

Telecommunication (number of bits, "drops"): sum up, over device types, the product of the number of telecom devices and their performance:
o Infrastructure (number of telecom devices): 7% per year
o Hardware (performance of the devices): 8% per year
o Software (compression): 10.4% per year
=> Number of telecom bits: ≈ 28% per year (the growth rates multiply: $1.07 \times 1.08 \times 1.104 \approx 1.28$)

Storage (number of bits, "stones"): sum up, over device types, the product of the number of storage devices and their performance:
o Infrastructure (number of storage devices): 8% per year
o Hardware (performance of the devices): 5% per year
o Software (compression): 11% per year
=> Number of storage bits: ≈ 25% per year

The 2nd law of thermodynamics: how structure and information relate
"The law that entropy always increases holds, I think, the supreme position among the laws of Nature." (Eddington, 1927)
[Figure: three arrangements of six items over positions 1–6 at times t1, t2, t3]
o t1: only 1 way to create this structure => no uncertainty = min. entropy
o t2: 20 ways to create this structure
o t3: 6! = 720 ways to create this structure => max. uncertainty = max. entropy
Things decay as time passes = things become more uniformly distributed as everything interacts with everything = as time goes by, it is most likely (almost certain) that the most likely thing will happen…
Source: Eddington, A. S. (1927). The Nature of the Physical World. Kessinger Publishing.

As time goes by, entropy increases…
…BUT what about Isaac Asimov's "Last Question" ever asked: "Can entropy ever be reversed?"
Life reverses entropy all the time! (Erwin Schrödinger, 1887–1961)
Source: Asimov, I. (1956). The Last Question. Science Fiction Quarterly, November 1956.

How does evolution "communicate"?
[Figure: the environment favors one variant 3/4 of the time and the other 1/4 of the time; the population shares update generation by generation: 50%/50% => 33%/67% => 20%/80%]
[Figure: continuing: 20%/80% => 11%/89% => 6%/94%]

How does evolution "communicate"?
t1 => t2 => … => tn (one state 25% of the time … the other 75% of the time)
The eco-"system" communicates an environmental structure to the system, which stores this information in its own structure.

How much information does an evolutionary dynamic communicate?
[Figure: a yes/no decision tree over the letters A–P, as before.]
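Back to the arrangement counting on the 2nd-law slide: a quick Python check of the three counts. That the middle panel corresponds to placing 3 identical items among 6 slots, $\binom{6}{3} = 20$, is my assumption about the figure; $\log_2(\text{ways})$ then gives the uncertainty of each macrostate in bits:

```python
import math

# Number of distinguishable arrangements (microstates) behind each slide panel:
t1 = 1                                                             # one prescribed structure
t2 = math.factorial(6) // (math.factorial(3) * math.factorial(3))  # C(6,3) = 20 (assumed reading)
t3 = math.factorial(6)                                             # 6! = 720 unconstrained orderings

for label, ways in [("t1", t1), ("t2", t2), ("t3", t3)]:
    print(f"{label}: {ways:3d} ways = {math.log2(ways):5.2f} bits of uncertainty")
# t1: 0 bits (no uncertainty = min. entropy) ... t3: ~9.49 bits (max. uncertainty = max. entropy)
```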
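Finally, the percentage sequence on the evolution slides can be read as discrete replicator dynamics. The relative fitness $w_A : w_B = 1 : 2$ below is an illustrative parameterization of my own (the slides only state how often the environment favors each type), chosen because it exactly reproduces 50/50 => 33/67 => 20/80 => 11/89 => 6/94:

```python
# Discrete replicator dynamics: the share of each type grows in
# proportion to its fitness relative to the population average.
w_a, w_b = 1.0, 2.0   # assumed relative fitness of types A and B
p = 0.5               # initial share of type A

shares = []
for _ in range(5):
    shares.append(round(100 * p))
    mean_fitness = p * w_a + (1 - p) * w_b
    p = p * w_a / mean_fitness            # replicator update for type A

print(shares)   # [50, 33, 20, 11, 6], i.e. 50/50 -> 33/67 -> 20/80 -> 11/89 -> 6/94
```

Each generation multiplies the odds of type A by $w_A / w_B = 1/2$; the population's composition thereby stores the environmental regularity in its own structure, which is the sense in which the eco-"system" communicates information to the system.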