March 2012 Legg Mason Capital Management Shaking the Foundation


Perspectives

Past performance is no guarantee of future results. All investments involve risk, including possible loss of principal. This material is not for public distribution outside the United States of America. Please refer to the disclosure information on the final page.

In the United States – Investment products: not FDIC insured • no bank guarantee • may lose value

Batterymarch | Brandywine Global | ClearBridge Advisors | Legg Mason Capital Management | Legg Mason Global Asset Allocation | Legg Mason Global Equities Group | Permal | Royce & Associates | Western Asset Management

March 23, 2012

Shaking the Foundation: Revisiting Basic Assumptions about Risk, Reward, and Optimal Portfolios
An Interview with Ole Peters

Investing is fundamentally a bet on the future. The problem is that we only get to live once: we have to make choices in the face of uncertainty and deal with the consequences, for better or worse. The mean-variance approach, the most common guide to building a portfolio, basically says that you should get the most return for a given level of risk, based on the expected values of the individual assets and on how they correlate with one another. One of the great aspects of an affiliation with the Santa Fe Institute (SFI) is the opportunity to exchange ideas with world-class scientists who are self-selected to be curious about the world. Ole Peters, trained as a physicist and a visiting researcher at SFI, is a great example. Ole has been discussing his concerns about the standard theories in economics for years, including portfolio theory, and he gave a talk last fall that prodded me to share his thoughts with a wider audience. At first, I considered writing about this myself. But then I figured it would be better to interview Ole and let you hear the story directly from him. Fair warning: this is not easy material.
But I believe working through these ideas and their implications is time well spent. Here’s a brief summary:

• Distinction between ergodic and non-ergodic systems. Ole starts us off with this crucial distinction. An ergodic system is one where the ensemble average and the time average are the same. For example, the proportion of heads and tails is the same either if you ask 1,000 people to flip a coin at the same time (ensemble average, because an ensemble of people are flipping simultaneously) or if you flip it yourself 1,000 times (time average, because it takes time to flip sequentially). His point is that many of the models that economists use were designed to deal with ergodic systems, yet the reality is that we live in a world that is non-ergodic. That mismatch is problematic for portfolio construction.

• How theory evolves is important. Over the last 150 years, economists have borrowed ideas from physics—generally with the goal of making economics into a harder science. In the late 1800s, Ludwig Boltzmann described ergodicity, but recognized that it worked under very narrow conditions. Economists embraced the “ergodic hypothesis” and applied it to systems that are not ergodic. The stock market is a good example.

• Optimal leverage. In a mean-variance approach, risk and reward are related to one another in a linear fashion: more risk, more potential reward. And the main way to increase risk is to add leverage through debt. The key is that the ensemble average gets “rid of fluctuations before the fluctuations really have any effect.” If you assume a time average and a multiplicative process—that is, you parlay your bets—then there is an optimal amount of leverage. Too little leverage leaves potential profit on the table, but too much leverage assures ruin.

MM: Ole, thanks for taking the time to share your thoughts with us.
In many fields, there are foundational assumptions that, once established, are often not discussed or critically examined. Because these foundations are laid by the intellectual leaders in the field, they tend to go unquestioned. You have been doing some fascinating work that challenges some of the foundational assumptions in economics and finance. One such assumption is called the "ergodic hypothesis." Can you explain the difference between an ergodic and a non-ergodic system and tell us why this is so relevant in economics and finance? OP: Thank you, Michael. It's a pleasure to discuss this with you. Ergodicity is this complicated-sounding word, and a lot of highly technical work has been done in ergodic theory, but as usual the real meat is conceptual. The conceptual part is deep and subtle and utterly fascinating, but it's not actually very hard to understand. That doesn’t mean we can leave out the mathematics. If you want to use the concept, you have to understand the mathematics—there’s “no royal road”, as Luca Pacioli put it. Without the mathematics, you can’t get beyond what I call an “incomplete treacherous intuition” of the meaning. But since this is an interview, not a mathematics lecture, I will try to convey that incomplete treacherous intuition. Here it is: In an ergodic system time is irrelevant and has no direction. Nothing changes in any significant way; at most you will see some short-lived fluctuations. An ergodic system is indifferent to its initial conditions: if you re-start it, after a little while it always falls into the same equilibrium behavior. For example, say I gave 1,000 people one die each, had them roll their die once, added all the points rolled, and divided by 1,000. That would be a finite-sample average, approaching the ensemble average as I include more and more people. Now say I rolled a die 1,000 times in a row, added all the points rolled and divided by 1,000. 
That would be a finite-time average, approaching the time average as I keep rolling that die. One implication of ergodicity is that ensemble averages will be the same as time averages. In the first case, it is the size of the sample that eventually removes the randomness from the system. In the second case, it is the time that I’m devoting to rolling that removes randomness. But both methods give the same answer, within errors. In this sense, rolling dice is an ergodic system.

I say “in this sense” because if we bet on the results of rolling a die, wealth does not follow an ergodic process under typical betting rules. If I go bankrupt, I’ll stay bankrupt. So the time average of my wealth will approach zero as time passes, even though the ensemble average of my wealth may increase. A precondition for ergodicity is stationarity, so there can be no growth in an ergodic system. Ergodic systems are zero-sum games: things slosh around from here to there and back, but nothing is ever added, invented, created or lost. No branching occurs in an ergodic system, no decision has any consequences because sooner or later we'll end up in the same situation again and can reconsider. The key is that most systems of interest to us, including finance, are non-ergodic.

It's probably best to step back and start at the beginning. The word "ergodicity" was coined during the development of statistical mechanics. Ludwig Boltzmann, an Austrian physicist, invented it. He called ergodic systems "monodic" at first. Here’s the etymology: a "monodic" system is one that possesses only one (mon-) path (-odos) through the space of its possible states. The word became "ergodic" because Boltzmann considered systems of fixed energy (or work = ergon), and the idea was that the one path covers all the states allowed by that energy (the energy shell).
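Ole's die example, and the betting variant that breaks ergodicity, can be sketched in a few lines of simulation. The betting rule below (wealth multiplied by 1.5 on a win and 0.6 on a loss, at even odds) is an assumed illustration, not a rule from the interview; it is chosen so that the expectation value of wealth grows while the time-average growth rate is negative.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- Ergodic: rolling a die ------------------------------------------
# Ensemble average (many people, one roll each) and time average
# (one person, many rolls) agree: both approach 3.5.
ensemble_avg = rng.integers(1, 7, size=100_000).mean()
time_avg = rng.integers(1, 7, size=100_000).mean()

# --- Non-ergodic: parlayed bets on the die ---------------------------
# Assumed rule: each round, wealth is multiplied by 1.5 (win) or 0.6
# (lose) with equal probability. The expected multiplier is 1.05, so
# the ensemble average of wealth grows 5% per round. But the
# time-average growth rate is 0.5*ln(1.5) + 0.5*ln(0.6), about -0.053
# per round, so an individual player's wealth decays toward zero.
steps = 100
factors = rng.choice([1.5, 0.6], size=(10_000, steps))
final_wealth = factors.prod(axis=1)   # final wealth of each player

print(f"die ensemble avg {ensemble_avg:.2f}, time avg {time_avg:.2f}")
print(f"ensemble expectation of final wealth: {1.05**steps:.1f}")
print(f"median (typical) final wealth:        {np.median(final_wealth):.6f}")
```

The ensemble expectation after 100 rounds is large, yet the median player ends with a small fraction of his starting wealth: for this multiplicative process the ensemble average and the time average part ways.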
If this is the case, Boltzmann argued, the system will over time—in his case a second may be long enough—visit all the relevant states that it can access. The system will visit each state with some relative frequency. We can mathematically treat those frequencies as probabilities, which has the incredibly nice consequence that the long-time averages of the quantities we're interested in are formally the same as expectation values arising from the relative frequencies that we interpreted as probabilities. Just like in the dice example. This makes the mathematics extremely simple. But it is a trick (Boltzmann literally called it a trick): we calculate expectation values that are a priori irrelevant. In this very special case of an ergodic system, expectation values are the same as time averages, and that's why we're interested in these special cases.

Time averages are interesting because they are what we observe in physics. We usually measure some sort of macroscopic property, like the pressure in a balloon. That pressure, in terms of the microscopic model, is the rate of momentum transfer per unit area to the balloon membrane resulting from a gazillion collisions of molecules with the membrane. Any device that we use to measure this is so sluggish that it will only give us a long-time average value of that momentum transfer. The very clever insight of Boltzmann was that under very special conditions, we just have to calculate an expectation value of the rate of momentum transfer per area, and that will coincide with the time-average pressure that we actually observe.

So, practically, ergodic means that time averages are the same as ensemble averages, or expectation values. Non-ergodicity means that they are different. Since there are many more ways of being different than there are of being identical, it comes as no surprise that most systems are non-ergodic.
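The optimal-leverage point from the summary can also be made concrete with a small sketch. The bet below, returning +50% or -40% per round at even odds, is an assumed example (the payoffs are my choice for illustration, not Ole's); it scans the time-average growth rate g(l) = E[ln(1 + l*r)] across leverage levels l.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed risky bet: +50% or -40% per round with equal probability.
# The expected return per round is +5%, so the ensemble average of
# wealth grows at any leverage. For a parlayed (multiplicative)
# strategy, what matters is instead the time-average growth rate
# g(l) = E[ln(1 + l*r)], which peaks at an intermediate leverage
# (here l = 0.25 analytically) and turns negative beyond it.
returns = rng.choice([0.5, -0.4], size=200_000)

growth = {}
for leverage in [0.0, 0.25, 0.5, 1.0, 2.0]:
    growth[leverage] = np.mean(np.log1p(leverage * returns))
    print(f"leverage {leverage:4.2f} -> time-avg growth {growth[leverage]:+.4f}")
```

Too little leverage leaves growth on the table, and too much drives the time-average growth rate negative, which is ruin for anyone who parlays: exactly the asymmetry the mean-variance picture, built on ensemble averages, does not see.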