The Stochastic Thermodynamics of Computation
David H. Wolpert
Santa Fe Institute, Santa Fe, New Mexico
Complexity Science Hub, Vienna
Arizona State University, Tempe, Arizona
http://davidwolpert.weebly.com

To cite this article: David H. Wolpert 2019 J. Phys. A: Math. Theor., https://doi.org/10.1088/1751-8121/ab0850

One of the central concerns of computer science is how the resources needed to perform a given computation depend on that computation. Moreover, one of the major resource requirements of computers — ranging from biological cells to human brains to high-performance (engineered) computers — is the energy used to run them, i.e., the thermodynamic costs of running them. Those thermodynamic costs of performing a computation have been a long-standing focus of research in physics, going back (at least) to the early work of Landauer, in which he argued that the thermodynamic cost of erasing a bit in any physical system is at least kT ln[2].

One of the most prominent aspects of computers is that they are inherently non-equilibrium systems. However, the research by Landauer and co-workers was done when non-equilibrium statistical physics was still in its infancy, requiring them to rely on equilibrium statistical physics.
This limited the breadth of issues this early research could address, leading them to focus on the number of times a bit is erased during a computation — an issue having little bearing on the central concerns of computer science theorists.

Since then there have been major breakthroughs in nonequilibrium statistical physics, leading in particular to the new subfield of "stochastic thermodynamics". These breakthroughs have allowed us to put the thermodynamic analysis of bit erasure on a fully formal (nonequilibrium) footing. They are also allowing us to investigate the myriad aspects of the relationship between statistical physics and computation, extending well beyond the issue of how much work is required to erase a bit.

In this paper I review some of this recent work on the "stochastic thermodynamics of computation". After reviewing the salient parts of information theory, computer science theory, and stochastic thermodynamics, I summarize what has been learned about the entropic costs of performing a broad range of computations, extending from bit erasure to loop-free circuits to logically reversible circuits to information ratchets to Turing machines. These results reveal new, challenging engineering problems for how to design computers to have minimal thermodynamic costs. They also allow us to start to combine computer science theory and stochastic thermodynamics at a foundational level, thereby expanding both.

I. INTRODUCTION

A. Computers ranging from cells to chips to human brains

Paraphrasing Landauer, real world computation involves thermodynamic costs [3] — and those costs can be massive. Estimates are that computing accounted for ∼4% of total US energy usage in 2018 — and all of that energy ultimately goes into waste heat.

Such major thermodynamic costs arise in naturally occurring, biological computers as well as in the artificial computers we have constructed. Indeed, the comparison of thermodynamic costs in artificial and biological computers can be fascinating. For example, a large fraction of the energy budget of a cell goes to translating RNAs into sequences of amino acids (i.e., proteins) in the cell's ribosome. The thermodynamic efficiency of this computation — the amount of heat produced by a ribosome per elementary operation — is many orders of magnitude superior to the thermodynamic efficiency of our current artificial computers [4]. Are there "tricks" that cells use to reduce their costs of computation that we could exploit in our artificial computers?

More speculatively, the human brain consumes ∼10–20% of all the calories a human being requires to live [5]. The need to continually gather all those extra calories is a massive evolutionary fitness cost that our ancestors faced. Does this fitness cost explain why human levels of intelligence (i.e., of neurobiological computation) are so rare in the history of life on earth?

These examples involving both artificial and biological computers hint at the deep connections between computation and statistical physics. Indeed, the relationship between computation and statistical physics has arguably been a major focus of physics research at least since Maxwell's demon was introduced.
A particularly important advance was made in the middle of the last century, when Landauer (and then Bennett) famously argued that erasing a bit in a computation results in an entropic cost of at least kT ln[2], where k is Boltzmann's constant and T is the temperature of a heat bath connected to the system.

This early work was grounded in the tools of equilibrium statistical physics. However, computers are highly nonequilibrium systems. As a result, this early work was necessarily semiformal, and there were many questions it could not address.

On the other hand, in the last few decades there have been major breakthroughs in nonequilibrium statistical physics. Some of the most important of these breakthroughs now allow us to analyze the thermodynamic behavior of any system that can be modeled with a time-inhomogeneous continuous-time Markov chain (CTMC), even if it is open, arbitrarily far from equilibrium, and undergoing arbitrary external driving. In particular, we can now decompose the time-derivative of the (Shannon) entropy of such a system into an "entropy production rate", quantifying the rate of change of the total entropy of the system and its environment, minus an "entropy flow rate", quantifying the rate of entropy exiting the system into its environment. Crucially, the entropy production rate is non-negative, regardless of the CTMC. So if it ever adds a nonzero amount to system entropy, the subsequent evolution of the system cannot undo that increase in entropy. (For this reason it is sometimes referred to as irreversible entropy production.) This is the modern understanding of the second law of thermodynamics, for systems undergoing Markovian dynamics. In contrast to entropy production, entropy flow can be negative or positive. So even if entropy flow increases system entropy during one time interval (i.e., entropy flows into the system), often the subsequent evolution of the system can undo that increase.

This decomposition of the rate of change of entropy is the starting point for the field of "stochastic thermodynamics". However, the decomposition actually holds even in scenarios that have nothing to do with thermodynamics, where there are no heat baths, work reservoirs, or the like coupled to the system. Accordingly, I will refer to the dynamics of the entropy production rate, entropy flow rate, etc., as the "entropy dynamics" of the physical system. In addition, integrating this decomposition over a non-zero time interval, we see that the total change in entropy of the system equals the total entropy production minus the total entropy flow.
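To make the decomposition concrete, here is one standard way it is written for a CTMC over a finite state space, in units where Boltzmann's constant equals 1. The notation used here (rate matrix W, distribution p, probability current J) is chosen only for illustration and is not fixed by the text above; the paper's later sections give the precise definitions it actually uses.

\[
\frac{dp_x(t)}{dt} = \sum_{x'} \Big[ W_{xx'}(t)\,p_{x'}(t) - W_{x'x}(t)\,p_x(t) \Big],
\qquad
S(t) = -\sum_x p_x(t)\ln p_x(t),
\]
\[
\frac{dS(t)}{dt}
=
\underbrace{\tfrac{1}{2}\sum_{x \neq x'} J_{xx'}(t)\,\ln\!\frac{W_{xx'}(t)\,p_{x'}(t)}{W_{x'x}(t)\,p_x(t)}}_{\text{entropy production rate}\;\geq\;0}
\;-\;
\underbrace{\tfrac{1}{2}\sum_{x \neq x'} J_{xx'}(t)\,\ln\!\frac{W_{xx'}(t)}{W_{x'x}(t)}}_{\text{entropy flow rate}},
\qquad
J_{xx'}(t) := W_{xx'}(t)\,p_{x'}(t) - W_{x'x}(t)\,p_x(t).
\]

Here the first underbraced term is non-negative for any rate matrix, which is the statement of the second law referred to above, while the second term can take either sign.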
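The following short numerical sketch (mine, not taken from the paper) checks this decomposition for a two-state CTMC: it evaluates dS/dt, the entropy production rate, and the entropy flow rate directly from the formulas above, confirms that dS/dt equals their difference and that the production rate is non-negative, and also evaluates the Landauer bound kT ln 2 at room temperature. The particular rate matrix and distribution are arbitrary illustrative choices.

# Minimal numerical sketch: entropy dynamics of a two-state CTMC,
# in units where Boltzmann's constant k = 1. The rate matrix W and
# distribution p below are illustrative assumptions, not values from the paper.
import numpy as np

# W[x, xp] = transition rate from state xp to state x (off-diagonal entries only)
W = np.array([[0.0, 2.0],
              [1.0, 0.0]])
p = np.array([0.9, 0.1])   # current distribution over the two states

def entropy_rates(W, p):
    """Return (dS/dt, entropy production rate, entropy flow rate)."""
    dS, EP, EF = 0.0, 0.0, 0.0
    n = len(p)
    for x in range(n):
        for xp in range(n):
            if x == xp:
                continue
            J = W[x, xp] * p[xp] - W[xp, x] * p[x]   # net probability current xp -> x
            EP += 0.5 * J * np.log((W[x, xp] * p[xp]) / (W[xp, x] * p[x]))
            EF += 0.5 * J * np.log(W[x, xp] / W[xp, x])
            dS += 0.5 * J * np.log(p[xp] / p[x])
    return dS, EP, EF

dS, EP, EF = entropy_rates(W, p)
print(f"dS/dt = {dS:.4f}, EP rate = {EP:.4f} (>= 0), EF rate = {EF:.4f}")
print(f"check: EP - EF = {EP - EF:.4f}")   # matches dS/dt

# Landauer bound for erasing one bit at T = 300 K, in joules:
kB, T = 1.380649e-23, 300.0
print(f"k T ln 2 = {kB * T * np.log(2):.3e} J")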