<<

Santa Fe Institute Working Paper 2017-01-02 arxiv.org:1612.08459 [cond-mat.stat-mech] Thermodynamics of

Cina Aghamohammadi∗ and James P. Crutchfield† Complexity Sciences Center and Department of Physics, University of California at Davis, One Shields Avenue, Davis, CA 95616 (Dated: January 6, 2017) We analyze the thermodynamic costs of the three main approaches to generating random num- bers via the recently introduced Information Processing Second Law. Given access to a specified source of , a random number generator (RNG) produces samples from a desired tar- get probability distribution. This differs from pseudorandom number generators (PRNG) that use wholly deterministic algorithms and from true random number generators (TRNG) in which the randomness source is a physical system. For each class, we analyze the thermodynamics of genera- tors based on algorithms implemented as finite-state machines, as these allow for direct bounds on the required physical resources. This establishes bounds on heat dissipation and work consumption during the operation of three main classes of RNG algorithms—including those of von Neumann, Knuth and Yao, and Roche and Hoshi—and for PRNG methods. We introduce a general TRNG and determine its thermodynamic costs exactly for arbitrary target distributions. The results highlight the significant differences between the three main approaches to random number generation: One is work producing, one is work consuming, and the other is potentially dissipation neutral. Notably, TRNGs can both generate random numbers and convert thermal energy to stored work. These thermodynamic costs on information creation complement Landauer’s limit on the irreducible costs of information destruction.

PACS numbers: 05.70.Ln 89.70.-a 05.20.-y 02.50.-r Keywords: entropy rate, nonequilibrium steady state, Maxwell’s Demon, Second Law of Thermodynamics

I. INTRODUCTION The recurrent problem in all of these strategies is demonstrating that the numbers generated were, in fact, Random number generation is an essential tool these random. This concern eventually lead to Chaitin’s and days in simulation and analysis. Applications range from Kolmogorov’s attempts to find an algorithmic foundation statistical sampling [1], numerical simulation [2], cryp- for probability theory [21–26]. Their answer was that an tography [3], program validation [4], and numerical anal- object is random if it cannot be compressed: random ysis [5] to machine learning [6] and decision making in objects are their own minimal description. The theory games [7] and in politics [8]. More practically, a signif- exacts a heavy price, though: identifying randomness is icant fraction of all the simulations done in physics [9] uncomputable [25]. employ random numbers to greater or lesser extent. Despite the formal challenges, many physical systems Random number generation has a long history, full of appear to behave randomly. Unstable nuclear decay pro- deep design challenges and littered with pitfalls. Initially, cesses obey Poisson statistics [27], thermal noise obeys printed tables of random digits were used for scientific Gaussian statistics [28], cosmic background radiation ex- work, first documented in 1927 [10]. A number of analog hibits a probabilistically fluctuating temperature field physical systems, such as reversed-biased Zener diodes [29], quantum state measurement leads to stochastic out- [11] or even Lava® Lamps [12], were also employed as comes [30–32], and fluid turbulence is governed by an un- sources of randomness; the class of so-called noise gen- derlying chaotic dynamic [33]. When such physical sys- erators. One of the first digital machines that generated tems are used to generate random numbers one speaks random numbers was built in 1939 [13]. With the advent of true random number generation [34]. of digital computers, analog methods fell out of favor, dis- Generating random numbers without access to a source placed by a growing concentration on arithmetical meth- of randomness—that is, using arithmetical methods on a ods that, running on deterministic digital computers, of- deterministic finite-state machine, whose logic is phys- fered flexibility and reproducibility. An early popular ically isolated—is referred to as pseudorandom number approach to digital generation was the linear congruen- generation, since the numbers must eventually repeat and tial method introduced in 1950 [14]. Since then many so, in principle, are not only not random, but are ex- new arithmetical methods have been introduced [15–20]. actly predictable [35, 36]. John von Neumann was rather decided about the pseudo-random distinction: “Any one who considers arithmetical methods of producing random digits is, of course, in a state of sin” [37]. Nonetheless, ∗ [email protected] these and related methods dominate today and perform † [email protected] 2 well in many applications. II. RANDOM NUMBER GENERATION Sidestepping this concern by assuming a given source of randomness, random number generation (RNG) [38] Take a fair coin as our source of randomness.1 Each flip is a complementary problem about the transformation of results in a Head or a Tail with 50% 50% probabilities. − randomness: Given a specific randomness source, whose However, we need a coin that 1/4 of the time generates statistics are inadequate somehow, how can we convert Heads and 3/4 of the time Tails. Can the series of fair it to a source that meets our needs? And, relatedly, how coin flips be transformed? One strategy is to flip the coin efficiently can this be done? twice. If the result is Head-Head, we report Heads. Else, Our interest is not algorithmic efficiency, but ther- we report Tails. The reported sequence is equivalent to modynamic efficiency, since any practical generation of flipping a coin with a bias 1/4 for Heads and 3/4 for random numbers must be physically embedded. What Tails. are the energetic costs—energy dissipation and power Each time we ask for a sample from the biased distri- inputs—to harvest a given amount of information? This bution we must flip the fair coin twice. Can we do better? is a question, at root, about a particular kind of infor- The answer is yes. If the first flip results in a Tail, inde- mation processing—viz., information creation—and the pendent of the second flip’s result, we should report Tail. demands it makes on its physical substrate. In this light, We can take advantage of this by slightly modifying the it should be seen as exactly complementary to Landauer’s original strategy. If the first flip results in a Tail, stop. well known limit on the thermodynamic costs of informa- Do not flip a second time, simply report a Tail, and start tion destruction (or erasure) [39, 40]. over. With this modification, 1/2 of the time we need a Fortunately, there has been tremendous progress single flip and 1/2 the time we need two flips. And so, bridging information processing and the nonequilibrium on average we need 1.5 flips to generate the distribution thermodynamics required to support it [41, 42]. This of interest. This strategy reduces the use of the fair coin information thermodynamics addresses processes that “resource” by 25%. range from the very small scale, such as the operation Let’s generalize. Assume we have access to a source of nanoscale devices and molecular dynamics [43], to the randomness that generates the distribution p : i { i ∈ A} cosmologically large, such the character and evolution over discrete alphabet . We want an algorithm that A of black holes [44, 45]. Recent technological innovations generates another target distribution q : j from { j ∈ B} allowed many of the theoretical advances to be experi- samples of the given source. (Generally, the source of mentally verified [46, 47]. The current state of knowl- randomness p can be known or unknown to us.) In { i} edge in this rapidly evolving arena is reviewed in Refs. this, we ask for a single correct sample from the target [48–50]. Here, we use information thermodynamics to distribution. This is the immediate random number gen- describe the physical limits on random number genera- eration problem: Find an algorithm that minimizes the tion. Though the latter is often only treated as a purely expected number of necessary samples of the given source abstract mathematical subject, practicing scientists and to generate one sample of the target.2 engineers know how essential random number generation The goal in the following is to analyze the thermo- is in their daily work. The following explores the under- dynamic costs when these algorithmically efficient algo- lying necessary thermodynamic resources. rithms are implemented in a physical substrate. This First, Sec.II addresses random number generation, question parallels that posed by Landauer [39, 40]: What analyzing the thermodynamics of three algorithms, and is the minimum thermodynamic cost to erase a bit of in- discusses physical implementations. Second, removing formation? That is, rather than destroying information, the requirement of an input randomness source, Sec.III we analyze the costs of creating information with desired turns to analyze pseudorandom number generation and statistical properties given a source of randomness. its costs. Third, Sec.IV analyzes the thermodynamics Bounding the Energetics: The machine implementing of true random number generation. Finally, the conclu- the algorithm transforms symbols on an input string sam- sion compares the RNG strategies and their costs and pled from an information reservoir to an output symbol suggests future problems.

1 Experiments reveal this assumption is difficult if not impossible to satisfy. Worse, if one takes the full dynamics into account, a flipped physical coin is quite predictable [51]. 2 A companion is the batch random number generation problem: Instead of a single sample, generate a large number of inputs and outputs. The challenge is to find an algorithm minimizing the ratio of the number of inputs to outputs [52–54]. 7 7 7 7 processing [65] and as a programming model for engi- 7 7 processing [65] and as a programming model for engi- neering artificial systems [66, 67]. Moreover, CRN chem- processingprocessing [65] and [65] as and a programming as a programming model model for engi- for engi- processingneeringprocessing artificial [65] and [65 systems] as and a programming as [66 a, 67 programming]. Moreover, model model CRN for engi- chem- for engi- neeringicalneering implementations artificial artificial systems systems have [66, been67 [].66 studied Moreover,, 67]. Moreover, in detail CRN chem- [CRN69, 70 chem-]. neeringicalneering implementations artificial artificial systems systems have [66, been67 [].66 studied Moreover,, 67]. Moreover, in detail CRN chem- [CRN69, 70 chem-]. icalCRNs implementationsical are implementations also e haveciently been have Turing-universal studied been studied in detail in [68 detail [69],, which70 [].69, 70]. 7 icalCRNs implementationsical are implementations also e haveciently been have Turing-universal studied been studied in detail in [68 detail [69],, which70 [].69, 70]. CRNspowerCRNs are makes also are them e alsociently appealing. eciently Turing-universal Turing-universal One of their [68 main], which [68 appli-], which CRNspowerCRNs are makes also are them e alsociently appealing. eciently Turing-universal Turing-universalOne of their [68 main], which [68 appli-], which 7 7 powercationsprocessingpower makes is makesdeterministic them [65] and themappealing. asappealing. a function programming One computationof One their model of main their for engi- [ appli-main71, 72 appli-], powercationspower makes is makesdeterministic themneering themappealing. artificial appealing. function systems One [66 computationof One, 67 their]. of Moreover, main their [ appli-main71 CRN, 72 chem- appli-], cationswhichprocessingcations is isprocessing deterministicwhat [65 is] ourdeterministic and [65 RNGs as] and a function programming as need. a function programming computation model computation modelfor engi- [71 for, 72 engi- [71], , 72], which is whatical our implementations RNGs need. have been studied in detail [69, 70]. cationscations is deterministicneeringConsider is deterministicneering artificial four artificial function systems types function systems [ of66 computation, particles67]. [66 computation Moreover,, 67]. 0, Moreover, [71 1, CRN,A72, [ chem-71],and CRN, 72B chem-], and whichCRNswhich is what are is our what also RNGs e ourciently RNGs need. Turing-universal need. [68], which whichConsiderwhich is whatical is four our what implementationsical RNGs types our implementations RNGs need.of particles have need. been have 0, studied been 1, A studied in, and detail inB [69 detailand, 70]. [69, 70]. aConsider machinepowerConsider makes four consisting types them four appealing. types of of particles a of box Oneparticles that 0, of their1, canA 0,, main contain 1, andA appli-,B and them.andB and aConsider machineConsiderCRNs four consisting typesCRNs arefour also types of are of e particles aalsociently of box e particlesciently that Turing-universal 0, 1, can Turing-universalA 0,, contain 1, andA,B [68 and],and them. whichB [68],and which aParticles machinecationsa machine consisting0 is and deterministic consisting 1 can of be a function box ofinputs a that box computation or can that outputs contain can [71 to contain, 72 the them.], ma- them. aParticles machinea machine consisting0power and consistingmakespower 1 can of themmakes be a box ofinputsappealing. them a that box appealing. or can thatOne outputs contain of can One their to contain of main the them. their appli-ma- main them. appli- Particleschine.cationswhichParticles “Machine” 0cations is is and deterministic what 0 1is and our can deterministic particles RNGs 1 be can function inputs need. beA function inputs computationand or outputsB computation oralways outputs [71 to, stay72 the], [71 to in,ma-72 the the], ma- Particleschine.Particles “Machine” 0 and 0Consider 1 and can particles 1 be four can inputs types beA inputsand orof particles outputsB oralways outputs 0, to 1, stay theA, to and inma- the theB and ma- FIG. 3. Chemical Reaction Network (CRN) implementation chine.machine’swhichchine. “Machine” iswhich what box “Machine” is our whatand particles RNGs areour particles need.RNGs inA contactand need.ABand withalwaysB aalways thermal stay in stay reser- theFIG. in the 3. Chemical Reaction Network (CRN) implementation chine.machine’schine. “Machine” box “Machine”a machine and particles are consisting particles inA contactand ofAB aand with boxalwaysB that aalways thermal stay can contain in stay reser- the in them. the FIG.of an 3.FIG. RNG Chemical 3. machine Chemical Reaction consisting Reaction Network of Network a (CRN) box and (CRN) implementation a particle implementation in it. machine’svoir.Considermachine’s Figure boxConsider four3 andshows box types arefour and that of in types are particles contact the of in left particles contact 0, with wall 1, A 0, is,a with and thermal1,designedAB, a andand thermal reser-B soand thatof anreser- RNG machine consisting of a box and a particle in it. voir. FigureParticles3 shows 0 that and the1 can left be wall inputs is ordesigned outputs so to that the ma-FIG. 3.FIG. ChemicalofThe an 3. ChemicalRNGof left an Reaction wall machine RNG actsReaction machine Network as consisting a membraneNetwork consisting (CRN) of a (CRN) implementation box filter of aand suchimplementation box a andparticlethat a only particle in input it. in it. machine’smachine’svoir.only boxa Figure machinevoir. inputand boxa Figure machineare3 andparticles consistingshows in are contact3 consistingshows that in (0 of contact anda the withthat box of 1)left thata the a can with box wall thermal canleft enter, that a is containwall thermal designed can reser-but is contain designedthem. no reser- so particles thatthem.The so that left wall acts as a membrane filter such that only input Particleschine.Particles “Machine”0 and 1 0 can and particles be 1 can inputs beA and inputsor outputsB always or outputs to stay the toma- in theof ma- an RNGofThe anparticles, machine RNG leftThe wall machine left 0consisting andacts wall 1, consistingas acts can a of membrane enter, as a box a of membrane but aand box filterno a particles andparticle such filter a particle that suchin can it. only exit that in through input it. only input voir.only Figurevoir. input Figure3 particlesshows3 shows that (0 and the that 1)left the can wall left enter, is wall designed but is designed no so particles that so that10 particles,FIG. 3. 0Chemical and 1, canReaction enter, Network but no (CRN) particles implementation can exit through onlycanchine. inputmachine’sonly exit.chine. “Machine” inputparticles The box “Machine” right particles and particles (0 wall are and particles in (0 isA 1) contact and designedand canA 1)B enter, withandalways can soB a enter, but thermalalways that stay no in onlybut stayparticles thereser- no output inThe particles the leftThe wallparticles, leftthe acts wall. wallparticles, as 0 acts andThe a membrane 1,as right0 and can a membrane wall 1,enter, can filter is also butenter, such filter a no membrane but particles that such no only particles that can designed input only exit can input through such exit that through onlycan inputonly exit. inputparticles The right particles (0 wall and (0 is 1) and designed can 1) enter, can so enter, but that no onlybut particles no output particles10 particles,theFIG.10of wall.particles, an 3.FIG. 0RNGChemicalonly andThe 3. machine 1, right0 outputChemical and canReaction wall 1,enter, consisting particles, canReaction is Network also butenter, of a no Network 0 amembrane (CRN) but and boxparticles no and 1, (CRN)implementation particlescan a can particledesigned exit. implementation exit can Atin through such it. theexit beginningthat through the KBTpln(p) 0 heat from heatcan reservoirparticlesmachine’s exit.voir.can andmachine’s The turnexit. Figure (0 box it and rightto The3 andshows 1)1 box/f wall areright cannoise and that in [exit. is88 wallcontact are]? designedthe Nominally, in To left is contact with designed get wall these so started, a is with distributionsthermal thatdesigned so a only thermal thatassume reser- so are thatoutput associ- only reser- there outputthe wall.the The wall. right The wall right is wall also isa membrane also a membrane designed designed such that such that | | ⇡ The left wall acts as a membrane filter such that only input work for us. Becausecan theparticles work exit.can is really The exit. (0 small, andonly right The this input1) process wall right can particles exit.is wallated designed with To (0is andinfinitedesigned get so 1) started, memory can that soenter, processes only thatassume but output [ only89 no]. there What particles output are thetheonlyof wall. anthe output RNGofonly The wall.only an machine right outputRNG particles,only The particle wallmachine rightconsisting output particles, is 0 in wall also and consisting particles, the of isa 1, 0 amembrane also box canand box of a exit.and is1, 0 amembrane and box can“machine a At particledesigned and 1,exit. the can a particledesigned beginningAtin exit.particle” such it. the Atin beginningthat such it. the theA,whichis beginningthat the the KBTpln(Kp)BTp0ln( heatp) from0 heat heat from reservoirparticlesis heatvoir. only andreservoirparticles turn Figure (0 avoir. andit single and to turn Figure3 (0shows 1)1 it/f and machine to cannoise3 thatshows 1)1 [exit./f88]? cannoise the Nominally, particlethat To left[ exit.88]? the get wall Nominally, these To left started,A is distributions get designed wallin these the started, is distributions assumedesigned box. so are that associ- assume Every sothere are that associ-particles,⌧ there 0 and 1, can enter, but no particles can exit through | is neutral| | ⇡ process| meaning⇡ there is no energy transfer associated thermodynamic cost bounds? Suggestively, The leftThe wall left acts wall as actsa membrane as a membrane filter such filter that such only that input only input work for us.work Because for us. theparticles Because work istheparticles really work (0 small, is and reallycan this (0 1) small, processexit. and can this 1)The exit. processated can right with To exit.ated wall infinite get with To is started, memory designed infinite get started, processes memory assume so that processes [89 assume].there only What [89 outputare]. there What the only areonly the outputonly particleonlyconfined output particles, particleonly in particles, the to particle 0 in boxstay and the is1, 0 in in andbox can“machine the the 1,exit. is box. box can “machine At exit.particle” is Every the “machine At beginning particle”⌧ theAseconds,whichis beginning particle” theA a,whichis new theA,whichis input between reservoir andis machine. onlyis aNow only singleonly consideris a input only singleonly machine the particles casea input single machineit particleparticles was (0 machine recentlyand particle (01)A shown andcanin particle enter,the1) thatA canin infinite-memory box. but enter, theA noin Every box. but particles the devices no Every box.⌧ particles can Everyparticles,⌧ the wall.particles,⌧ 0 and The 1, right 0 can and wall enter, 1, can is alsobut enter, noa membrane particles but no particles can designed exit can through such exit that through is neutralis process neutral meaning process meaningthere is noseconds there energy is no transfer energya new transferassociated inputassociated particle, thermodynamic thermodynamic 0 or cost 1, bounds? enters cost7 bounds? Suggestively, from Suggestively, theonlyconfined left. particleonly to particle in stay the in in box the the is box. box “machine is Every “machine particle”⌧ seconds particle”A a,whichis newA,whichis input when we use RNGis method onlyis (von a only single Neumanncan aparticles exit.single machine algorithm).can The exit.(0 machine and right particle Theachieve 1) wall can right particle thermodynamic is exit.A walldesignedin To theisA get designedin bounds sobox. started, the that [90 Every sobox.]. only Third,assume that output Every the⌧ only there random output⌧ theonly wall. outputtheconfined Theparticle wall. right particles,confined The to wall enters right stay is 0 also toand wall inthe astay 1, is membrane the alsocanbox in box. exit. a from membrane the designedAt Every box. the the beginning designed left such Every⌧ seconds and, that such⌧ the dependingseconds that a new a input on new the input between reservoirbetween reservoirand machine.seconds and machine. Nowseconds a consider new Nowseconds a inputthe consider new case a theparticle,input newit case was inputparticle, recentlyit 0 was or shownparticle, recently 1, 0 enters that or shown 1,infinite-memory 0 enters that from or 1,infinite-memory7 theenters from devices left.7 the from can devices left. the can left. In this case to run a machine andNow, generate the one symbol, particlesnumber react generation in the strategies following considered way: here are notconfined se- particleonlyconfinedparticle particleto entersreaction stayparticle to in intheenters stay the between the box box inenters thebox. from isthe box the“machine Every thebox. the input from box left Every⌧ particle” theparticle fromseconds and, left⌧ thedependingAseconds andand,,whichis a left new the depending and, a machineinput on new depending the input on particle, the on the whenInput we Outputwhen use RNG we use methodseconds RNG (von methodseconds Neumanna new (vonparticlesis Neumanna inputalgorithm). onlynewparticles (0 a inputparticle,algorithm). and singleachieve 1)(0 particle, machine canand 0thermodynamicachieve exit. 1)or can 1, particle 0thermodynamicTo enters exit. or get bounds1, started,ToA enters fromin get [90 bounds the]. started, Third,assume the from box. [90 left.]. the Third,assumethe Every there random left. the⌧ there randomonly outputonly particles,output particles, 0 and 1, 0 can and exit. 1, can At exit. the beginning At the beginning the the on average work reservoirNow, needs theNow, to work particles theNow,on the particles machine. the react particlescure. in react the However, in react following the cryptographically in following the way: following secure way: random way: numberparticlereactiononlyconfinedparticle particle entersonly between to particle in enters staythe the boxin the box in the the input from is box. box “machine box theparticlefrom Every is “machine left particle” the⌧ andand,seconds left particle” the dependingA and,,whichis a machine new dependingA input,whichis on particle, the on the InputIn this00 Output caseInputIn thisto Output run case a machine to run a and machine generate andis onlyseconds one generate symbol,is a onlysingle one a new symbol, anumber machine single input generationnumber machineparticle, particle generation strategies0 particleA orin 1, strategies considered the entersA box.in consideredfromhere the Every are box. the not here left.⌧ se-Every are not⌧ se- reactionan outputreaction between particle between the inputmay the or particle input may particle not and be the generatedand machine the machine that particle, exists particle, This thermodynamicNow, cost can theNow, be infinitely particles the large particles dependsreact in reactgenerators the in following the have beenfollowing developed way: way: [91]. What are the ad- confinedparticleconfined to enters stay to in the stay the box box. in from the Every thebox. left⌧ Everyseconds and,⌧ dependingseconds a new input a on new the input on average10on work 0 average reservoir work needsreservoir to work needsseconds on to theNow, work machine.seconds on athe the newparticles machine. inputcure. a new However, particle,0+ react inputcure.A However, cryptographically in particle, 0 the orA followingcryptographically 1,+0 enters0 or secure 1,, fromway: enters random secure the from number left.random thereaction number left.an outputreaction betweenanthrough output particle betweenan the output particlethe inputmay theright particleor particle input may may wall. or particle not mayAdd and may be theor anot generatedand may temperaturemachine be the not generated machine be that particle, generated exists reservoir:particle, that exists that existsa This00on thermodynamic howThis00 small thermodynamic the p costis. can This be cost example infinitely can be simply infinitelylarge depends tell large us how dependsgeneratorsditionalgenerators have thermodynamic been have developed been costs developed [91 of]. adding What [91 areon]. What security the ad- are to theparticle ad-reactionparticle enters between the enters box the the inputfrom box the particle from left the and, and left the depending and, machine depending on particle, the on the 11 1 ) an outputanthrough output particlethrough particle the may right theor may may wall. right or notAdd may wall. be not a generatedAdd temperature be a generated temperature that exists thatreservoir: exists reservoir: a a on10much how small 0on10 di↵ howerent the small 0p theis. random Thisthe p exampleis. number This simply example generationNow, tell simply the usNow, methods how particles tell0+ the usditionalA how particlesRNGs?0+ react thermodynamicditionalAA Finally, in0+ react+0 the thermodynamicA thereA following, in+0 costs the is aA followingsubstantialof, adding+0 way: costs of, on quantum adding way: security on advan- to securitythroughreactionan to outputreaction between therectangle particle right between the wall.input may underneath the or particleAdd input may notparticle a and temperature be the the generated and machine box, the machine with particle,that reservoir: exists “temperature particle, a T” 11could 1 be11 and each 1 of them can be useful depends on the tage)1+ whenA compressing) B,) classical random processes [76]. TABLE II. Immediatemuch random di↵mucherent number thedi↵ generation:erent random the number random The most generation number generation methods0+ methodsARNGs?0+AA0+ Finally,RNGs?+0A thereA Finally,,)+0A is a+0 there substantial, , is a substantial quantum quantum advan- through advan-rectangleanthrough outputthrough theanrectanglelabel. particleoutput the right underneath the rightrectangle particlewall.Put may right wall. underneath or theAdd wall.Add may may underneath the label or not aAdd box,temperaturemay temperature be the0 generated not aon with temperature box, be the the generated “temperature withreservoir: that particle box, reservoir: exists “temperature with that reservoir: entering a exists “temperature a T” on a T” the T” couldresources. be andcould each be andof them each can of them be useful can be depends useful on depends the1+ ontageA the)What1+ whentage areA compressingB,) the1+ when thermodynamic)A compressingB, classicalB,consequences random classical processes random of using processes [76]. such [76]. TABLEecient II.TABLE Immediate map from II. Immediate random inputs to number random output generation: number to transform generation: The fair most coin The most 0+0+A B0+) AA+0A),+0A +0, , rectanglelabel.throughrectanglerectanglethroughlabel.Put theleft. underneath right underneath thelabel. thePut Put underneath wall. label right the thePutAdd wall. the0 the label labelon atheAdd box,temperature the the0 labelA a withon withon particle box,temperature the0 “temperaturetheon “temperature with reservoir: particle particle the entering “temperature reservoir: particle entering a T” in on the T” entering a the box. on T” the on the Finally, let’s1 close with a list of very brief questions1+A1+quantum)AB,1+ representationsA B,B, for RNGs? ecientinputs mape tocient biased from map inputscoin fromresources. outputs to inputs output withresources. to to bias output transform/4. to transform fair coin fair coin What0+ areWhat theB thermodynamic0+) are)) theBA thermodynamic)+0 consequencesA, +0 consequences, of using such of usinglabel.rectangle suchlabel.label.Putrectangleleft.Put underneath thePut Put theleft. label underneath the label the Put0 label the0 labelonon the box, the0 theA labelon with particleon particle box, the “temperaturetheA withon particle entering particle entering “temperaturethe particle entering on T” in the on the the T” in box. on the the box. Finally,that allude let’sFinally,1 to close several let’s with1 close future a list withof directions very a list brief of in very questions the brief thermo-0+ questionsquantumB)1+Temperaturequantum representationsAA)1++0 representationsTB,A, for RNGs?B, for RNGs? left. Put the label A on the particle in the box. inputs toinputs biased to coin biased outputs coin with outputs bias with/4. bias /4. 1+0+B) AA)+0+1, . label.left.Putlabel. Put the thePut label label the0A labelonon the the0 on particle particle the particle entering in the entering box. on the on the thatdynamics alludethat to of allude several random to future several number directions future generation. directions in theGiven thermo- in0+ the that thermo-B0+Temperature)BATemperature+0)T A, +0)T , left. Putleft. the Put label theA labelon theA on particle the particle in the in box. the box. 1+0+B1+0+))ABA+0+1,AA.+0+1, . left. Putleft. the Put label theA labelon theA on particle the particle in the inbox. the box. dynamicsrandomdynamics of number random of generation number random generation. numberis such a generation. critical Given and that Given vital1+B that) 1+A)B)+1.A)+1. ) )ACKNOWLEDGMENTS) ministically to the output.randomtask Forin numberrandom modern example, generation number computing, in our generation is problem such following aThe is critical such up time on a and critical this vital1+ period could andB vital1+ of1+B eachAB+11+ chemicalA).AB+1+1..A reaction+1. is also ⌧.With digital computers generate random numbers using purely ministicallyif inputministically 0 to is the read, output. to thetask thebe output Forin output. quite moderntask example, wouldimportant. Forin computing, modern example, be in our 0. First, computing, However, problem followingin is our Szilard’s problem if following up Engine on this up a could TRNG? on this could) )ACKNOWLEDGMENTS) ACKNOWLEDGMENTS) The timeFIG.The 3. periodtime time Chemical period period of Reaction each of each of Network chemical each chemical (CRN) chemical reaction reaction implementation reaction is is also also⌧.With is⌧.With also ⌧.Withdigitaldigital computersdigital computers generate computers generate random generate numbers random random using numbers purely numbers using purelyusing purely if input1 is read, 0if is input read, the 0 machine the is read,be output quiteWhat should the wouldimportant.be are output quite waitthe be thermodynamic wouldimportant. for 0. First, the However,The be is next 0. Szilard’s First, time However, inputcosts? if isthis EngineSzilard’s periodFIG.Oneof if assumption recent 3. an a Engine TRNG?Chemical RNGFIG. of analysis each3. machine a TRNG?Chemical Reaction chemical it consistingWe is ReactionNetwork thank not ofAreaction(CRN) aNetwork hard box Aghamohammadi, and implementation to (CRN) a particle showis alsoimplementation in M that⌧ it. Anvari,.With if theA Boyd, distri-digital computersdeterministic generate arithmetical random numbers methods. using This purely is pseudoran- The timeThe period timeThe periodtime ofThe each period time of chemical each ofperiodWeM each Khorrami, thank chemical ofWe chemical Areaction each thank Aghamohammadi, John chemical Mahoney,Areaction reaction Aghamohammadi, is also andreaction is M is⌧ P also Anvari,.With alsoRiechers⌧ is M.With⌧ Aalso Anvari,.With for Boyd, helpful⌧.Withdigital A Boyd,digitaldeterministicdigital computers computersdigitaldeterministic computers computersdeterministic arithmetical generate generate arithmetical generate generate random random methods. arithmetical random numbers random numbers methods. This numbers using is numbers methods.pseudoran- using purely using This purelyusing purely is Thispseudoran- purely is pseudoran- 1 isand read, then1 theis generate read, machine the anWhat machine should output.appears areWhat waitthe should to How thermodynamic have forare to waitthe provided implement thermodynamic next for the inputcosts? the thesenextthis answers One inputcosts?of assumption recent anThethis [87 RNG One] left andanalysisof assumption assumption recent machinewallan antici- RNG acts analysis it consisting machineas isa it membrane not is it consisting of not is a hard box hardnotfilter and of suchto a hard to a box particle show thatshow and to only1 a in that particlethat show it.input1 if ifin the that it. the distri- ifdistri- thedeterministic distri-dom number arithmetical generation methods.(PRNG). This Can is pseudoran- these methods be appears toappears have provided to havethis provided the answers assumption thebutionThethis [ answers87particles,] left and assumption ofwallThethis [antici-87 it input]0 acts left andis assumption wallasnot 1, antici-M acan particlesit actsmembraneKhorrami,discussions. hard enter, is as notM a but it membraneKhorrami, to hardfilterJohn is no 0JPC shownot particles andsuch Mahoney, to thanks hardfilterJohn thatshow 1 that can suchisMahoney, onlytheand to exit that Santathat show ifPinput through/2 Riechers the, ifonly and Fe/ thatthe2 InstitutePinputdistri- forRiechersthen distri- if helpful the for the for distri- its helpfuldeterministic dis- deterministic arithmetical arithmetical methods. methods. This is Thispseudoran- is pseudoran- anddelays? thenand generate Let’s then explore generate an output. apate chemical an TRNG’s output.How toimplementation win-win implement How to property. implement these of the Second, thesebution the sources of input of particles 0 and 1 is 1/2, 11/2 1then1 the dis-deterministicdomdeterministicdom number number arithmeticaldom generation number arithmetical generation(PRNG). methods. generation(PRNG). methods. Can these This(PRNG). methods Can is Thispseudoran- these Can is bepseudoran- methodsthese methods be be this assumptionthisbution assumptionparticles,thebution of wall. itparticles, input0 andis The ofnot 1, right it can inputparticles0discussions. andis hard enter,wall not 1, is can butparticles alsodiscussions. to hard enter, no a 0JPC showmembrane particles and but thanks to1 no 0JPC 1 that showcan particles designedand is the1 exit thanks Santa if through/1{ that can2 such the, is the exitFe/ that2 Santa ifInstitutedistri- through/2then the}, Fe/2 Institutedistri-for thethen its dis- for the its dis- delays? Let’sdelays? explore Let’s apate explore chemicalrandomness TRNG’s apate chemical implementation win-winTRNG’s and target implementation property. win-winbution distributions of the property. Second, oftribution of inputbution we the the Second, considered sources particlesbution of the ofinput of were sources output of particles inputhospitality 0of and particles particles 0 1 during and is 1 visits 0/ 0is2 and, and as1//{22 1an, 1 is 1External/then2 would1/then}2, the1 Faculty/2 the bethen dis- dis- member.3 the/4, dis-1domdom/4 , number numberdomimplemented number generation generation generation(PRNG). by(PRNG). finite-state(PRNG). Can these Can Can machines? methods these methods methods be Most be be certainly. algorithm. theonlytribution wall. output Thethe wall. right particles, of The wall outputhospitality right is 0 also and wall ahospitalityparticles1, membraneduring is can also exit.1 visits a membraneduring designed0 At1 asand the an1 visits{ beginning External1 such designed would1 as that an{ FacultyExternal the such} be that3 member./ Faculty4}3, 1/4 1 member., 3 implemented1 by finite-state machines? Most certainly. algorithm.algorithm. randomnessratherrandomness limited and target whenbution and distributionscompared targetbutionof distributionstribution toinput thewe considered very oftribution wideinputparticles weof considered range were output particles of of 0 were outputandThis particles material 01 andis particles is/ 1{ 02 based, is and/{2 upon/ 02 1then}, and wouldwork/{2} the supported1then would be dis-} the by,/{ be4 dis- or, dom in/4 /,4}, numberdom/4implemented, number generationimplemented generation by(PRNG). finite-state by(PRNG). finite-state Can machines? these Can machines? methodsthese Most methods be Most certainly. be certainly. Chemical reaction networks (CRNs) [64, 65]havebeen onlytributiononly output particleonlytribution particles, of output output in the particles,0 of and box output particles 1, is can 0 “machine and exit. particles 1, 0 Atcan and particle” the exit. beginning 1 0 At would andA the,whichis beginning 1the be would3 3/{ the14, be1/4 3,/}4, 1/implementedimplemented4 , implemented by by finite-state finite-state by finite-state machines? machines? machines? Most certainly. Most Most certainly. certainly. ratherstochastic limitedrather processes when limited compared whenusedtribution in compared to contemporary therespectively. very of to wide therespectively. output science.very range wide of Forparticles range Thus,This Thus,part of material by,this thisThis 0 the and ismaterialCRN John{ based 1 Templeton gives would uponisgives{ based} awork Foundation physical a upon be physicalsupported} work/4 andimplemen-, supported by,/ U.{4 implemen- orS., Army in by,{} orThe in } e↵Theective e memory↵ective in memory these machines in these is very machines large, is very large, ChemicalChemical reaction networksreaction networks(CRNs) [(CRNs)64, 65]havebeen [64, 65]havebeenonlyconfined particleonly to in particle stay the in box inthe is the box. “machine box Every is “machine particle”⌧ seconds particle”A,whichis a newA3 input,whichis{1 3 {}1 implemented} implementedThe e byThe↵ective finite-state e by↵ective memory finite-state memory machines? in thesemachines? inthese Mostmachines Mostmachinescertainly. is certainly. very is large, very large, widely considered asstochastic substratesstochastic processes for physical processes usedtribution ininformation contemporary usedtributionrespectively. in of contemporaryrespectively. outputrespectively. science. ofrespectively. output Forparticles science. Thus, Thus,part ForparticlesResearch Thus, by,this this Thus,0 thepart and CRN John Laboratory by,this this 0 1 Templetonthe givesand would givesCRN John and 1 aTempleton theFoundationgives physicalwould a gives be U. physical S. a ArmyFoundation physical/{ a be4 andimplemen-, physical Research/ U.4 implemen- S./,4 andimplemen-} Army, O/ U.4 implemen-ce S., ArmyThe e↵Theective e↵ memoryective memory in these in machines these machines is very is large, very large, widely consideredwidely considered as substratesexample, as substrates for what physical about for physicalinformation the thermodynamics informationconfinedparticletation of generating toconfined enters stay of our inthe to the original stay box box. in from the Every RNG.the box. left⌧ seconds Every and, depending⌧ aseconds new input on a new the input Thewith e↵ theectivewith algorithms the memory algorithms typically in these allowing typically machines the allowinguser is to veryspec- the large, user to spec- processing [66] andexample, as a programmingexample, what about whatthe model aboutrespectively. thermodynamics for the engi- thermodynamicstation of Thus,generating of our of generating this originalResearch CRNResearch Laboratory givesRNG. Laboratory a and physical the U. and S. Armythe{ implemen- U. Research S. Army{} ResearchOce } Oce respectively.respectively.tationparticletation Thus,reactiontation of enters ourofparticletation Thus,betweenthis our of the original enters original ourof boxCRN thethis ourunder from inputthe original original givesboxRNG. CRN RNG.the contractsparticle fromleft aand, givesRNG. andRNG.the physicalW911NF-13-1-0390 dependingleftthe machinea and, physical depending on implemen- particle, the and on implemen- W911NF-13-1- the Thewith eThe↵ theectivewith algorithms e↵ theective the memorywith algorithms algorithms the typically memory in algorithms typically these allowing typicallyin these machines allowing typically the machines allowinguser the is to allowinguser veryspec- the is to large, veryspec- user the large, to user spec- to spec- processingneeringprocessing artificial [66] and systems [ as66] a and programming [67 as, 68 a]. programming Moreover, model forCRN model engi- chem- for engi- Energy conservationunder contractsunder means contracts W911NF-13-1-0390 that W911NF-13-1-0390 at least and W911NF-13-1- one and of W911NF-13-1- the ify the amount of state information used [? ]. Indeed, tation of ourreactionEnergyan original output betweenreaction particleconservation the between RNG. input may0340. the particleor may input notand means particlebe the generated machine and that the particle, machinethat at exists least particle, one ofwith the theify algorithms the amount typically of state allowing information the user used to spec-[? ]. Indeed, neering artificialneering artificialsystems [ systems67, 68]. Moreover,[67, 68]. Moreover, CRN chem- CRN chem-EnergyEnergy conservation0340. conservation0340. means means that at that least at one least of one the ofwith theify thewithify amount algorithms the theify amount algorithms ofamount the state typically ofamount information state of typically state information allowing of informationstate used allowing [? informationthe used]. user Indeed, [? the used]. to user Indeed, spec-[ used? to]. spec-[ Indeed,? ]. Indeed, ical implementations have been studied intation detail [tation69 of, 70 our].Energyan of originalthrough outputfour ourEnergyan reactions particleconservation the original output rightRNG. may particleconservation wall. must or RNG. may may be not means or heat be may generated dissipative not means that be generated that at that exists or least energythat at exists one least absorb- of one the ofthey the encourage the use of large amounts of state infor- icalCRNs implementationsical are implementations also e haveciently been have Turing-universal studied been in studied detail [ in[7169Energy], detail, 70 which]. [69four,throughfour70 conservation]. reactions reactions thethroughfour right reactions wall.the must must right means be wall. must be heat heat that be dissipative heat dissipative at dissipative least or energy one or or energy of energy absorb- the absorb- absorb-ifythey the encouragethey amountthey encourage encourage the of use state the of largeuse theinformation of amounts use large of amounts large of used state amounts of [ infor-? state]. Indeed, infor- of state infor- CRNs areCRNs also are eciently also e Turing-universalciently Turing-universal [71Energy], which [71fourEnergy], conservation which reactionsing.four conservationSay reactions why.must meansAs be must means a heat that result, be dissipative at heat that this least dissipativeRNG at one least oralgorithm energy of one theor energy imple-of absorb- theify absorb- themation.ify amountthey the They amountencouragethey of promise state encourage of better theinformation state use quality theinformation of use randomlarge used of amountslargenumbers [ used? ]. amounts [ Indeed,? in of]. state Indeed, of infor-state infor- power makes them appealing. One of their main appli- ing. Saying. why.SayAs why. a result,As a result,this RNG this algorithm RNG algorithm imple- imple-mation.mation. Theymation. promise They They promise better promise quality better quality better random random quality numbers numbers random in in numbers in power makespower them makes appealing. them appealing. One of their One main offour their appli- reactions maining. appli-Say must why. be heatAs dissipativea result, this or energyRNG algorithm absorb- imple-theythe encourage sense that the the recurrence use of large time (generator amounts period) of state is infor- cations is deterministic function computationfour reactions [72four, ing.73], reactionsSaymentationing. must why.Say be must heat requires why.As be dissipativea heat result,As energy. dissipativea result, this Using or energyRNG thisEq. or (3 energyRNGalgorithm) absorb- we can algorithm absorb- put imple- atheythe imple- encourage sensetheythemation. that encourage sensemation. the the that They recurrence use the the promiseThey recurrenceof use large time promise of better (generator amounts large time better quality (generator amounts period) of quality state random period) of is infor- state random numbers is infor- numbers in in cations iscations deterministic is deterministic function[1] W. G. function computation Cochran. computationSampling [72, 73 techniques], [72,mentation73bound.JohnWiley&], mentation over requires all possible[11] requires energy.C. reaction D. Motchenbacher energy. energetics. Using Eq. Using and That F. (3 C.) is, Eq. Fitchen. we if we ( can3)Low-Noise we put can a Elec- put a the sense that the recurrence time (generator period) is which is what our RNGs[1] W. need. G.[1] Cochran.W. G. Cochran.Samplinging.Sampling techniquesSaymentationbound.JohnWiley& techniques why.lower overbound.JohnWiley&As bound all requires possible a over result,[11] on allC.reaction thepossible D. energy.[11] Motchenbacher averagethisC. energetics.reaction D. RNG Motchenbacher heat Using energetics. and That algorithm dissipation F. C. is, Eq. Fitchen.and if That we F. ( C.3 is,Low-Noise) Fitchen. per imple- if we we output:Low-Noise can Elec- putmation. Elec-astronomically a They promise large. Our better concern, quality though, random is not analyz- numbers in Sons, 2007. 1 ing. Saying.mentation why.Sayfindmentation why.As any requires four a result,As particles requires a result,energy. (molecules)thistronic Design RNG energy. this Using obeying. John RNGalgorithm Wiley Using the Eq. &four algorithm Sons, (3 reac- Eq.)imple- New we York,(3 can) imple- 1973. we putmation.1 canastronomically a putmation.astronomicallythe They a sensethe promiseThey large. sensethat promiseOur large. better the concern,that Our recurrence better thequality concern, though, recurrence quality random though, istime not random analyz- (generator istime numbers not analyz- (generator numbers in period) in period) is is which iswhich what our is what RNGs our need.Sons, RNGs 2007. need.Sons,1 2007. 1 findlower any bound fourfindlower anyparticles onbound four the particles (molecules)tronic onaverage Design the (molecules)tronic average obeying. heat John Design Wiley dissipation the obeying. heat John &four Sons, Wiley dissipation reac- the New &four per Sons, York, reac- output: New 1973. per York,1 output: 1973. 1 Consider four types of[2] particlesB. Jerry. Discrete-event 0, 1, A, and systemB and simulationlowertionsQ . bound Pearson above0. Ed-then478 onk the the[12]T bound.B. This average Mende, holds. is L. a Naturally, C.universal heat Noll, dissipation and depending lower S. Sisodiya. bound per SGI over classic output:theing sense theirastronomically that implementations. the recurrence large. See Ref. Ourtime [10 concern, (generator] for a discussion though, period) is not is analyz- ConsiderConsider four types four of[2] types particlesB. Jerry. of[2] Discrete-event particlesB. 0, Jerry. 1, A,Discrete-event 0, andmentation 1, systemBA,and and simulationlower systemBtionsQand requires simulation.lower bound PearsonLB abovetionsQ0.. Ed-then478 bound Pearson on above energy.k the the0[12]T. Ed-thenB478 bound. onB. This averagek Mende,the theUsing[12] holds.T bound. isB. This averageL. a Mende, Naturally, C.universal heat holds.Eq.Noll, is L. a ( Naturally, dissipation and3 C.universal heat depending) Noll, lower S.we Sisodiya. dissipation and can depending bound lower S. Sisodiya.put per SGI boundover classic aoutput: per SGIthe over classicing output: sense theirtheingastronomically sensethat implementations. theirastronomically the that implementations. recurrence the large. recurrence See Ref. large. Ourtime See [10 concern, Ref. (generator] Ourtime for [10 a concern, discussion(generator] for though, a period) discussion though, is period) not is isanalyz- not is analyz- a machine consisting of aucation box that India, can 1984.mentation contain1 mentation them. requiresLBon the requires⇡ reactions’LB energy.B energetics, energy. UsinglavarandB the Using™ Eq..USPatent#5,732,138,1996. CRN-RNG’s (3 Eq.) we (Q3 can)can we be put can1 a put a a machinea machine consisting consisting of aucation box of that India, aucation box can 1984. that contain India,1 can 1984. them. contain1 them.all⇡ possible⇡ reactionlavarand kinetics™.USPatent#5,732,138,1996.lavarand and™.USPatent#5,732,138,1996. over what 0,1 1, A, and1 B of designing methods. their implementations. We can simply assume See they Ref. can [10 be] for a discussion Particles 0 and 1 can be[3] inputsD. R. Stinson. or outputsCryptography:lower to the bound ma- theoryQonallLB and the possible on practice reactions’onall0 the.478 the.CRC possible reaction averagereactions’k energetics,BT[13] reaction. kineticsM. energetics, This heat the G. Kendall CRN-RNG’s kinetics is dissipationand the a and over universalCRN-RNG’s B. and B. what Smith.Q overcan per Randomness0, bewhat lowerQ 1, output:canA, 0, beand andbound 1, AB random, andastronomically overofB designofing designmethods. theiring methods. large. implementations. their We can Ourimplementations. We simply concern, can simply assume See though, assume they Ref. See can [is they10 Ref. not be] for can [analyz-10 a be] discussion for a discussion ParticlesParticles 0 and 1 0can and be[3] 1 inputsD. can R. be Stinson.[3] or inputsD. outputs R.: Stinson. orlower tooutputs theCryptography:lower bound ma- theory toQ theLB and bound ma- theory on practicecloseQLB0 the and. to478.CRC on orpractice averagek far0 theB.478T from.CRC[13]. averagekM. theThisB heat G.T bound.[13] Kendall. isM. Thisdissipation heatG. a Sinceand Kendall universal B. is dissipation CRNs B. a Smith. and universal B.are per RandomnessB. Turing- lower Smith. output: per Randomness lower andbound output: random andboundastronomically over random astronomically over large. large. Our concern, Our concern, though, though, is not isanalyz- not analyz- press, 2005. 1 closeare.⇡ to orcloseMeaning far to from or farthe what? from bound.sampling the Since bound. Their numbers. CRNs Since materialJ. are Roy. CRNs Turing- Stat. are or Soc. Turing- molecular,101(1):147–166,3 implemented or, at least, there exist ones that have been, chine. “Machine” particlespress,A 2005.andpress,1B always 2005. 1 stay in theallare. possible⇡universalMeaningare.⇡ [71 reactionMeaning] they what? cansampling implement kinetics what? Their numbers.sampling all Theirmaterial and numbers. ofJ. theRoy. over RNGs materialStat.J. Roy.or what Soc. stud- molecular,101(1):147–166, Stat. 0, or Soc. 1, molecular,101(1):147–166,A, andingimplementedB theirimplementedof implementations. design or, at least, or, methods. at there least, exist Seethere We ones Ref. exist can that [ ones10 simply have] thatfor been, a assumehave discussion been, they can be chine. “Machine”chine. “Machine” particles[4] particlesAR.and G.B Sargent.Aalwaysand VerificationB stayQalwaysLB in the andstayall0 validation inpossible.478universal theallkB of possibleTuniversal simulation [71. reaction] they This [71 can reaction] is they implement kinetics a1938. can universal1 implement kinetics all and of the overlower all RNGs and of the what stud-over bound RNGs 0, what stud- 1, overA 0,, and 1,ingAB, their andingBof implementations. their designof implementations. design methods. methods. See We Ref. can See We [10 simplyRef. can] for [10 simply a assume] discussion for a assume discussion they can they be can be machine’s box and are[4] inR. G.contact Sargent.[4] R. with G. Verification Sargent. aQ thermalLB Verification andQ reser-LB0 validation.478 andk0make validationB of.478T simulation.k This up?B ofT simulation. isDepending This a1938. universal is1 a1938. on universal1 the lower latter, lower bound the CRN-RNG’s bound over over suchsuch assuch as the the Unix as Unix the C-library Unix C-library C-libraryrandom()random()random()functionfunctionfunction just just cited. cited. just cited. machine’smachine’s box and box are inand contact aremodels. in with contact In Proceedings a thermal with a of⇡ reser-thermal the⇡ 37th⇡ reser- conferenceiedmakeied up up toiedmake up?on this to Winter up this point. toDepending point. up? this The[14] point. TheDependingD. details H. details The Lehmer.on of detailsthe designing of Mathematical designing on latter, of the designing CRNs latter,the CRNs methods for CRN-RNG’s CRNs for the in large-scale for CRN-RNG’s com- implemented or, at least, there exist ones that have been, voir. Figure 3 shows thatmodels. the left Inmodels. wallProceedings is In designedProceedingsall of the possible so 37thare. that of conferenceare. the 37thMeaning reactionare.Meaning conferencevariable on WinterMeaning onkinetics input Winterwhat?[14] what? chainD. H. what? and[14] Lehmer. as TheirD.X Theirover H. Mathematical Lehmer. Their= what materialX material MathematicalX methods 0, material 1, A inX ormethods large-scale,or and, molecular output molecularin orB large-scale com- molecularofof designcom- designofTheimplemented design methods. PRNG methods.implemented methods. setting We or, We forcescan at Wecan or,least, simply us can simply at to thereleast, simply forego assume assume exist there accessingassume they ones exist they canthat they onesa can be have canthat be been, be have been, voir. Figurevoir.3 Figureshows that3 shows the leftthatsimulation wall the is left,pages130–143 designed wallall is possible designed soall that1 possible so reactiona that givenQa givenQcan RNG reactionacan givenRNG beQ kinetics algorithm becan close RNGalgorithm close be kinetics to algorithm close andcan to orputing can beor far over to begleaned farunits. andcan from orgleanedn: from∞ bewhat farIn over from gleanedtheProc. from from then thebound. 0,2nd what thenbound. from 1,generalthe+1 Symp. generalA thebound. 0,, Sinceon and 1,general∞ Large-ScaleSinceA CRNsB, Since andCRNs Digital CRNsB The PRNGThe PRNG setting setting forces usforces to forego us to forego accessing accessing a a only input particles (0 andsimulation 1) cansimulation,pages130–143 enter, but,pages130–143 no1 particles1 0 puting0 units.puting0 In Proc. units.0 2nd In Proc. Symp. 2nd on··· Symp.Large-Scale on Large-Scale Digital Digital such as the Unix C-library random() function just cited. only inputonly particles input particles (0 and[5] J.[5] 1) Stoer(0J. can andStoer[5] andenter,J. 1) and StoerR. can Bulirsch. butR. andenter, Bulirsch. noare. R. particlesIntroduction Bulirsch. butIntroductionMeaning nomake particlesIntroductionmake toprocedures numerical toproceduresaremake numerical up?chain up?toTuring-universalprocedures what? numerical anal- given as anal-Dependingup? givenXDepending in0: anal-m Ref.given in TheirDependingCalculating= Ref. [72CalculatingX in].0 [72 Ref.X [68]. on1Calculating material Machinery] [ on72 they MachinerytheX]. m the on− canMachinery1, latter,, pages theand, latter, pages orimplement 141–146. exhaust, latter, 141–146.molecular pages the the Harvard 141–146. CRN-RNG’s chainHarvard all the CRN-RNG’s Univer- of Harvard as Univer-CRN-RNG’s theimplemented Univer-implementedsourcesourceimplementedsourcesuch of of randomness. randomness. or,assuch of or, the atrandomness. at least,or,as Unix least, the The at The thereleast, C-library Unix input there The input exist there C-library randomness input exist randomnessrandom() ones exist randomness ones thatrandom() ones reservoir that reservoir havefunction that have reservoir been, havefunction been, just been, cited. just cited. can exit. The right wall isysis designed, volume so 12 thatare. Springer onlyMeaning Scienceare. output &Meaning Businessare Turing-universal what? Media,00are 2013. Turing-universal what? Their00 00 sity [ Their68 material Press,]00 they··· Cambridge,[68 material can] they orimplement MA, can molecular 1951. orimplement1 molecular all of the all of the can exit.can The exit.Work right The wall right isysis designed wall, volume isysis designed so 12, thatThermal volume Springer only so 12 Science that Springer output only & BusinessScience output & Media, BusinessX0:n 2013.−m Media,= X 2013.0 Xsity1 Press,Xnsity Cambridge,−m Press,−1. Cambridge, MA, 1951. MA,1 1951. 1 is not randomThe at PRNG all. Rather, setting it is simply forces a pulse us to that forego in- accessing a particles (0 and 1) can exit.1 To get started,make assume there up?QRNGsQcanRNGscanDependingQ be studiedRNGscan be studied close close be studied up to close up[15] on to to or···B. to this theup or A. far tothis Wichmann to point.far orlatter, from point.this from far Theandpoint. the from The theI. the D. details bound.Hill. Thedetails CRN-RNG’s thebound. Algorithm detailsof bound. ofdesigning Since designing AS Since of 183: designing SinceCRNsAn CRNsef- suchis CRNsnot as randomis the notThe random Unix at PRNGThe all. C-library Rather, at PRNG all. setting Rather,random() it israndom() setting simply forcesrandom() it is simply a forces pulse usfunction to a that pulse us forego in- just to that forego cited. accessingin- accessing a a particlesparticles (0 andReservoir 1) (0 can and exit. 1)1 can To getexit. started,1 ToReservoir getmake assume started, theremake assume up? thereDepending up?TheDepending machine[15] onB. also the A.[15] Wichmann on interacts latter,B. the A. Wichmann and latter, with theI. D. Hill. anand CRN-RNG’s Algorithmenvironmentthe I. D. Hill. CRN-RNG’s Algorithm AS 183: con- An AS ef- 183:such An ef- assuch the as Unix the C-library Unix C-library functionfunction just cited. just cited. is only a single machine[6] E.[6] particle Alpaydin.E.[6] Alpaydin.E.A Alpaydin.inIntroduction theIntroduction box.Introduction Everyto machine to machine⌧ to learning machineCRNs learning.MIT learning.MIT for a.MIT givenficientficient RNG and andficient portable algorithm portable and pseudo-random portable pseudo-random can pseudo-random be number number gleaned generator. number generator. from generator.J. J. dicatesdicatesJ. dicates thatsource that an that an output output of an randomness. output should should shouldbe be generated. generated. be The generated. input Thus, Thus,h randomnessThus,=0h =0h =0 reservoir is only ais single only machine a single machine particle A particlein theA box.in the Every box.are⌧ EveryareCRNs Turing-universalare⌧ Turing-universal forCRNs Turing-universalIII. a given PSEUDORANDOM for a RNG given [68 [68] algorithm RNG they] [ they68] NUMBER algorithm can they can can implement can be implement can gleaned implement be gleaned allfrom all of of allfrom the the ofThe thesource PRNGsource of setting randomness. of randomness. forces The us to input The forego input randomness accessing randomness reservoir a reservoir press,press, 2014. 2014.press,1 1 2014.Q1 QcancanQ becan be close close besistingIII. to close to or of PSEUDORANDOM or afarIII. toThermal far or fromRoy. PSEUDORANDOM from farRoy. Stat. Reservoir the Stat. fromRoy. Soc. the Soc. bound.Stat. SeriesNUMBER thebound. Series Soc.at C (Applied bound. temperatureSeriesNUMBERC (Applied Since SinceC Statistics) (Applied Statistics) SinceCRNs CRNsT,31(2):188– Statistics)and,31(2):188– CRNs a ,31(2):188–The PRNGThe PRNG setting setting forces forces us to us forego to forego accessing accessing a a secondsseconds aseconds new a new input a input new particle, input particle,Machine particle,0 or 0 1, or enters 1, 0 enters or 1, from enters from the the from left. left. thethe left.the general generalthe general procedures proceduresGENERATION procedures given given in given in Ref. Ref. in [71 Ref. [].71]. [71]. andandL =Land 1.= 1.L In= In our 1. our analysis, In analysis, our analysis, let’s let’s simply let’ssimply simplytake take the take the outputs outputs the outputs [7] J.[7] H.J. Conway. H.[7] Conway.J. H.On Conway.On numbers numbersOn and numbers andRNGs gamesRNGs games and,volume6.IMA,,volume6.IMA,RNGs games studiedWork studied,volume6.IMA, Reservoir studied upGENERATION up to190. to up this190 The1 GENERATION this1 to190 thermal point. this point.1 point. reservoir The The details The details is that details of part of designing designing of ofsource designingsourcesource ofis of randomness.is not randomness. not ofis random randomness. notrandom random at The at all. The all.input at Rather,The input all. Rather, input randomness Rather, randomness it it is randomness is simply it simply is reservoir simply areservoir pulse a pulsereservoir a that pulse that in- that in- in- Now,Now, theNow, the particles particles the particles react react in the inreact1976. the following in following1 the following way:are way:are Turing-universal way:are Turing-universal Turing-universal [68 [68] they][16] [ they68L.] Blum, can they can M. implement Blum, can implement and implement M. Shub. all A all simple of of all the unpredictable the of the toto be be samplesto samples be samples of of any any desiredof desired any desired IID IID process. process. IID process. 1976. 1 1976. 1 … the environment[16] L. Blum,which[16] M.L. contributes Blum, andM. Blum, M. Shub. or and absorbs A M. simple Shub. heat,unpredictable A simple ex- unpredictable [8] O. Dowlen.D ETheE politicalD E potentialD CRNs of sortition: for A a study given RNGpseudo-random algorithm number generator. can beSIAM gleaned J. Comput., from dicatesdicatesdicates that that an that an output output an output should should should be be generated. generated. be generated. Thus, Thus, Thus,h =0h =0h =0 [8] O. Dowlen.[8] O.The Dowlen. politicalRNGsRNGsThe potential politicalRNGs studiedCRNs studied of potential sortition: studiedSoCRNs upforSo far,of up sortition:A far, to wea studySo to we givenupforabstained this far, abstainedthis A to wea study point. givenabstained RNGthispseudo-random point. from from point. von The RNG algorithmpseudo-random vonfrom The Neumann’s numberNeumann’s details von The algorithmdetails Neumann’s generator. number sindetails can of sin by of designing by generator.as- beSIAM sindesigning canas- of gleanedby designingJ. as- be Comput.SIAM gleaned J., fromis Comput.is not notEven, fromis randombEven notrandomb thoughEven though randomb at though a at all. PRNG a all. PRNG at Rather, a all.Rather, PRNG is supposedis Rather, supposed it is it is supposed is simply to it simply to generate is generate simply to agenerate pulse a a pulse ran- a a ran- that pulse a that ran- in- that in- in- 0+0+A A0+ofA theofA+0A the random+0, randomofAOutput, the+0 selection random selection,string of selection citizens of citizens of for citizens for public publicchanging for oce publico,vol-ce,vol- o thermodynamicce,vol-15(2):364–383,15(2):364–383,15(2):364–383, entropy 1986. 1986. and1986. changing its state … A C B C A B A A) ) ) thesumingsuming general asuming source a source procedures a of source randomness—aof randomness—a of randomness—a given fair fair in coin, coin, Ref. fair a biased coin,a [ biased71 a]. biased andandLandL== 1.L 1. In= In our 1. our In analysis, ouranalysis, analysis, let’s let’s simply let’s simply simply take take the take the outputs outputsthe outputs umeume 4. Andrews 4.ume Andrews 4. UK AndrewsCRNs UK Limited,CRNs Limited, UKCRNs for2015. Limited,the for2015.1 a general1 agiven for2015.the givenZ1 a. general givenIII. RNG The proceduresIII. RNG work PSEUDORANDOM[17]III. RNG algorithm proceduresPSEUDORANDOM[17] algorithmM. reservoirM. Mascagni, PSEUDORANDOM[17] algorithm Mascagni,givenM. is Mascagni,can S. that givenincan A. S. Cuccaro,A. be Ref. part can Cuccaro,S. be NUMBER gleaned in A. NUMBER whichgleaned[ D.Cuccaro, be71 Ref. V.D.]. NUMBER gleaned Pryor,V. contributes [ D.Pryor,from71 V.from and]. Pryor, and M. from M. L. anddicates L. dicates M.domdom L. dicates number, thatdom number, that number,an that in an output inreality output reality an in output afterreality should after should setting after setting should be settingbe the generated. the generated. seed be seed the generated. [32 seed[,3233, Thus,]it,in33 [32 Thus,]it,in, 33 Thus,h]it,in=0h =0h =0 1+1+A A1+B,AB, B, coin,coin, orn anyorcoin, any general or general any IID general IID process. process. IID process. Nevertheless, Nevertheless, Nevertheless, modern modern modern Randomness Reservoir [9]) R.[9]) Y.R. Rubinstein Y.[9]) RubinsteinR. Y. Rubinstein and and D. P.D. and Kroese. P. Kroese. D. P.Simulation Kroese.SimulationorSimulation andabsorbs and the the and energy theRobinson.GENERATIONRobinson. byGENERATION changingRobinson. AGENERATION fast, A fast, high A itshigh quality,fast, state, quality, high and quality, but and reproducible reproducible without and reproducible parallel an parallel parallelfact,fact, generatesfact,to generatesto be generates be samplesto an samples be an exactly exactlysamples an of exactly periodicof any periodic any of desired periodic any desired sequence sequence desired IID sequence IID of process. of outputs. IID process. outputs. of process. outputs. 0+0+B B0+MonteAMonteB+0+A Carlo+0+Monte CarloA, method+0+ , method…Carlothe,volume707.JohnWiley&Sons,the, method,volume707.JohnWiley&Sons, generalthe general,volume707.JohnWiley&Sons, generaldigital proceduresdigital procedures computersdigital procedures computers computers given generate given generatelagged-Fibonaccilagged-Fibonacci givenin generaterandom in randomRef.lagged-Fibonacci Ref. innumbers random [ numbers71 pseudorandom Ref. [71 pseudorandom]. numbers using]. [using71 pseudorandom purely]. number usingpurely number purely generator. number generator. generator.J. andJ. andLJ. andL== 1.L 1. In= In our 1. our In analysis, ouranalysis, analysis, let’s let’s simply let’s simply simply take take the take the outputs outputsthe outputs ) ) ) Thus, toThus, be a to good be a PRNG good PRNG algorithm algorithm that period that periodshould should 1+1+B B1+2011.A2011.B+1+A1+1+12011.A.+1+ 1. . deterministicdeterministicexchangedeterministic arithmetical ofarithmetical entropy. arithmeticalComput.Comput. methods. methods. All PhysicsComput. Physics transformations methods.,119(2):211–219 This Physics,119(2):211–219 This is pseudoran- is,119(2):211–219 Thispseudoran- is arepseudoran- performed Thus, toEven beEvenb ab good thoughEven thoughb PRNG though a PRNGa algorithm PRNG a PRNG is thatis supposed supposed is period supposed should to to generate generate to generate a ran-a ran- a ran- ) ) Exhaust line toto be be samplesto samples be samples of of any any of desired any desired desired IID IID process. IID process. process. [10][10]D.) E.D.[10] Knuth. E. Knuth.D. E.The Knuth.The Art Art ofThe Computer of ComputerArt of Computer Programming:dom Programming:domSo numberisothermally Programming:SoIII. far, numberdomIII. Semi-far,So we generationSemi- numberIII. PSEUDORANDOMwe generationfar, abstained PSEUDORANDOMSemi-[18] abstained at we generation[18]J. temperature(PRNG). PSEUDORANDOM Kelsey,J. abstained(PRNG).[18] Kelsey, fromJ. B.(PRNG). from CanKelsey, Schneier,B. Canvon Schneier,T thesefrom von. B. these AsNeumann’sCan andSchneier, methods Neumann’svon in and thesemethods N. NUMBER Fig. Ferguson.N. Neumann’s NUMBER and methods be Ferguson.1, beN. we sinNUMBER Ferguson. sin Yarrow-160:denote bybe Yarrow-160: by sinas- as-Yarrow-160: bybe as-be relatively relativelybe relatively long long compared longcompared compared to tothe the sample to sample the sample size size of ofinterest. size interest. of interest. NumericalNumericalNumerical Algorithms Algorithms Algorithms,volume2.Addison-Wesley,Read-,volume2.Addison-Wesley,Read-,volume2.Addison-Wesley,Read-NotesNotes onNotes onthe the design on design the and design and analysis analysis and of analysis the of the yarrow of yarrow the cryp- yarrow cryp- cryp-b domdomb number,dom number, number, in in reality reality in reality after after setting after setting setting the the seed seedthe [32 seed [32, 33, [33]it,in32]it,in, 33]it,in TheThe time timeThe period period time of period each of each chemical of chemical each chemical reaction reaction isreaction also is also⌧.With is⌧.With also ⌧.Withimplementedsumingimplementedsumingheatimplementedsuming a that sourcea by source flowsby finite-state a finite-state source byof to of thefinite-state randomness—a machines? randomness—a thermal of machines? randomness—a machines? reservoir Most Most fair certainly. fairMost certainly. by coin,Q coin,fair certainly.. To a coin, empha- biaseda biased a biasedEvenAlso,EvenAlso,b the thoughEvenAlso, the though sample sample the though a sample statistics PRNGa statistics PRNG a statistics PRNG should is should is supposed supposed shouldbe is be close supposed close be to to close to thoseto generate those generate to to of those generateof the the a of ran-a the ran- a ran- ing,ing, Massachusetts, Massachusetts,ing, Massachusetts, second second edition, second edition, 1981. edition, 1981.1, 71 1981., 7 1, 7 tographicGENERATIONtographicGENERATIONtographic pseudorandomGENERATION pseudorandom pseudorandom number number generator. number generator. generator. In Inter- In Inter- In Inter- fact, generatesfact, generates an exactly an exactly periodic periodic sequence sequence of outputs. of outputs. thisthis assumption assumptionthis assumption it is it not is not hardit is hard not to show to hard show that to thatshow if the if that the distri- if distri- theIII.III. distri-Thecoin,TheIII. PSEUDORANDOM esize,↵ PSEUDORANDOM orective eThecoin,↵ective anyQ PSEUDORANDOM eis memory↵ orective positivegeneral memory any memory in general if theseinIID heat theseprocess. in machines flows theseIID machines NUMBER NUMBERinto process. machines is Nevertheless, verythe isNUMBER very thermal large, is Nevertheless, large, very reservoir.large, modern moderndomdesired number,domdesired distribution.fact, number, in distribution. generates reality in This reality after simply an This setting after exactly simply means setting the periodic meansif seed we the estimate if [32 seed we sequence, 33 estimate []it,in32, 33 of]it,in outputs. 1 1 1 1 coin, or any general IID process. Nevertheless, modern domdesired number, distribution. in reality This after simply setting means the if seed we estimate [32, 33]it,in butionbution ofbution input of input particles of inputparticles particles0 and 0 and 1 is 0 1 and is/2,1 1//22 is, 1/then2 /2then, the/2 thethen dis- dis- thewith dis-with theSimilarly, thewith algorithms algorithms theW algorithms typicallydenotes typically typically allowingthe allowing work allowingthe done the user user on theto thespec- to user spec- machine to spec- and Thus,Thus, to be ato good be a PRNG good PRNG algorithm algorithm that period that period should should FIG. 1. Thermodynamically embedded{ { finite-state} { } 3 }1 machine3 1 GENERATIONGENERATIONGENERATION Thus, to be a good PRNG algorithm that period should tributiontributiontribution of output of output of particles output particles particles 0 and 0 and 1 would 0 1 and would be 1 would be/4,3// be44, 1,/4 /,4,ify/4ify the, the amountify amount the of amount state of state informationof stateinformation information used used [743]. [ used74 Indeed,]. [ Indeed,74]. Indeed, fact,fact, generatesfact, generates generates an an exactly exactly an exactly periodic periodic periodic sequence sequence sequence of of outputs. outputs. of outputs. { { } { } } not the work done by the machine. implementingrespectively.respectively.respectively. an Thus, algorithm Thus, this Thus, this CRN that, CRN this gives from givesCRN a physical the a gives physical source a implemen-physical implemen- of random- implemen-SotheySo far,they encourage far,So encouragethey we far, we encourage abstained the abstained we the use use abstained of the large of use largefrom amounts offrom amountslarge von from amountsvon of state ofNeumann’s von state Neumann’s infor- of infor-state Neumann’s infor- sin sin by bysin as- as- by as-bebe relatively relativelybe relatively long long compared long compared compared to to the the to sample samplethe sample size size of size of interest. interest. of interest. ness available on the input string, generates random numbers After n steps the machine has read n input symbols Thus,Thus,Thus, to to be be ato gooda be good a PRNG good PRNG PRNG algorithm algorithm algorithm that that period that period period should should should tationtation oftation our of our original of original our RNG. original RNG. RNG. sumingsumingmation,mation,suming a promisingmation, a source promising source a promising better source of better of qualityrandomness—a better qualityrandomness—a of random qualityrandomness—a random numbers random numbers fairnumbers in fairthe in the coin, fair in coin, the coin,a a biased biased a biasedAlso,Also, theAlso, the sample samplethe sample statistics statistics statistics should should should be be close close be to close to those those to ofthose of the the of the on theUsing output Eq.Using string (2) we Eq. obeying can (2) put we a a candesired lower put bound target a lower on distribution boundSo theSo aver-far, on far,So the and we aver-far,sense we abstainedand thatabstained wesense thegenerated abstained recurrence that from the fromm recurrence timeoutput von from von(generator Neumann’ssymbols time von Neumann’s (generator period) Neumann’s and n is period) astro- sinm sinexhaust by is astro- bysin as- sym- as- by as-bebe relatively relativelybe relatively long long compared long compared compared to to the the to sample samplethe sample size size of size of interest. interest. of interest. an exhaustUsing with Eq. zero (2)entropy. we can put Input a lower string bound and onoutput the aver- string sense that the recurrence time (generator period)− is astro- age heatage dissipation heat dissipation per output: perQ output:LB 0.Q478LBkBT0.Since.478kcoin,BTcoin,.Sincenomically orcoin,bols. or anynomically anylarge. or The general any general Our thermodynamic large. concern, general IID Our IID concern, though, process. IID process. entropy isthough, process. not Nevertheless, analyzingchange is Nevertheless, not Nevertheless, analyzingof the entire modern modern moderndesireddesireddesired distribution. distribution. distribution. This This simply This simply simply means means means if if we we estimateif estimate we estimate symbolsage can heat come dissipation from different per output: alphabetQLB⇡ sets.0suming.478⇡ ForkBT example,suming.Since a sourcenomically a source of large.randomness—a of Our randomness—a concern, though, fair is coin, not fair analyzing acoin, biased a biasedAlso, theAlso, sample the sample statistics statistics should should be close be to close those to ofthose the of the derivingderiving the bound the does bound not does invoke not any invoke constraints⇡ anysuming constraints over atheir over source implementations.systemtheir of isimplementations. randomness—a[57, App. See Ref. A]: See [10] Ref. for fair a [10 discussion] for coin, a discussion of a biased of Also, the sample statistics should be close to those of the here thederiving input the symbols bound come does not from invoke the set any{A, constraints B, C} and over the their implementations. See Ref. [10] for a discussion of input orinput output or particles, output particles, the bound the is bound a universalcoin, is a universal lower orcoin, any lower ordesign general any methods.design general IID methods. We process. can IID simply We process. can assume simplyNevertheless, they assume Nevertheless, can they be im- can modern be im- moderndesireddesired distribution. distribution. This simply This simply means means if we estimateif we estimate input or{ output} particles, the bound is a universalcoin, lower or anydesign general methods. IID We process. can simply assume Nevertheless, they can be im- modern desired distribution. This simply means if we estimate outputs from D,E . Exhaust line symbols all are the same 00 0 ∆S k ln 2 H[X ,X ,X ∞,Y ,Z ] symbols γ. ≡ B 0:n−m 0:m n: n n H[X ∞,Y ,Z ] , − 0: 0 0 where H[ ] is the Shannon entropy [58]. Recalling the string and an exhaust string, using a finite-state machine · definition of mutual information I[ : ][58], we rewrite that interacts with heat and work reservoirs; see Fig.1. · · The input Randomness Reservoir is the given, specified the change in Shannon entropy on the righthand side as: source of randomness available to the RNG. The states 00 0 ∆ H = (H[X ,X ,X ∞,Y ] H[X ∞,Y ]) and transition structure of the finite-state machine imple- 0:n−m 0:m n: n − 0: 0 ment the RNG algorithm. The output string is then the + (H[Zn] H[Z0]) 00 − 0 samples of distribution of interest. The exhaust string is (I[X ,X ,X ∞,Y : Z ] I[X ∞,Y : Z ]) . − 0:n−m 0:m n: n n − 0: 0 0 included to preserve state space. Here, we assume inputs Xn are independent, iden- By definition, a heat bath is not correlated with other tically distributed (IID) samples from the randomness subsystems, in particular, with portions of the environ- reservoir with discrete alphabet . The output includes ment. As a result, both mutual informations vanish. The A two strings, one with samples from the target distribution term H[Zn] H[Z0] is the heat bath’s entropy change, − X0 over alphabet and another, the exhaust string. At which can be written in terms of the dissipated heat Q: m B each step one symbol, associated with variable X , enters n Q the machine. After analyzing that symbol and, depend- H[Z ] H[Z ] = . n − 0 k T ln 2 ing on its value and that of previous input symbols, the B machine either writes a symbol to the output string or Since by assumption the entire system is closed, the Sec- to the exhaust string. Yn denotes the machine’s state at ond Law of Thermodynamics says that ∆S 0. Using ≥ step n after reading input symbol Xn. The last symbol these relations gives: in the output string after the input Xn is read is denoted X0 , where m n is not necessarily equal to n. The last Q k T ln 2 H[X00 ,X0 ,X ,Y ] H[X ,Y ] . m ≤ B 0:n−m 0:m n:∞ n 0:∞ 0 symbol in the exhaust string is X00 . As a result, the ≥ − − n−m  number of input symbols read by the machine equals the To use rates we divide both sides by n and decompose number of symbols written to either the output string or the exhaust string. To guarantee that the exhaust makes no thermodynamic contribution, all symbols written to 00 Xi s are the same—denoted γ. Without loss of general- 3 Several recent works [55–57] use the same convention for Q, but ity we assume both the input and output sample space W is defined as the work done by the machine. This makes sense is γ . In the following we refer to the random- in those settings, since the machine is intended to do work. A ∪ B ∪ { } 4 the first joint entropy: and

Q k T ln 2 0 0 0 B 00 0 H[X ,Xn:∞] = H[X ] + H[Xn:∞] I[X : Xn:∞] . H[X ,X ,X ∞] H[X ∞] 0:m 0:m 0:m n ≥ − n 0:n−m 0:m n: − 0: − 00 0 + H[Y ] H[Y ] I[X ,X ,X ∞ : Y ] These lead to: n − 0 − 0:n−m 0:m n: n + I[X0:∞ : Y0] . Q kBT ln 2 0 H[X0:m] H[X0:n]  n ≥ − n − Appealing to basic information identities, a number 0 + I[X : X ∞] I[X : X ∞] . of the righthand terms vanish, simplifying the overall 0:n n: − 0:m n: bound. First, since the Shannon entropy of a random  Since the inputs are IID, I[X0:n : Xn:∞] vanishes. Fi- variable Y is bounded by logarithm of the size Y of 0 |A | nally, I[X0:m : Xn:∞] is bounded above by I[X0:n : Xn:∞], its state space, we have for the ratchet’s states: 0 meaning that I[X0:m : Xn:∞] = 0. Using these we have: 1 1 lim H[Yn] = lim H[Y0] Q kBT ln 2 0 n→∞ n n→∞ n H[X0:n] H[X ] . n n 0:m 1 ≥ − lim log2 Y  ≤ n→∞ n |A | This can be written as:

= 0 , 0 Q H[X0:n] H[X0:m] m kBT ln 2 . Second, recalling that the two-variable mutual informa- n ≥ n − m n    tion is nonnegative and bounded above by the Shannon As n , H[X ]/n converges to the randomness entropy of the individual random variables, in the limit → ∞ 0:n reservoir’s Shannon entropy rate h and H[X0 ]/m con- n we can write: 0:m → ∞ verges to the output’s entropy rate h0. The tapes’ relative 1 00 0 1 velocity term m/n also converges and we denote the limit lim I[X0:n−m,X0:m,Xn:∞ : Yn] lim H[Y0] n→∞ n ≤ n→∞ n as 1/L. As a result, we have the rate Q˙ of heat flow from = 0 . the RNG machine to the heat bath:

b 0 1 h Similarly, limn→∞ n I[X0:∞ : Y0] = 0. As a result, we Q˙ k T ln 2 h . (1) ≥ B − have:  L 

Q kBT ln 2 00 0 Since the machine is finite state, its energy is bounded. H[X ,X ,X ∞] H[X ∞] . b n ≥ − n 0:n−m 0:m n: − 0: In turn, this means the average energy entering the ma-  chine, above and beyond the constant amount that can We can also rewrite the joint entropy as: be stored, is dissipated as heat. In other words, the av- erage work rate W˙ and average heat dissipation rate Q˙ H[X00 ,X0 ,X ] = H[X0 ,X ] + H[X00 ] 0:n−m 0:m n:∞ 0:m n:∞ 0:n−m per input are equal: W˙ = Q˙ . 0 00 I[X ,X ∞ : X ] . − 0:m n: 0:n−m This already says something interesting. To generate 00 one random number the average change ∆W in work Since the entropy of exhaust vanishes, H[X0:n−m] = 0. 0 00 done on the machine and the average change ∆Q in Also, I[X0:m,Xn:∞ : X0:n−m] is bounded above by it, 0 00 heat dissipation by the machine are directly related: I[X0:m,Xn:∞ : X0:n−m] also vanishes. This leads to: ∆W = ∆Q = LQ˙ . More to the point, denoting the 0 H[X00 ,X0 ,X ] = H[X0 ,X ] . lower bound by QLB kBT ln 2 Lh h immediately 0:n−m 0:m n:∞ 0:m n:∞ b ≡ − leads to a Second Law adapted to RNG thermodynamics: This simplifies the lower bound on the heat to: b ∆Q Q . (2) ≥ LB Q kBT ln 2 0 H[X ,Xn:∞] H[X0:∞] . n ≥ − n 0:m − It can be shown that L is always larger or equal to h0/h  [58] and so Q 0.4 This tells us that RNG algorithms Rewriting the righthand terms, we have: LB ≥ are always heat dissipativeb or, in other words, work con-

H[X ∞] = H[X ] + H[X ∞] I[X : X ∞] 0: 0:n n: − 0:n n:

4 This is not generally true for the setup shown in Fig.1 inter- preted most broadly. For computational tasks more general than RNG, QLB need not be positive. 5

10 1 output is L = p(1−p) . Using Eq. (2) this costs:

8 b H(p) QLB = kBT ln 2 1 . (3)

] p(1 p) − )

2  −  ( 6 n l Figure2 shows QLB versus source bias p. It is always T

B 1

k positive with a minimum 3kBT ln 2 at p = /2. [ 4 B

L This minimum means that generating a fair coin from Q a fair coin has a heat cost of 3kBT ln 2. At first glance, 2 this seems wrong. Simply pass the fair coin through. The reason it is correct is that the von Neumann RNG does 0 0.0 0.2 0.4 0.6 0.8 1.0 not know the input bias and, in particular, that it is fair. p In turn, this means we may flip the coin many times, depending on the result of the flips, costing energy. Notably, the bound diverges as p 0 and as p 1, FIG. 2. Lower bound on heat dissipation during the process → → of single fair sample generation by von Neumann algorithm since the RNG must flip an increasingly large number versus the input bias p. of times. As with all RNG methods, the positive lower bound implies that generating an unbiased sample via the von Neumann method is a heat dissipative process. We must put energy in to get randomness out. suming processes. Random numbers generated by RNGs Consider the randomness extractor [60], a variation on cost energy. This new RNG Second Law allows the ma- von Neumann RNG at extreme p, that uses a weakly ran- chine to take whatever time it needs to respond to and dom physical source but still generates a highly random process an input. The generalization moves the informa- output. (Examples of weakly random sources include ra- tion ratchet architecture [57] one step closer to that of dioactive decay, thermal noise, shot noise, radio noise, general Turing machines [59], which also take arbitrary avalanche noise in Zener diodes, and the like. We return time to produce an output. We now apply this gener- to physical randomness sources shortly.) For a weakly alized Second Law to various physically embedded RNG random source p 1, the bound in Eq. (3) simplifies algorithms.  to kBT ln p, which means heat dissipation diverges at von Neumann RNG: Consider the case where the − least as fast as ln p in the limit p 0. randomness resource is a biased coin with unknown prob- − → Knuth and Yao RNG: Consider a scenario opposite ability p = 1/2 for Heads. How can we use this imperfect 6 von Neumann’s where we have a fair coin and can flip source to generate fair (unbiased p = 1/2) coin tosses it an unlimited number of times. How can we use it using the minimum number of samples from the input? to generate samples from any desired distribution over This problem was first posed by von Neumann [37]. The a finite alphabet using the minimum number of samples answer is simple but clever. What we need is a symmetry from the input? Knuth and Yao were among the first to undo the source’s bias asymmetry. The strategy is to to attempt an answer [61]. They proposed the discrete flip the biased coin twice. If the result is Heads-Tails we distribution generation tree (DDD-tree) algorithm. report a Head; if it is Tails-Heads we report Tails. If it is The algorithm operates as follows. Say the target dis- one of the two other cases, we neglect the flips and sim- tribution is pj with probabilities pj ordered from large ply repeat from the beginning. A moment’s reflection { } to small. Define the partial sum β = k p , with reveals that using any source of randomness that gen- k j=1 j β = 0. This partitions the unit interval (0, 1) into the erates independent, identically distributed (IID) samples 0 P subintervals (β , β ) with lengths p . Now, start flip- can be used in this way to produce a statistically uniform k−1 k k ping the coin, denoting the outcomes X ,X ,.... Let sample, even if we do not know the source’s bias. 1 2 S = l X 2−m. It can be easily shown that S has Note that we must flip the biased coin more than twice, l m=1 m ∞ the uniform distribution over the unit interval. At any perhaps many more, to generate an output. More trou- P step l, when we flip the coin, we examine S . If there blesome, there is no bound on how many times we must l exists a k such that: flip to get a useful output. So, what are the thermodynamic costs of this RNG −l βk−1 Sl < Sl + 2 βk , (4) scheme? With probability 2p(1 p) the first two flips lead ≤ ≤ − to an output; with probability (1 2p(1 p))(2p(1 p)) − − − the output generated is symbol k. If not, we flip the coin the two flips do not, but the next two flips will; and so again for l+1 or more times until we find a k that satisfies on. The expected number of flips to generate a fair coin 6

Input Output and half probability over symbols 0 and 1 and we want to generate the target distribution 11/32, 25/64, 17/64 over 00 A { } 01 B symbols A, B, and C. The target distribution has Shan- 10 C non entropy H[ pi ] 1.567 bits. Equation (6) tells us { } ≈ 110 B that L should be larger than this. The DDG-tree method 1110 A leads to the most efficient RNG. TableI gives the map- 11110 A pingb from binary inputs to three-symbol outputs. L can 111110 B be calculated using the table: L 2.469. This is ap- ≈ 111111 C proximately 1 bit larger than the entropy consistentb with Eq. (6). Now, using Eq. (5), we canb bound the dissipated TABLE I. Most efficient map from inputs to outputs when heat: Q 0.625k T . using the DDG-tree RNG method. LB ≈ B Roche and Hoshi RNG: A more sophisticated and more general RNG problem was posed by Roche in 1991 the relation in Eq. (4) and report that k as the output. [62]: What if we have a so-called M-coin that generates This turns on realizing that if the condition is satis- the distribution p : i = 1,...,M and we want to use it { i } fied, then the value of future flips does not matter since, to generate a different target distribution q ? Roche’s { j} for r > l, Sr always falls in the subinterval (βk−1, βk). algorithm was probabilistic. And so, since we assume the Recalling that S∞ is uniformly distributed over (0, 1) es- only source of randomness to which we have access is the tablishes that the algorithm generates the desired distri- input samples themselves, Roche’s approach will not be bution pj . The algorithm can be also interpreted as discussed here. { } 5 walking a binary tree, a view related to arithmetic cod- However, in 1995 Hoshi introduced a deterministic al- ing [58]. Noting that the input has entropy rate h = 1 gorithm [63] from which we can determine the thermo- and using Eq. (1) the heat dissipation is bounded by: dynamic cost of this general RNG problem. Assume the pis and qjs are ordered from large to small. Define αt = t k QLB = kBT ln 2 L H[ pi ] . (5) − { } i=1 pi and βk = j=1 qj, with α0 = β0 = 0. These   quantities partition (0, 1) into subintervals [αt−1, αt) and b P P Now, let’s determine L for the Knuth-Yao RNG. Ref. B [β − , β ) with lengths p and q , respectively. k ≡ k 1 k t k [61] showed that: Consider now the operator that takes two arguments— D b an interval and an integer—and outputs another interval: H[ p ] L H[ p ] + 2 . (6) { i} ≤ ≤ { i} ([a, b), t) = [a + (b a)α − , a + (b a)α ) . D − t 1 − t More modern proofs areb found in Refs. [54] and [58]. Generally, given a general target distribution the Hoshi’s algorithm works as follows. Set n = 0 and Knuth-Yao RNG’s L can be estimated more accurately. R0 = [0, 1). Flip the M-coin, call the result xn. Increase However, it cannot be calculated in closed form, only n by one and set R = (R − , x ). If there is a k such n D n 1 n bounded. Notably, thereb are distributions p for which that Rn Bk, then report k, else flip the M-coin again. { j} ⊆ L can be calculated exactly. These include the dyadic Han and Hoshi showed that [63]: distributions whose probabilities can be written as 2−n H[ q ] H[ q ] + f( p ) withb n an integer. For these target distributions, the { j} L { j} { i} , DDG-tree RNG has L = H[ p ]. H[ pi ] ≤ ≤ H[ pi ] { i} { } { } Equations (2) and (6) lead one to conclude that the where: b heat dissipation for generatingb one random sample is al- ways a strictly positive quantity, except for the dyadic H[ pmax, 1 pmax ] f( pi ) = ln(2(M 1)) + { − } , distributions which lead to vanishing or positive dissi- { } − 1 pmax pation. Embedding the DDG-tree RNG into a physical − with pmax = max pi. Using this and Eq. (2) we see machine, this means one must inject work to generate a i=1,··· ,M random sample. The actual amount of work depends on that the heat dissipation per sample is always positive the target distribution given. except for measure-zero cases for which the dissipation Let us look at a particular example. Consider the case may be zero or not. This means one must do work on that our source of randomness is a fair coin with half the system independent of input and output distributions to generate the target sample. Again, using this result and Eq. (2) there exist input and output distributions 5 For details see Ref. [58]. with heat dissipation at least as large as k T ln 2f( p ). B { i} 7 7

processing [65] and as a programming model for engi- processing [65] and as a programming model for engi- neering artificial systems [66, 67]. Moreover, CRN chem- neering artificial systems [66, 67]. Moreover, CRN chem- ical implementations have been studied in detail [69, 70]. ical implementations have been studied in detail [69, 70]. CRNs are also eciently Turing-universal [68], which CRNs are also eciently Turing-universal [68], which 7 power makes them appealing. One of their main appli- power makes them appealing. One of their main appli- cationsprocessing is deterministic [65] and as a function programming computation model for engi- [71, 72], cations is deterministic function computation [71, 72], whichneering is what artificial our systems RNGs [ need.66, 67]. Moreover, CRN chem- which is what our RNGs need. Considerical implementations four types have of been particles studied 0, in 1, detailA, [69 and, 70B]. and ConsiderCRNs four types are also of e particlesciently Turing-universal 0, 1, A, and B [68],and which a machine consisting of a box that can contain them. a machine consistingpower makes of them a box appealing. that can One contain of their main them. appli- Particlescations 0 is and deterministic 1 can be function inputs computation or outputs [71 to, 72 the], ma- Particles 0 and 1 can be inputs or outputs to the ma- chine.which “Machine” is what our particles RNGs need.A and B always stay in the chine. “Machine”machine’sConsider boxparticles four and types areA and of in particles contactB always 0, with 1, stayA, a and inthermal theB and reser- FIG. 3. Chemical Reaction Network (CRN) implementation machine’s boxa machine and are consisting in contact of a with box that a thermal can contain reser- them. FIG. 3. Chemicalof an RNG Reaction machine Network consisting (CRN) of a implementation box and a particle in it. voir. Figure 3 shows that the left wall is designed so thatof an RNG machine consisting of a box and a particle in it. voir. FigureParticles3 shows 0 that and the1 can left be wall inputs is ordesigned outputs so to that the ma- The left wall acts as a membrane filter such that only input onlychine. input “Machine” particles particles (0 andA 1)and canB enter,always but stay no in particles the The left wallparticles, acts as 0 and a membrane 1, can enter, filter but such no particles that only can input exit through only input particles (0 and 1) can enter, but no particles 10 canmachine’s exit. The box right and wall are in is contact designed with so a thermal that only reser- outputparticles,FIG. 3.the 0Chemical and wall. 1, can ReactionThe enter, right Network wall but no is (CRN) also particles a implementation membrane can exit designed through such that can exit. The right wall is designed so that only output theof wall. an RNG The machine right wall consisting is also of a amembrane box and a particledesigned in such it. that K Tpln(p) 0 heat from heat reservoirparticlesvoir. and turn Figure (0 it and to 3 shows 1)1/f cannoise that [exit.88]? the Nominally, To left get wall these started, is distributions designed assume so are that associ- there only output particles, 0 and 1, can exit. At the beginning the | B | ⇡ The left wall acts as a membrane filter such that only input work for us. Because the work is really small, this process ated with infinite memory processes [89]. What are the only outputonly particles, particle 0 in and the 1, box can exit. is “machine At the beginning particle” theA,whichis particlesis (0 only andonly a input1) single can particles exit. machine To (0 and get particle 1) started, can enter,A assumein but the no box. there particles Every particles,⌧ 0 and 1, can enter, but no particles can exit through is neutral process meaning there is no energy transfer associated thermodynamic cost bounds? Suggestively, is only a singlecan exit. machine The right particle wall isA designedin the sobox. that Every only output⌧ onlythe particle wall.confined Thein right the to wall box stay is also is in a “machine membrane the box. designed particle” Every such⌧ Aseconds that,whichis a new input between reservoir and machine. Nowseconds consider a the new case inputit was particle, recently shown 0 that or 1,infinite-memory enters7 from devices the can confined left. particle to stay inenters the thebox. box Every from⌧ theseconds left and, a new depending input on the when we use RNG methodseconds (von Neumanna newparticles input algorithm). (0 particle, andachieve 1) can 0thermodynamic exit. or 1, To enters get bounds started, from [90]. Third,assume the left. the there random only output particles, 0 and 1, can exit. At the beginning the Now, the particles react in the following way: only particle in the box is “machine particle” A,whichis InputIn this Output case to run a machine and generateis only one symbol, a singlenumber machine generation particle strategiesA in considered the box. here Every are not⌧ se- particle entersreaction the between box from the the input left particle and, depending and the machine on the particle, on average work reservoirNow, needs the to work particles on the machine. reactcure. in the However, following cryptographically way: secure random number reactionconfined between to stay in the the input box. particle Every ⌧ andseconds the a machine new input particle, 0 0 seconds a new input particle, 0 or 1, enters from the left. an output particle may or may not be generated that exists This thermodynamic cost can be infinitely large depends generators have been developed [91]. What are the ad- anparticle outputthrough enters particle the the box may right from or may wall.the left notAdd and, be depending a generated temperature on that the exists reservoir: a on10 how small 0 the p is. This example simplyNow, tell the us how particlesditional0+ react thermodynamicA in theA following+0 costs of, adding way: on security to reaction between the input particle and the machine particle, 11 1 0+A A +0), through therectangle right wall. underneathAdd a temperature the box, with reservoir: “temperature a T” much di↵erent the random number generation methods RNGs?1+ Finally,A thereB, is a substantial quantum advan- an output particle may or may not be generated that exists could be and each of them can be useful depends on the tage) when compressing classical random processes [76]. rectanglelabel. underneathPut the the label box,0 on with the “temperature particle entering T” on the TABLE II. Immediate random number generation: The most 1+A 0+B,A ) A +0, through the right wall. Add a temperature reservoir: a efficient map from inputsresources. to output to transform fair coin What0+ are theB thermodynamic) A +0 consequences, of using such label.rectanglePutleft. underneath the Put label the0 the labelon box, theA withon particle “temperaturethe particle entering T” in on the the box. Finally, let’s1 close with a list of very brief questions quantum)1+ representationsA B, for RNGs? inputs to biased coin outputs with bias /4. ) label. Put the label 0 on the particle entering on the that allude to several future directions in the thermo-0+B TemperatureA +0)T , left. Put the label A on the particle in the box. )1+0+B AA+0+1, . left. Put the label A on the particle in the box. dynamics of random number generation. Given that )) random number generation is such a critical and vital1+B 1+AB+1.A +1. ) ACKNOWLEDGMENTS task in modern computing, followingThe up time on this period could of each chemical) reaction is also ⌧.With digital computers generate random numbers using purely be quite important. First, is Szilard’s EngineFIG. 3. a TRNG? Chemical Reaction Network (CRN) implementation digital computers generate random numbers using purely What are the thermodynamicThe time costs?this period OneofThe assumption recent an RNG time of analysis each machine period chemical it consisting ofWe is each thank not of chemical Areaction a hard box Aghamohammadi, and to reaction a particle showis also in is M that⌧ it. also Anvari,.With if⌧.With theA Boyd, distri-digital computersdeterministic generate arithmetical random numbers methods. using This purely is pseudoran- appears to have provided the answersThethis [87] left and assumption wall antici- acts asM a it membraneKhorrami, is not hardfilter John such Mahoney, to thatshow only and1 that Pinput1 Riechers if the for distri- helpful deterministicdeterministic arithmetical arithmetical methods. methods. This This is pseudoran- is pseudoran- this assumptionbutionparticles, of it input0 andis not 1, can particlesdiscussions. hard enter, but to no 0JPC show particles and thanks 1 that can is the exit Santa if through/2 the, Fe/2 Institutedistri-then for the its dis- dom number generation (PRNG). Can these methods be pate TRNG’s win-win property. Second,bution the sources of input of particles 0 and 1 is 1/{2, 1/2 then} the dis- dom number generation (PRNG). Can these methods be RNG Physical Implementations:randomness and targetRecall distributions the first wethe considered wall. The were right wallhospitality is also a membraneduring1 visits2 designed1 as2 an External such that Faculty member.3 1dom numberimplemented generation by(PRNG). finite-state Can machines? these methods Most be certainly. bution oftribution input particles of output 0 and particles 1 is / 0, and/{ 1then would} the be3 dis-1 /4, /4implemented, by finite-state machines? Most certainly. RNG we described. Therather input limited distribution when compared is a fair to coin the veryonlytribution wide output range particles, of of outputThis 0, 1 material andparticlesγ, can is{ basedexit. 0 and At upon the 1} would beginningwork supported be /4 by,,{/ or4 , in } the only particle in the box is “machine particle” A, which3 {1 } implemented by finite-state machines? Most certainly. and the output targetstochastic distribution processes is a biased usedtribution coin in contemporary withrespectively. of output science. Forparticles Thus,part by,this 0 the and CRNJohn 1 Templeton would gives Foundation a be physical/4 and, / U.4 implemen- S., Army The e↵Theective e memory↵ective in memory these machines in these is very machines large, is very large, isrespectively. confined to stay Thus, in the box. this Every CRNτ givesseconds a a physical new input implemen- bias 1/4. TableII summarizes the optimal algorithm. Research Laboratory and the U. S. Army{ Research} Oce example, what aboutrespectively. the thermodynamicstationparticletation Thus, of of generating enters ourof this our the original original boxCRN from givesRNG. RNG.the left aand, physical depending on implemen- the Thewith e↵ theectivewith algorithms the memory algorithms typically in these allowing typically machines the allowinguser is to veryspec- the large, user to spec- Generally, optimal algorithms require the input length under contracts W911NF-13-1-0390 and W911NF-13-1- reactionEnergy between conservation the input particle means and the that machine at particle, least one of the ify the amount of state information used [? ]. Indeed, to differ from the output length—larger thantation or equal, of ourEnergyan original output particleconservation RNG. may0340. or may not means be generated that that at exists least one ofwith the theify algorithms the amount typically of state allowing information the user used to spec-[? ]. Indeed, they encourage the use of large amounts of state infor- respectively. Energyfour conservationthroughfour reactions reactions the right must wall. must means be be heat that dissipative dissipative at least or one energy or energy of absorb-the absorb-ify the amountthey encourage of state theinformation use of large used amounts [? ]. Indeed, of state infor- mation. They promise better quality random numbers in This is the main challenge to designing physical im- ing. Say why. As a result, this RNG algorithm imple- they encouragemation. the They use promise of large better amounts quality of state random infor- numbers in plementations. Note that for some inputs, afterfour they reactions areing.mentationSay must why. be requires heatAs dissipativea energy. result, Using this or Eq. energyRNG (3) algorithm we absorb- can put aimple-the sense that the recurrence time (generator period) is read, the machine should[1] waitW. G. for Cochran. additionalSamplinging. inputsSay untiltechniquesmentation why..JohnWiley&lowerAs bound requires a result, on[11] theC. energy. averageD.this Motchenbacher RNG heat Using algorithm and dissipation F. Eq. C. Fitchen. (3) per imple-Low-Noise we output: can Elec- putmation.astronomically a the They sense promise large. that Our better the concern, recurrence quality though, random istime not analyz- (generator numbers in period) is it receives the correct inputSons, and 2007. then1 transfers it deter- tronic Design. John Wiley & Sons, New York, 1973. 1 [2] B. Jerry. Discrete-eventmentation system simulationlower requiresQLB. bound Pearson0. energy.Ed-478 onkB the[12]T .B. ThisUsing average Mende, is L. a Eq. C.universal heat Noll, (3) dissipation and we lower S. can Sisodiya. bound put per SGI a over classic output:theing sense theirastronomically that implementations. the recurrence large. See Ref. Ourtime [10 concern, (generator] for a discussion though, period) is not is analyz- ministically to the output. For example, in our problem ⇡ if input 0 is read, the outputucation would India, be 1984. 0. However,1 if all possible reactionlavarand kinetics™.USPatent#5,732,138,1996. and over what 0, 1, 1A, and B of designing methods. their implementations. We can simply assume See they Ref. can [10 be] for a discussion [3] D. R. Stinson. Cryptography:lower bound theoryQLB andin on thepractice0 followingthe.478.CRC averagekB way:T[13].M. This heat G. Kendall is dissipation a and universal B. B. Smith. per Randomness lower output: andbound random astronomically over large. Our concern, though, is not analyz- 1 is read, the machine shouldpress, wait 2005. for1 the next input are.⇡ Meaning what?sampling Their numbers. materialJ. Roy. Stat. or Soc. molecular,101(1):147–166, implemented or, at least, there exist ones that have been, and then generate an output. How to implementQ these0all.478 possiblek T . Thisreaction is a kinetics universal and lower over bound what 0, over 1, A, andingB theirof implementations. design methods. See We Ref. can [10 simply] for a assume discussion they can be [4] R. G. Sargent. VerificationLB and validationmakeB of simulation up? Depending0 + A1938. A1 + on 0 , the latter, the CRN-RNG’s such as the Unix C-library random() function just cited. delays? Let’s explore a chemicalmodels. implementation In Proceedings of the of⇡ 37th theare. conferenceMeaning on Winter what?[14] D.⇒ H. Lehmer. Their Mathematical material methods in or large-scale molecular com- implemented or, at least, there exist ones that have been, all possible reactionQ can be kinetics close1 + A to and orB, far over from what the bound.0, 1, A, Since and CRNsB of designThe PRNG methods. setting We forces can us simply to forego assume accessing they a can be algorithm. simulation,pages130–1431 puting⇒ units. In Proc. 2nd Symp. on Large-Scale Digital such as the Unix C-library random() function just cited. Chemical reaction networks[5] J. Stoer(CRNs) and R. [64 Bulirsch., 65are.] haveIntroductionMeaning beenmake toare numerical up?Turing-universal what? anal-Depending0 Their + BCalculating [68A material+] on they0 Machinery + γ the , can, latter, pages orimplement 141–146.molecular the Harvard all CRN-RNG’s of Univer- the implementedsource of randomness. or, at least, The there input exist randomness ones that reservoir have been, ysis, volume 12 Springer Science & Business Media, 2013. sity⇒ Press, Cambridge, MA, 1951. 1 widely considered as substrates for physical informationQRNGscan be studied close1 +up toB to or thisA far+ point. 1 +fromγ . The the details bound. of designing Since CRNsis not randomThe at PRNG all. Rather, setting it is simply forces a pulse us to that forego in- accessing a 1 make up? Depending[15] onB.⇒ the A. Wichmann latter, and theI. D. Hill. CRN-RNG’s Algorithm AS 183: An ef- such as the Unix C-library random() function just cited. processing [66] and as[6] a programmingE. Alpaydin. Introduction model for toengi- machineCRNs learning.MIT for a givenficient RNG and algorithm portable pseudo-random can be number gleaned generator. fromJ. dicatessource that an output of randomness. should be generated. The input Thus, randomnessh =0 reservoir neering artificial systems [67, 68]. Moreover, CRNQ can chem-are beThe closeTuring-universal time to period or of far each from chemical [68 the] theyreaction bound. can is also implement Sinceτ. With CRNs all of theThe PRNG setting forces us to forego accessing a press, 2014. 1 the general proceduresRoy. Stat. given Soc. Seriesin Ref. C (Applied [71]. Statistics),31(2):188– and L = 1. In our analysis, let’s simply take the outputs ical implementations have[7] beenJ. H. studiedConway. inOn detailare numbers [ Turing-universal69 and, 70RNGs]. games,volume6.IMA,this assumption studied [68 itup] is they not to190 hard this1 can to point. show implement that The if the details distri- all of of the designingsource ofis randomness. not random at The all. input Rather, randomness it is simply reservoir a pulse that in- CRNs are also efficiently1976. Turing-universal1 [71], which bution of input particles[16] L. 0 Blum, and 1 M. is Blum,1/2, 1 and/2 M.then Shub. the A dis- simple unpredictable to be samples of any desired IID process. [8] O. Dowlen. The political potentialCRNs of sortition: for A a study given RNGpseudo-random algorithm{ number} generator. can beSIAM gleaned J. Comput., from dicates that an output should be generated. Thus, h =0 power makes them appealing. One of theirRNGs main appli- studiedtribution up toof output this particles point. 0 The and 1 would details be 3 of/4, 1 designing/4 , is notEven randomb though at a all. PRNG Rather, is supposed it is simply to generate a pulse a ran- that in- of the random selection of citizens for public oce,vol- 15(2):364–383, 1986. { } cations is deterministic function computation [72, 73], respectively. Thus, this CRN gives a physical implemen- and L = 1. In our analysis, let’s simply take the outputs ume 4. Andrews UKCRNs Limited, for2015.the1 a general givenIII. RNG procedures PSEUDORANDOM[17] algorithmM. Mascagni, given can S. in A. Cuccaro, be Ref. NUMBER gleaned [ D.71 V.]. Pryor, from and M. L. dicatesdom number, that an in output reality after should setting be the generated. seed [32, 33 Thus,]it,in h =0 which is what our RNGs[9] need.R. Y. Rubinstein and D. P. Kroese. Simulationtation of and our the original RNG.Robinson.GENERATION A fast, high quality, and reproducible parallel fact, generatesto be samples an exactly of periodic any desired sequence IID of process. outputs. Consider five particle types—0,Monte Carlo 1, A method, B,the and,volume707.JohnWiley&Sons, generalγ—and proceduresUsing Eq. (2) we given canlagged-Fibonacci put in a Ref. lower bound [71 pseudorandom]. on the aver- number generator. J. and L = 1. In our analysis, let’s simply take the outputs a machine consisting of a2011. box1 that can contain them. age heat dissipation perComput. output: PhysicsQ ,119(2):211–2190.478k T . Since Thus, to beEven ab good though PRNG a algorithm PRNG thatis supposed period should to generate a ran- LB ≈ B to be samples of any desired IID process. Particles 0 and 1 can be[10] inputsD. E. Knuth. to orThe outputs Art of from Computer the Programming:derivingSoIII. far, the Semi- webound PSEUDORANDOM abstained[18] doesJ. not Kelsey, invoke from B. Schneier, any von constraints Neumann’s and N. NUMBER Ferguson. over sin Yarrow-160: by as- be relativelydom long number, compared in reality to the sample after setting size of interest. the seed [32, 33]it,in machine and particle γ canNumerical be an output Algorithms from,volume2.Addison-Wesley,Read- the ma- inputsuming or output a source particles, ofNotes therandomness—a on bound the design is a universal and analysis fair lower coin, of the a yarrow biased cryp- EvenAlso,b the though sample a statistics PRNG should is supposed be close to to those generate of the a ran- ing, Massachusetts, second edition, 1981. 1, 7 tographicGENERATION pseudorandom number generator. In Inter- fact, generates an exactly periodic sequence of outputs. chine. “Machine” particles A and B always stay in theIII.boundcoin, PSEUDORANDOM over or any all possible general reaction IID energetics.process. NUMBER That Nevertheless, is, if we modern domdesired number, distribution. in reality This after simply setting means the if seed we estimate [32, 33]it,in machine’s box and are in contact with a thermal reser- find any four particles (molecules) obeying the four reac- Thus, to be a good PRNG algorithm that period should voir. Figure3 shows that the left wall is designed so that tions aboveGENERATION then the bound holds. Naturally, depending fact, generates an exactly periodic sequence of outputs. only input particles (0 and 1) can enter, but no particles Soon far,the reactions’ we abstained energetics, the from CRN-RNG’s von ∆Neumann’sQ can be sin byThus, as- tobe be relatively a good PRNG long compared algorithm to that the periodsample should size of interest. can exit. The right wall is designed so that only outputsumingclose to a or source far from the of bound. randomness—a Since CRNs are Turing- fair coin, a biased Also, the sample statistics should be close to those of the particles (0, 1, and γ) can exit. So far, weuniversal abstained [71] they from can implement von Neumann’s all of the RNGs stud-sin by as- be relatively long compared to the sample size of interest. To get started, assume there is only a singlesuming machinecoin, a sourceied or up toany of this randomness—ageneral point. The IID details process. of designing fair coin, Nevertheless, CRNsa for biased modernAlso, thedesired sample distribution. statistics should This be simply close to means those ofif wethe estimate particle A in the box. Every τ seconds a new input par- a given RNG algorithm can be gleaned from the general ticle, 0 or 1, enters from the left. Now, the particlescoin, reactor anyprocedures general given IID in Ref.process. [72]. Nevertheless, modern desired distribution. This simply means if we estimate 8

III. PSEUDORANDOM NUMBER GENERATION

1 So far, we abstained from von Neumann’s sin by as- suming a source of randomness—a fair coin, a biased … coin, or any general IID process. Nevertheless, modern digital computers generate random numbers using purely deterministic arithmetical methods. This is pseudoran- dom number generation (PRNG). Can these methods be implemented by finite-state machines? Most certainly. The effective memory in these machines is very large, with the algorithms typically allowing the user to spec- … ify the amount of state information used [74]. Indeed, they encourage the use of large amounts of state infor- mation, promising better quality random numbers in the sense that the recurrence time (generator period) is astro- FIG. 4. True general-distribution generator: Emit random nomically large. Our concern, though, is not analyzing samples from an arbitrary probability distribution {pi}, i = their implementations. See Ref. [10] for a discussion of 0, . . . , n − 1 where p1 to pn−1 sorted from large to small. It design methods. We can simply assume they can be im- has one internal state S and inputs and outputs can be 0, 1, plemented or, at least, there exist ones that have been, ..., n − 1. All states have energy zero. The joint states i ⊗ S for i =6 0 have nonzero energies ∆Ei. Heat is transferred only such as the Unix C-library random() function just cited. during the transition from state 0 ⊗ S to states i ⊗ S. Work The PRNG setting forces us to forego accessing a is transferred only during coupling the input bit to machine’s source of randomness. The input randomness reservoir state and decoupling the output bit from machine’s state. is not random at all. Rather, it is simply a pulse that in- dicates that an output should be generated. Thus, h = 0 and L = 1. In our analysis, we can take the outputs to ern PRNGs, is quite high. However, this takes us far be samples of any desired IID process. afield, given our focus on input-output thermodynamic Evenb though a PRNG is supposed to generate a ran- processing costs. dom number, in reality after setting the seed [35, 36] it, in fact, generates an exactly periodic sequence of out- puts. Thus, as just noted, to be a good PRNG algorithm IV. TRUE RANDOM NUMBER GENERATION that period should be relatively long compared to the sample size of interest. Also, the sample statistics should Consider situations in which no random information be close to those of the desired distribution. This means source is explicitly given as with RNGs and none is ap- 0 that if we estimate h from the sample it should be close proximated algorithmically as with PRNGs. This places to the Shannon entropy rate of the target distribution. us in the domain of true random number generators 0 0 However, in reality h = 0 since h is a measure over (TRNGs): randomness is naturally embedded in their infinite-length samples, which in this case are completely substrate physics. For example, a spin one-half quantum nonrandom due to their periodicity. particle oriented in the z+ direction, but measured in x+ − + − This is a key point. When we use PRNGs we are and x directions, gives x and x outcomes with 1/2 and only concerned about samples with comparatively short 1/2 probabilities. More sophisticated random stochas- lengths compared to the PRNG period. However, when tic process generators employing quantum physics have determining PRNG thermodynamics we average over been introduced recently [75–80]. TRNGs have also been asymptotically large samples. As a result, we have based on chaotic lasers [81, 82], metastability in elec- Q = 0 or, equivalently, ∆Q 0. And so, PRNGs LB ≥ tronic circuits [83, 84], and electronic noise [85]. What are potentially heat dissipative processes. Depending on thermodynamic resources do these TRNGs require? We the PRNG algorithm, it may be possible to find machin- address this here via one general construction. ery that achieves the lower bound (zero) or not. To date, True General-Distribution Generator: Consider the no such PRNG implementations have been introduced. general case where we want to generate a sample from Indeed, the relevant energetic cost bounds are domi- an arbitrary probability distribution p . Each time { i} nated by the number of logically irreversible computa- we need a random sample, we feed in 0 and the TRNG tion steps in the PRNG algorithm, following Landauer returns a random sample. Again, the input is a long [39]. This, from a perusal of open source code for mod- sequence 0s and, as a consequence, h = 0. We also 9 have h0 = H[ p ] and L = 1. Equation (2) puts a RNG. This means not only are the random numbers we { i} bound on the dissipated heat and input work: QLB = need being generated, but we also have an engine that k T ln 2 H[ p ]. Noticeb here that Q is a negative absorbs heat from thermal reservoir and converts it to − B { i} LB quantity. This is something that, as we showed above, work. Of course, the amount of work depends on the can never happen for RNG algorithms since they all are distribution of interest. Thus, TRNGs are a potential heat-dissipation positive: QLB > 0. Of course, QLB is win-win strategy. Imagine that at the end of charging a only a lower bound and ∆Q may still be positive. How- battery, one also had a fresh store of random numbers. ever, negative QLB opens the door to producing work Let’s pursue this further. For a given target distri- from heat instead of turning heat to dissipative work—a bution with n elements, we operate n such TRNG ma- functioning not possible for RNG algorithms. chines, all generating the distribution of interest. Any Figure4 shows one example of a physical implemen- of the n elements of the given distribution can be as- tation. The machine has a single state S and the inputs signed to the self-transition p0. This gives freedom in and outputs come from the symbol set 0, 1, , n 1 , our design to choose any of the elements. After choosing { ··· − } all with zero energies. The system is designed so that one, all the others are uniquely assigned to p1 to pn−1 the joint state 0 S has zero energy and the joint states from largest to smallest. Now, if our goal is to pump- ⊗ i S, i > 0, have energy ∆E . Recall that every time in less heat per sample, which of these machines is the ⊗ i we need a random sample we feed a 0 to the TRNG ma- most efficient? Looking closely at Eq. (7), we see that chine. Feeding 0 has no energy cost, since the sum of the amount of heat needed by machine j is proportional energies of states 0 and S is zero and equal to the energy to H( p ) p log p . And so, over all the machines, { i} − | j 2 j| of the state 0 S. Then, putting the system into contact that with the maximum p log p is the minimum- ⊗ | j 2 j| with a thermal reservoir, we have stochastic transitions heat consumer and that with minimum p log p is the | j 2 j| between state 0 S and the other states i S. Tuning the maximum-work producer. ⊗ ⊗ i S 0 S transition probabilities in a fixed time τ to Naturally, there are alternatives to the thermodynamic ⊗ → ⊗ 1 and assuming detailed balance, all the other transition transformations used in Fig.4. One can use a method probabilities are specified by the ∆Eis and, consequently, based on spontaneous irreversible relaxation. Or, one for all i 1, 2, , n 1 , we have p = exp ( β∆E ). can use the approach of changing the Hamiltonian in- ∈ { ··· − } i − i The design has the system start in the joint state 0 S stantaneously and changing it back quasistatically and ⊗ and after time τ with probability pi it transitions to state isothermally [42]. i S. Then the average heat transferred from the sys- Let’s close with a challenge. Now that a machine with ⊗ n−1 tem to the thermal reservoir is p ∆E . Now, negative ∆Q can be identified, we can go further and − i=1 i i independent the current state i S, we decouple the ma- ask if there is a machine that actually achieves the lower ⊗ P chine state S from the target state i. The average work bound QLB. If the answer is yes, then what is that ma- we must pump into the system for this to occur is: chine? We leave the answer for the future.

n−1 ∆W = p ∆E . − i i i=1 V. CONCLUSION X This completes the TRNG specification. In summary, Historically, three major approaches have been em- the average heat ∆Q and the average work ∆W are the n−1 ployed for immediate random number generation: RNG, same and equal to i=1 pi∆Ei. PRNG, and TRNG. RNG itself divides into three inter- Replacing ∆E by k T ln p we have: i P− B i esting problems. First, when we have an IID source, n−1 but we have no knowledge of the source and the goal is to design machinery that generates an unbiased ran- ∆Q = kBT pi ln pi < 0 , (7) i=1 dom number—the von Neumann RNG. Second, when we X have a known IID source generating a uniform distribu- which is consistent with the lower bound tion and the goal is to invent a machine that can generate k T ln 2 H[ p ] given above. Though, as noted − B { i} any distribution of interest—the Knuth and Yao RNG. there, a negative lower bound does not mean that we Third, we have the general case of the second, when the can actually construct a machine with negative ∆Q, in randomness source is known but arbitrary and the goal fact, here is one example of such a machine. Negative is to devise a machine that generates another arbitrary ∆Q leads to an important physical consequence. The distribution—the Roche and Hoshi RNG. For all these operation of a TRNG is a heat-consuming and work- RNGs the overarching concern is to use the minimum producing process, in contrast to the operation of an number of samples from the input source. These ap- 10 proaches to random number generation may seem rather depends on the available resources. similar and to differ only in mathematical strategy and The thermodynamic analysis of the main RNG cleverness. However, the thermodynamic analyses show strategies suggests a number of challenges. Let’s close that they make rather different demands on their physical with several brief questions that hint at several future substrates, on the thermodynamic resources required. directions in the thermodynamics of random number We showed that all RNG algorithms are heat- generation. Given that random number generation is consuming, work-consuming processes. In contrast, such a critical and vital task in modern computing, we showed that TRNG algorithms are heat-consuming, following up on these strike us as quite important. work-producing processes. And, PRNGs lie in between, First, is Szilard’s Engine [86] a TRNG? What are the dissipation neutral (∆Q = 0) in general and so the phys- thermodynamic costs in harvesting randomness? A ical implementation determines the detailed thermody- recent analysis appears to have provided the answers namics. Depending on available resources and what costs [87] and anticipates TRNG’s win-win property. Sec- we want to pay, the designer can choose between these ond, the randomness sources and target distributions three approaches. considered were rather limited compared to the wide The most thermodynamically efficient approach is range of stochastic processes that arise in contemporary TRNG since it generates both the random numbers of experiment and theory. For example, what about the interest and converts heat that comes from the thermal thermodynamics of generating 1/f noise [88]? Nomi- reservoir to work. Implementing a TRNG, however, also nally, this and other complex distributions are associated needs a physical system with inherent stochastic dynam- with infinite memory processes [89]. What are the as- ics that, on their own, can be inefficient depending on the sociated thermodynamic cost bounds? Suggestively, it resources needed. PRNG is the most unreliable method was recently shown that infinite-memory devices can since it ultimately produces periodic sequences instead of actually achieve thermodynamic bounds [90]. Third, real random numbers, but thermodynamically it poten- the random number generation strategies considered tially can be efficient. The RNG approach, though, can here are not secure. However, cryptographically secure only be used given access to a randomness source. It is random number generators have been developed [91]. particularly useful if it has access to nearly free random- What type of physical systems can be used for secure ness source. Thermodynamically, though, it is inefficient TRNG and which are thermodynamically the most since the work reservoir must do work to run the ma- efficient? One suggestion could be superconducting chine, but the resulting random numbers are reliable in nanowires and Josephson junctions near superconduct- contrast to those generated vis a PRNG. ing critical current [92]. Fourth, what are the additional To see how different the RNG and TRNG approaches thermodynamic costs of adding security to RNGs? can be, let’s examine a particular example assuming ac- Finally, there is a substantial quantum advantage when cess to a weakly random IID source with bias p 1 and compressing classical random processes [75]. What are  we want to generate an unbiased sample. We can ig- the thermodynamic consequences of using such quantum nore the randomness source and instead use the TRNG representations for RNGs? method with the machine in Fig.4. Using Eq. (7) on average to produce one sample, the machine absorbs k T p ln p 0 heat from heat reservoir and turn it | B | ≈ into work. Since the required work is very small, this ACKNOWLEDGMENTS approach is resource neutral, meaning that there is no We thank A. Aghamohammadi, M. Anvari, A. B. energy transfer between reservoir and machine. Now, Boyd, R. G. James, M. Khorrami, J. R. Mahoney, and consider the case when we use the RNG approach—the P. M. Riechers for helpful discussions. JPC thanks the von Neumann algorithm. To run the machine and gener- Santa Fe Institute for its hospitality during visits as an ate one symbol, on average the work reservoir needs pro- External Faculty member. This material is based upon vide work energy to the machine. This thermodynamic work supported by, or in part by, the John Templeton cost can be infinitely large depending on how small p is. Foundation and U. S. Army Research Laboratory and the This comparison highlights much different the random U. S. Army Research Office under contracts W911NF-13- number generation approach can be and how is useful 1-0390 and W911NF-13-1-0340.

[1] W. G. Cochran. Sampling Techniques. John Wiley & [2] B. Jerry. Discrete-event System Simulation. Pearson Ed- Sons, 2007.1 ucation India, 1984.1 11

[3] D. R. Stinson. Cryptography: Theory and Practice. CRC [22] G. Chaitin. On the length of programs for computing press, 2005.1 finite binary sequences. J. ACM, 13:145, 1966. [4] R. G. Sargent. Verification and validation of simulation [23] P. Martin-Lof. The definition of random sequences. Info. models. In Proceedings of the 37th conference on Winter Control, 9:602–619, 1966. simulation, pages 130–1431 [24] L. A. Levin. Laws of information conservation (non- [5] J. Stoer and R. Bulirsch. Introduction to Numerical Anal- growth) and aspects of the foundation of probability the- ysis, volume 12 Springer Science & Business Media, 2013. ory. Problemy Peredachi Informatsii, 10:30–35, 1974. 1 Translation: Problems of Information Transmission 10 [6] E. Alpaydin. Introduction to Machine Learning. MIT (1974) 206-210. press, 2014.1 [25] M. Li and P. M. B. Vitanyi. An Introduction to Kol- [7] J. H. Conway. On Numbers and Games, volume 6. IMA, mogorov Complexity and its Applications. Springer- 1976.1 Verlag, New York, 1993.1 [8] O. Dowlen. The Political Potential of Sortition: A study [26] A. N. Kolmogorov. Combinatorial foundations of infor- of the random selection of citizens for public office, vol- mation theory and the calculus of probabilities. Russ. ume 4. Andrews UK Limited, 2015.1 Math. Surveys, 38:29–40, 1983.1 [9] R. Y. Rubinstein and D. P. Kroese. Simulation and the [27] G. F. Knoll. Radiation Detection and Measurement. John Monte Carlo method, volume 707. John Wiley & Sons, Wiley & Sons, 2010.1 2011.1 [28] W. B. Davenport and W. L. Root. Random Signals and [10] D. E. Knuth. The Art of Computer Programming: Semi- Noise. McGraw-Hill New York, 1958.1 Numerical Algorithms, volume 2. Addison-Wesley, Read- [29] N. Yoshida, R. K. Sheth, and A. Diaferio. Non-gaussian ing, Massachusetts, second edition, 1981.1,8 cosmic microwave background temperature fluctuations [11] C. D. Motchenbacher and F. C. Fitchen. Low-Noise Elec- from peculiar velocities of clusters. Monthly Notices Roy. tronic Design. John Wiley & Sons, New York, 1973.1 Astro. Soc., 328(2):669–677, 2001.1 [12] B. Mende, L. C. Noll, and S. Sisodiya. SGI classic [30] T. Jennewein, U. Achleitner, G. Weihs, H. Weinfurter, lavarand™. US Patent #5,732,138, 1996.1 and A. Zeilinger. A fast and compact quantum random [13] M. G. Kendall and B. B. Smith. Randomness and random number generator. Rev. Sci. Instr., 71(4):1675–1680, sampling numbers. J. Roy. Stat. Soc., 101(1):147–166, 2000.1 1938.1 [31] A. Stefanov, N. Gisin, O. Guinnard, L. Guinnard, and [14] D. H. Lehmer. Mathematical methods in large-scale com- H. Zbinden. Optical quantum random number generator. puting units. In Proc. 2nd Symp. on Large-Scale Digital J. Mod. Optics, 47(4):595–598, 2000. Calculating Machinery, pages 141–146. Harvard Univer- [32] A. Acin and L. Masanes. Certified randomness in quan- sity Press, Cambridge, MA, 1951.1 tum physics. Nature, 540:213–219, 2016.1 [15] B. A. Wichmann and I. D. Hill. Algorithm AS 183: An ef- [33] A. Brandstater, J. Swift, Harry L. Swinney, A. Wolf, ficient and portable pseudo-random number generator. J. J. D. Farmer, E. Jen, and J. P. Crutchfield. Low- Roy. Stat. Soc. Series C (Applied Statistics), 31(2):188– dimensional chaos in a hydrodynamic system. Phys. Rev. 190, 1982.1 Lett., 51:1442, 1983.1 [16] L. Blum, M. Blum, and M. Shub. A simple unpredictable [34] M. Stipˇcevi´cand K. C¸. Ko¸c.True random number gen- pseudo-random number generator. SIAM J. Comput., erators. In Open Problems in Mathematics and Compu- 15(2):364–383, 1986. tational Science, pages 275–315. Springer, 2014.1 [17] M. Mascagni, S. A. Cuccaro, D. V. Pryor, and M. L. [35] J. E. Gentle. Random Number Generation and Monte Robinson. A fast, high quality, and reproducible parallel Carlo Methods. Springer Science & Business Media, New lagged-Fibonacci pseudorandom number generator. J. York, 2013.1,8 Comput. Physics, 119(2):211–219 [36] R. Y. Rubinstein and B. Melamed. Modern Simulation [18] J. Kelsey, B. Schneier, and N. Ferguson. Yarrow-160: and Modeling, volume 7. Wiley New York, 1998.1,8 Notes on the design and analysis of the yarrow cryp- [37] J. V. Neumann. Various techniques used in connection tographic pseudorandom number generator. In Inter- with random digits. In Notes by G. E. Forsythe, vol- national Workshop on Selected Areas in Cryptography, ume 12, pages 36–38. National Bureau of Standards Ap- pages 13–33. Springer, 1999. plied Math Series, 1963.1,5 [19] G. Marsaglia. Xorshift RNGs. J. Stat. Software, 8(14):1– [38] L. Devroye. Sample-based non-uniform random variate 6, 2003. generation. In Proceedings of the 18th conference on Win- [20] J. K. Salmon, M. A. Moraes, R. O. Dror, and D. E. Shaw. ter simulation, pages 260–265. ACM, 1986.2 Parallel random numbers: As easy as 1, 2, 3. In 2011 In- [39] R. Landauer. Irreversibility and heat generation in the ternational Conference for High Performance Comput- computing process. IBM J. Res. Develop., 5(3):183–191, ing, Networking, Storage and Analysis (SC), pages 1–12. 1961.2,8 IEEE, 2011.1 [40] C. H. Bennett. Thermodynamics of computation - a re- [21] A. N. Kolmogorov. Three approaches to the concept of view. Intl. J. Theo. Phys., 21:905, 1982.2 the amount of information. Prob. Info. Trans., 1:1, 1965. [41] C. Jarzynski. Equalities and inequalities: irreversibility 1 and the second law of thermodynamics at the nanoscale. 12

Ann. Rev. Cond. Matter Physics, 2(1):329–351 2011.2 [62] J. R. Roche. Efficient generation of random variables [42] J. M. R. Parrondo, J .M. Horowitz, and T. Sagawa. Ther- from biased coins. In Information Theory, Proc. 1991 modynamics of information. Nature Physics, 11(2):131– IEEE Intl. Symp., pages 169–169, 1991.6 139, 2015.2,9 [63] M. Hoshi. Interval algorithm for random number gener- [43] T. Sagawa. Thermodynamics of information processing ation. IEEE Trans. Info. Th., 43(2):599–611, 1997.6 in small systems. Prog. Theo. Physics, 127:1–56, 2012.2 [64] O. N. Temkin, A. V. Zeigarnik, and D. G. Bonchev. [44] T. M. Fiola, J. Preskill, A. Strominger, and S. P. Trivedi. Chemical reaction networks: A graph-theoretical ap- Black hole thermodynamics and information loss in two proach. CRC Press, 1996.7 dimensions. Phys. Rev. D, 50(6):3987, 1994.2 [65] M. Cook, D. Soloveichik, E. Winfree, and J. Bruck. Pro- [45] S. Das. Black-hole thermodynamics: Entropy, informa- grammability of chemical reaction networks. In Algorith- tion and beyond. Pramana, 63(4):797–815, 2004.2 mic Bioprocesses, pages 543–584. Springer, 2009.7 [46] A. Berut, A. Arakelyan, A. Petrosyan, S. Ciliberto, [66] H. Jiang, M. D. Riedel, and K. K. Parhi. Digital signal R. Dillenschneider, and E. Lutz. Experimental verifi- processing with molecular reactions. IEEE Design and cation of Landauer’s principle linking information and Testing of Computers, 29(3):21–31, 2012.7 thermodynamics. Nature, 483:187, 2012.2 [67] M. O. Magnasco. Chemical kinetics is turing universal. [47] S. Toyabe, T. Sagawa, M. Ueda, E. Muneyuki, and Phys. Rev. Lett., 78(6):1190, 1997.7 M. Sano. Experimental demonstration of information- [68] A. Hjelmfelt, E. D. Weinberger, and J. Ross. Chemical to-energy conversion and validation of the generalized implementation of neural networks and turing machines. Jarzynski equality. Nat. Physics, 6:988–992, 2010.2 Proc. Natl. Acad. Sci. USA, 88(24):10983–10987 1991.7 [48] K. Maruyama, F. Nori, and V. Vedral. Colloquium: The [69] L. Cardelli. Strand algebras for DNA computing. Natural physics of Maxwell’s demon and information. Rev. Mod. Computing, 10(1):407–428, 2011.7 Physics, 81:1, 2009.2 [70] D. Soloveichik, G. Seelig, and E. Winfree. DNA as a uni- [49] K. Sekimoto. Stochastic Energetics, volume 799. versal substrate for chemical kinetics. Proc. Natl. Acad. Springer, New York, 2010. Sci., 107(12):5393–5398, 2010.7 [50] U. Seifert. Stochastic thermodynamics, fluctuation the- [71] D. Soloveichik, M. Cook, E. Winfree, and J. Bruck. orems and molecular machines. Reports Prog. Physics, Computation with finite stochastic chemical reaction net- 75(12):126001, 2012.2 works. Natural Computing, 7(4):615–6337 [51] P. Diaconis, S. Holmes, and R. Montgomery. Dynamical [72] H. Chen, D. Doty, and D. Soloveichik. Deterministic bias in the coin toss. SIAM Review, 49(2):211–235, 2007. function computation with chemical reaction networks. 2 Natural computing, 13(4):517–534, 2014.7 [52] P. Elias. The efficient construction of an unbiased random [73] D. Doty and M. Hajiaghayi. Leaderless determinis- sequence. Ann. Math. Stat., pages 865–870, 1972.2 tic chemical reaction networks. Natural Computing, [53] Y. Peres. Iterating von Neumann’s procedure for extract- 14(2):213–223, 2015.7 ing random bits. Ann. Statistics, 20(1):590–597, 1992. [74] Unix Berkeley Software Distribution. Random(3). BSD [54] D. Romik. Sharp entropy bounds for discrete statistical Library Functions Manual, 2016.8 simulation. Stat. Prob. Lett., 42(3):219–227, 1999.2,6 [75] J. R. Mahoney, C. Aghamohammadi, and J. P. Crutch- [55] D. Mandal and C. Jarzynski. Work and information pro- field. Occam’s quantum strop: Synchronizing and com- cessing in a solvable model of Maxwell’s demon. Proc. pressing classical cryptic processes via a quantum chan- Natl. Acad. Sci. USA, 109(29):11641–11645, 2012.3 nel. Scientific Reports, 6:20495, 2016.8, 10 [56] A. C. Barato and U. Seifert. An autonomous and re- [76] C. Aghamohammadi, J. R. Mahoney, and J. P. Crutch- versible Maxwell’s demon. Europhys. Lett., 101:60001, field. Extreme quantum advantage when simulat- 2013. ing strongly coupled classical systems. arXiv preprint [57] A. B. Boyd, D. Mandal, and J. P. Crutchfield. Identifying arXiv:1609.03650, 2016. functional thermodynamics in autonomous Maxwellian [77] M. Gu, K. Wiesner, E. Rieper, and V. Vedral. Quantum ratchets. New J. Physics, 18:023049, 2016.3,5 mechanics can reduce the complexity of classical models. [58] T. M. Cover and J. A. Thomas. Elements of Information Nature Comm., 3:762, 2012. Theory. Wiley-Interscience, New York, second edition, [78] R. Tan, D. R. Terno, J. Thompson, V. Vedral, and 2006.3,4,6 M. Gu. Towards quantifying complexity with quantum [59] H. R. Lewis and C. H. Papadimitriou. Elements of the mechanics. Euro. Phys. J. Plus, 129(9):1–12 Theory of Computation. Prentice-Hall, Englewood Cliffs, [79] C. Aghamohammadi, J. R. Mahoney, and J. P. Crutch- N.J., second edition, 1998.5 field. The ambiguity of simplicity. Physics Lett. A, in [60] L. Trevisan and S. Vadhan. Extracting randomness press, 2016. arXiv preprint arXiv:1602.08646. from samplable distributions. In Foundations of Com- [80] P. M. Riechers, J. R. Mahoney, C. Aghamohammadi, puter Science, 2000. Proceedings. 41st Annual Sympo- and J. P. Crutchfield. Minimized state complexity sium, pages 32–42. IEEE, 2000.5 of quantum-encoded cryptic processes. Phys. Rev. A, [61] D. E. Knuth and A. C. Yao. The complexity of nonuni- 93(5):052317, 2016.8 form random number generation. pages 357–428. Aca- [81] A. Uchida, K. Amano, M. Inoue, K. Hirano, S. Naito, demic Press, New York, 1976.5,6 Hiroyuki Someya, Isao Oowada, T. Kurashige, M. Shiki, 13

and S. Yoshimori. Fast physical random bit generation Z. Phys., 53:840–856, 1929. 10 with chaotic semiconductor lasers. Nature Photonics, [87] A. B. Boyd and J. P. Crutchfield. Demon dynamics: De- 2(12):728–732, 2008.8 terministic chaos, the Szilard map, and the intelligence [82] I. Kanter, Y. Aviad, I. Reidler, E. Cohen, and M. Rosen- of thermodynamic systems. Phys. Rev. Lett., 116:190601, bluh. An optical ultrafast random bit generator. Nature 2016. 10 Photonics, 4(1):58–61, 2010.8 [88] W. H. Press. Flicker noises in astronomy and elsewhere. [83] D. J. Kinniment and E. G. Chester. Design of an on- Comments on Astrophysics, 7(4):103–119, 1978. 10 chip random number generator using metastability. In [89] S. Marzen and J. P. Crutchfield. Statistical signatures of Solid-State Circuits Conference, 2002. ESSCIRC 2002. structural organization: The case of long memory in re- Proceedings of the 28th European, pages 595–598. IEEE, newal processes. Phys. Lett. A, 380(17):1517–1525, 2016. 2002.8 10 [84] C. Tokunaga, D. Blaauw, and T. Mudge. True random [90] A. B. Boyd, D. Mandal, and J. P. Crutchfield. Leverag- number generator with a metastability-based quality con- ing environmental correlations: The thermodynamics of trol. IEEE J. Solid-State Circuits, 43(1):78–85, 2008.8 requisite variety. 2016. arxiv.org:1609.05353. 10 [85] M. Epstein, L. Hars, R. Krasinski, M. Rosner, and [91] C. Easttom. Modern Cryptography: Applied Mathematics H. Zheng. Design and implementation of a true random for Encryption and Information Security. McGraw-Hill number generator based on digital circuit artifacts. In Education, New York, 2015. 10 International Workshop on Cryptographic Hardware and [92] M. Foltyn and M. Zgirski. Gambling with superconduct- Embedded Systems, pages 152–165. Springer, 2003.8 ing fluctuations. Phys. Rev. App., 4(2):024002, 2015. 10 [86] L. Szilard. On the decrease of entropy in a thermody- namic system by the intervention of intelligent beings.