
arXiv:1712.04311v6 [cond-mat.stat-mech] 7 May 2018

Stochastic thermodynamic interpretation of information geometry

Sosuke Ito
RIES, Hokkaido University, N20 W10, Kita-ku, Sapporo, Hokkaido 001-0020, Japan
(Dated: May 8, 2018)

In recent years, the unified theory of information and thermodynamics has been intensively discussed in the context of stochastic thermodynamics. The unified theory reveals that information theory would be useful to understand the non-stationary dynamics of systems far from equilibrium. In this letter, we have found a new link between stochastic thermodynamics and information theory well known as information geometry. By applying this link, an information geometric inequality can be interpreted as a thermodynamic uncertainty relationship between speed and thermodynamic cost. We have numerically applied this inequality to a thermodynamic model of a biochemical enzyme reaction.

PACS numbers: 02.40.-k, 05.20.-y, 05.40.-a, 05.70.Ln, 89.70.-a

The crucial relationship between thermodynamics and information theory has been studied in the last decades [1]. Historically, thermodynamic-informational links had been well discussed in the context of the thermodynamics of Maxwell's demon and the paradox of the second law [2]. Recently, several studies have newly revealed thermodynamic interpretations of informational quantities such as the Kullback-Leibler divergence [3], the mutual information [4-6], and the transfer entropy [7-19]. The above informational interpretations of entropy production are based on the theory of stochastic thermodynamics [20, 21], which mainly focuses on the stochastic dynamics of small systems far from equilibrium.

Information thermodynamics has attracted attention not only in terms of Maxwell's demon, but also in terms of geometric interpretations of thermodynamics [22-28]. Indeed, differential geometric interpretations of thermodynamics have also been discussed, especially in a near-equilibrium system [22, 29-34]. Moreover, the technique of information theory well known as information geometry [35], a differential geometric theory of probability distributions, has received remarkable attention in the fields of neuroscience, signal processing, quantum mechanics, and machine learning [36-38]. In spite of the deep connection between information and thermodynamics, the direct link between thermodynamics and information geometry has been elusive, especially for non-stationary non-equilibrium dynamics. For example, G. E. Crooks discovered a link between thermodynamics and information geometry [22, 32], but his discussion is based on the Gibbs ensemble and is then only valid for a near-equilibrium system.

In this letter, we discover a fundamental link between stochastic thermodynamics and information geometry. We mainly report two inequalities for the master equation, derived thanks to information geometry, and interpret them within the theory of stochastic thermodynamics. The first inequality connects the environmental entropy change rate to the change of the local thermodynamic force rate. The second inequality can be interpreted as a kind of thermodynamic uncertainty relationship [39-51], or a trade-off relationship between the speed of a transition from one state to another and the thermodynamic cost, related to the entropy change of the thermal baths and the system in a near-equilibrium system. We numerically illustrate these two inequalities on a model of biochemical enzyme reaction.

Stochastic thermodynamics.– To clarify a link between stochastic thermodynamics and information geometry, we here start with the formalism of stochastic thermodynamics for the master equation [20, 21], that is also known as the Schnakenberg network theory [52, 53]. We here consider a (n+1)-states system. We assume that transitions between states are induced by n_bath multiple thermal baths. The master equation for the probability p_x to find the state x ∈ {x_0, x_1, ..., x_n} at time t is given by

  \frac{dp_x}{dt} = \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} W^{(\nu)}_{x' \to x} p_{x'},   (1)

where W^{(\nu)}_{x' \to x} is the transition rate from the state x' to the state x (x ≠ x') induced by the \nu-th thermal bath. We assume a non-zero value of the transition rate, W^{(\nu)}_{x' \to x} > 0, for any x ≠ x'. The conservation of probability \sum_{x=0}^{n} p_x = 1 leads to

  \sum_{x=0}^{n} W^{(\nu)}_{x' \to x} = 0,   (2)

or equivalently W^{(\nu)}_{x' \to x'} = -\sum_{x \neq x'} W^{(\nu)}_{x' \to x}. This equation (2) indicates that the master equation (1) can be rewritten in terms of the thermodynamic flux

  J^{(\nu)}_{x' \to x} := W^{(\nu)}_{x' \to x} p_{x'} - W^{(\nu)}_{x \to x'} p_x   (3)

from the state x' to the state x induced by the \nu-th thermal bath as

  \frac{dp_x}{dt} = \sum_{\nu=1}^{n_{\rm bath}} \sum_{x' \neq x} J^{(\nu)}_{x' \to x}.   (4)
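As a concrete illustration of Eqs. (1)-(4), the following minimal Python sketch integrates the master equation for a hypothetical two-state, single-bath system (the rate values are placeholder choices, not taken from the letter) and checks probability conservation and the flux decomposition of Eq. (4):

```python
import numpy as np

# Toy two-state, single-bath system illustrating Eqs. (1)-(4).
# Off-diagonal rates are arbitrary placeholders, not taken from the letter.
W = np.array([[0.0, 2.0],    # W[x, x'] = transition rate x' -> x
              [1.0, 0.0]])
# Eq. (2): diagonal elements are fixed by probability conservation,
# W[x', x'] = -sum_{x != x'} W[x, x'], so every column of W sums to zero.
W[np.diag_indices(2)] = -W.sum(axis=0)

def flux(W, p):
    """Thermodynamic flux J_{x'->x} of Eq. (3), returned as a matrix J[x, x']."""
    return W * p[None, :] - W.T * p[:, None]

p = np.array([0.5, 0.5])
dt = 1e-4
for _ in range(100000):
    p = p + dt * (W @ p)      # Euler step of the master equation (1)

assert abs(p.sum() - 1.0) < 1e-9      # conservation of probability
# Eq. (4): dp_x/dt equals the sum of incoming fluxes J_{x'->x}.
print(np.allclose(W @ p, flux(W, p).sum(axis=1)))
```

Because the columns of the rate matrix sum to zero by Eq. (2), the Euler update preserves the normalization of p, and the long-time limit relaxes to the stationary state where all fluxes vanish.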

If dynamics are reversible (i.e., J^{(\nu)}_{x' \to x} = 0 for any x, x' and \nu), the system is said to be in thermodynamic equilibrium. If we consider the conjugated thermodynamic force

  F^{(\nu)}_{x' \to x} := \ln[W^{(\nu)}_{x' \to x} p_{x'}] - \ln[W^{(\nu)}_{x \to x'} p_x],   (5)

thermodynamic equilibrium is equivalently given by F^{(\nu)}_{x' \to x} = 0 for any x, x' and \nu.

In stochastic thermodynamics [21], we treat the entropy change of the thermal bath and the system in a stochastic way. In the transition from x' to x, the stochastic entropy change of the \nu-th thermal bath is defined as

  \Delta\sigma^{{\rm bath}(\nu)}_{x' \to x} := \ln \frac{W^{(\nu)}_{x' \to x}}{W^{(\nu)}_{x \to x'}},   (6)

and the stochastic entropy change of the system is defined as the stochastic Shannon entropy change

  \Delta\sigma^{\rm sys}_{x' \to x} := \ln p_{x'} - \ln p_x,   (7)

respectively. The thermodynamic force is then given by the sum of the entropy changes in the transition from x' to x induced by the \nu-th thermal bath, F^{(\nu)}_{x' \to x} = \Delta\sigma^{{\rm bath}(\nu)}_{x' \to x} + \Delta\sigma^{\rm sys}_{x' \to x}. This fact implies that the system is in equilibrium if the sum of the entropy changes is zero for any transition.

The total entropy production rate \dot{\Sigma}^{\rm tot} is given by the sum of the products of thermodynamic forces and fluxes over possible transitions. To simplify notations, we introduce the set of directed edges E = \{(x' \to x, \nu) \mid 0 \le x < x' \le n, 1 \le \nu \le n_{\rm bath}\}, which denotes the set of all possible transitions between two states. The total entropy production rate is then given by

  \dot{\Sigma}^{\rm tot} := \sum_{(x' \to x, \nu) \in E} J^{(\nu)}_{x' \to x} F^{(\nu)}_{x' \to x} = \langle F \rangle,   (8)

where a parenthesis \langle \cdots \rangle is defined as \langle A \rangle := \sum_{(x' \to x, \nu) \in E} J^{(\nu)}_{x' \to x} A^{(\nu)}_{x' \to x} for any function of edge A^{(\nu)}_{x' \to x}. Because the signs of the thermodynamic force F^{(\nu)}_{x' \to x} and the flux J^{(\nu)}_{x' \to x} are the same, the total entropy production rate is non-negative,

  \langle F \rangle = \langle \Delta\sigma^{\rm bath} \rangle + \langle \Delta\sigma^{\rm sys} \rangle \ge 0,   (9)

that is well known as the second law of thermodynamics.

FIG. 1: (color online). Schematic of information geometry on the manifold S_2. The manifold S_2 leads to the sphere surface of radius 2 (see also SI). The statistical length \mathcal{L} is bounded by the shortest length \mathcal{D} = 2\theta = 2\cos^{-1}(r_{\rm ini} \cdot r_{\rm fin}).

Information geometry.– Next, we introduce the information theory well known as information geometry [35]. In this letter, we only consider the discrete distribution group p = (p_0, p_1, \ldots, p_n), p_x \ge 0, \sum_{x=0}^{n} p_x = 1. This discrete distribution group gives the n-dimensional manifold S_n, because the discrete distribution is given by n+1 parameters (p_0, p_1, \ldots, p_n) under the constraint \sum_{x=0}^{n} p_x = 1. To introduce a geometry on the manifold S_n, we conventionally consider the Kullback-Leibler divergence [55] between two distributions p and p' = (p'_0, p'_1, \ldots, p'_n), defined as

  D_{\rm KL}(p \| p') := \sum_{x=0}^{n} p_x \ln \frac{p_x}{p'_x}.   (10)

The square of the line element ds is defined as the second-order Taylor series of the Kullback-Leibler divergence,

  ds^2 := \sum_{x=0}^{n} \frac{(dp_x)^2}{p_x} = 2 D_{\rm KL}(p \| p + dp),   (11)

where dp = (dp_0, dp_1, \ldots, dp_n) is the infinitesimal displacement that satisfies \sum_{x=0}^{n} dp_x = 0. This square of the line element is directly related to the Fisher information metric [54] (see also Supplementary Information (SI)).

The manifold S_n leads to the geometry of the n-sphere surface of radius 2 (see also Fig. 1), because the square of the line element is also given by ds^2 = \sum_{x=0}^{n} (2 dr_x)^2 under the constraint r \cdot r = \sum_{x} (\sqrt{p_x})^2 = 1, where r is the unit vector defined as r = (r_0, r_1, \ldots, r_n) := (\sqrt{p_0}, \sqrt{p_1}, \ldots, \sqrt{p_n}) and \cdot denotes the inner product. The statistical length [56, 57]

  \mathcal{L} := \int ds = \int \frac{ds}{dt}\, dt   (12)

from the initial state r_{\rm ini} to the final state r_{\rm fin} is then bounded by

  \mathcal{L} \ge 2\cos^{-1}(r_{\rm ini} \cdot r_{\rm fin}) := \mathcal{D}(r_{\rm ini}; r_{\rm fin}),   (13)

because \mathcal{D}(r_{\rm ini}; r_{\rm fin}) = 2\theta is the shortest length between r_{\rm ini} and r_{\rm fin} on the n-sphere surface of radius 2, where \theta is the angle between r_{\rm ini} and r_{\rm fin} given by the inner product r_{\rm ini} \cdot r_{\rm fin} = \cos\theta.
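The definitions (10)-(13) can be made concrete with a short numerical sketch. The interpolation path below is a hypothetical example (any smooth path on the simplex would do), not a model from the letter:

```python
import numpy as np

# Eq. (11): ds^2 agrees with 2 D_KL(p || p + dp) to leading order.
p = np.array([0.5, 0.3, 0.2])
dp = np.array([1e-4, -0.5e-4, -0.5e-4])          # sum(dp) = 0
ds2 = np.sum(dp**2 / p)                          # line element, Eq. (11)
kl = np.sum(p * np.log(p / (p + dp)))            # Kullback-Leibler, Eq. (10)
assert abs(2.0 * kl / ds2 - 1.0) < 1e-3

def shortest_length(p_ini, p_fin):
    """D(r_ini; r_fin) = 2 arccos(r_ini . r_fin) of Eq. (13), r_x = sqrt(p_x)."""
    r = np.sum(np.sqrt(p_ini * p_fin))
    return 2.0 * np.arccos(np.clip(r, -1.0, 1.0))

# A hypothetical linear path between two distributions (not a geodesic).
p_ini = np.array([0.8, 0.1, 0.1])
p_fin = np.array([0.2, 0.3, 0.5])
L = 0.0
prev = p_ini.copy()
for t in np.linspace(0.0, 1.0, 20001)[1:]:
    cur = (1.0 - t) * p_ini + t * p_fin
    L += np.sqrt(np.sum((cur - prev)**2 / prev))  # accumulate ds, Eq. (12)
    prev = cur

D = shortest_length(p_ini, p_fin)
print(L >= D)   # Eq. (13); equality would require a geodesic path
```

Since the linear interpolation is not a great circle in the \sqrt{p_x} coordinates, the statistical length strictly exceeds the shortest length here.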
Stochastic thermodynamics of information geometry.– We here discuss a relationship between the line element and conventional observables of stochastic thermodynamics, which gives a stochastic thermodynamic interpretation of information geometric quantities.

By using the master equation (1) and the definitions of the line element and the thermodynamic quantities, Eqs. (5), (6) and (11), we obtain stochastic thermodynamic expressions of ds^2/dt^2 (see also SI),

  \frac{ds^2}{dt^2} = \sum_{x=0}^{n} p_x \frac{d}{dt}\left( -\frac{1}{p_x}\frac{dp_x}{dt} \right)   (14)

  = -\sum_{x=0}^{n} p_x \frac{d}{dt}\left( \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} W^{(\nu)}_{x \to x'} e^{-F^{(\nu)}_{x \to x'}} \right)   (15)

  = \left\langle \frac{d\Delta\sigma^{\rm bath}}{dt} \right\rangle - \left\langle \frac{dF}{dt} \right\rangle.   (16)

Equation (15) implies that the geometric dynamics are driven by the thermodynamic factor \exp[-F^{(\nu)}_{x \to x'}], which is well discussed in the context of stochastic thermodynamics (especially in the context of the fluctuation theorem [58-63]). The time evolution of the line element ds^2/dt^2 is directly related to the expected value of the time derivative of the rate-weighted thermodynamic factor W^{(\nu)}_{x \to x'} e^{-F^{(\nu)}_{x \to x'}}.

Due to the non-negativity of the square of the line element, ds^2/dt^2 \ge 0, we have a thermodynamic inequality

  \left\langle \frac{d\Delta\sigma^{\rm bath}}{dt} \right\rangle \ge \left\langle \frac{dF}{dt} \right\rangle.   (17)

The equality holds if the system is in a stationary state, i.e., dp_x/dt = 0 for any x. This result (17) implies that the change of the thermodynamic force rate is transferred to the environmental entropy change rate. The difference \langle d\Delta\sigma^{\rm bath}/dt \rangle - \langle dF/dt \rangle \ge 0 can be interpreted as a loss in the entropy change rate transfer due to the non-stationarity. If the environmental entropy change does not change in time (i.e., d\Delta\sigma^{{\rm bath}(\nu)}_{x' \to x}/dt = 0 for any x', x and \nu), the thermodynamic force change tends to decrease (i.e., \langle dF/dt \rangle \le 0) in a transition. We stress that this mathematical property of the thermodynamic force is different from the second law of thermodynamics, \langle F \rangle \ge 0.

From Eq. (16), the statistical length \mathcal{L} = \int_{0}^{\tau} dt\, (ds/dt) from time t = 0 to t = \tau is given by

  \mathcal{L} = \int_{t=0}^{t=\tau} dt \sqrt{ \left\langle \frac{d\Delta\sigma^{\rm bath}}{dt} \right\rangle - \left\langle \frac{dF}{dt} \right\rangle }.   (18)

We then obtain the following thermodynamic inequality from Eqs. (13) and (18),

  \int_{t=0}^{t=\tau} dt \sqrt{ \left\langle \frac{d\Delta\sigma^{\rm bath}}{dt} \right\rangle - \left\langle \frac{dF}{dt} \right\rangle } \ge \mathcal{D}(r(0); r(\tau)).   (19)

The equality holds if the path of the transient dynamics is a geodesic line on the manifold S_n. This inequality gives a geometric constraint on the entropy change rate transfer in a transition between two probability distributions p(0) and p(\tau).

The expression Eq. (16) also gives a stochastic thermodynamic interpretation of information geometry, especially in the case of a near-equilibrium system. The condition of an equilibrium system is given by F^{(\nu)}_{x' \to x} = 0 for any x', x and \nu. Then, the square of the line element is given by the entropy change in the thermal baths, ds^2 \simeq \langle d\Delta\sigma^{\rm bath} \rangle dt, in a near-equilibrium system. For example, in a near-equilibrium system, the distribution is assumed to be the canonical distribution p_x = \exp(\beta(\phi - H_x)), where \phi := -\beta^{-1} \ln[\sum_{x=0}^{n} \exp(-\beta H_x)] is the Helmholtz free energy, \beta is the inverse temperature, and H_x is the Hamiltonian of the system in the state x. To consider a near-equilibrium transition, we assume that \beta and H_x can depend on time. From ds^2 = [\langle d\Delta\sigma^{\rm bath} \rangle - \langle dF \rangle] dt = -\langle d\Delta\sigma^{\rm sys} \rangle dt, we obtain ds^2 = -\langle d\Delta\sigma^{\rm sys} \rangle dt = -\langle d(\beta\Delta H) \rangle dt in a near-equilibrium system, where \Delta H_{x' \to x} := H_x - H_{x'} is the Hamiltonian change from the state x' to x. Because -\beta\Delta H can be considered as the entropy change of the thermal bath \Delta\sigma^{\rm bath}, the expression ds^2 = -\langle d(\beta\Delta H) \rangle dt for the canonical distribution is consistent with the near-equilibrium expression ds^2 \simeq \langle d\Delta\sigma^{\rm bath} \rangle dt.

We also discuss the second-order expansion of ds^2/dt^2 for the thermodynamic force in SI, based on the linear irreversible thermodynamics [52]. Our discussion implies that the square of the line element (or the Fisher information metric) for the thermodynamic forces is related to the Onsager coefficients. Due to the Cramér-Rao bound [54, 55], the Onsager coefficients are directly connected to a lower bound of the variance of an unbiased estimator for parameters driven by the thermodynamic force.

Thermodynamic uncertainty.– We finally reach a thermodynamic uncertainty relationship between speed and thermodynamic cost. We here consider the action \mathcal{C} := (1/2)\int_{t=0}^{t=\tau} dt\, (ds^2/dt^2) from time t = 0 to t = \tau. From Eq. (16), the action is given by

  \mathcal{C} = \frac{1}{2} \int_{t=0}^{t=\tau} dt \left[ \left\langle \frac{d\Delta\sigma^{\rm bath}}{dt} \right\rangle - \left\langle \frac{dF}{dt} \right\rangle \right].   (20)

Especially in the case of a near-equilibrium system, the action is given by \mathcal{C} \simeq \int \langle d\Delta\sigma^{\rm bath} \rangle / 2. If we assume the canonical distribution, we have \mathcal{C} = -\int \langle d(\beta\Delta H) \rangle / 2. Even for a system far from equilibrium, we can consider the action as a total amount of the loss in the entropy change rate transfer. Therefore, the action can be interpreted as a thermodynamic cost.

Due to the Cauchy-Schwarz inequality \int_0^{\tau} dt \int_0^{\tau} dt\, (ds/dt)^2 \ge \left( \int_0^{\tau} dt\, (ds/dt) \right)^2 [22], we obtain a thermodynamic uncertainty relationship between the transition time \tau and the thermodynamic cost \mathcal{C},

  \tau \ge \frac{\mathcal{L}^2}{2\mathcal{C}}.   (21)
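The identity (16) can be checked numerically. The sketch below uses a hypothetical two-state, single-bath model with time-dependent rates (the rate functions are placeholder choices, not from the letter), evolves the master equation with an Euler step, and compares ds^2/dt^2 against \langle d\Delta\sigma^{\rm bath}/dt \rangle - \langle dF/dt \rangle by finite differences:

```python
import numpy as np

def rates(t):
    """Hypothetical time-dependent rates; w01 is 0 -> 1, w10 is 1 -> 0."""
    return 1.0 + 0.5 * np.sin(t), 2.0 + 0.3 * np.cos(t)

dt = 1e-5
p = np.array([0.7, 0.3])
t = 0.0
for _ in range(50000):            # evolve to a generic non-stationary point
    w01, w10 = rates(t)
    dp = np.array([-w01 * p[0] + w10 * p[1],
                    w01 * p[0] - w10 * p[1]])
    p = p + dt * dp
    t += dt

def edge_quantities(t, p):
    w01, w10 = rates(t)
    J = w01 * p[0] - w10 * p[1]                   # flux 0 -> 1, Eq. (3)
    F = np.log(w01 * p[0]) - np.log(w10 * p[1])   # force, Eq. (5)
    sig_bath = np.log(w01 / w10)                  # bath entropy change, Eq. (6)
    return J, F, sig_bath

w01, w10 = rates(t)
dp = np.array([-w01 * p[0] + w10 * p[1], w01 * p[0] - w10 * p[1]])
J, F0, s0 = edge_quantities(t, p)
_, F1, s1 = edge_quantities(t + dt, p + dt * dp)  # one Euler step ahead

lhs = np.sum(dp**2 / p)                           # ds^2/dt^2, Eq. (11)
rhs = J * (s1 - s0) / dt - J * (F1 - F0) / dt     # right-hand side of Eq. (16)
print(abs(lhs - rhs) / lhs < 1e-3)
```

For a single edge the averages \langle \cdots \rangle reduce to the single product J \, dA/dt, which makes the check transparent; the small mismatch is the finite-difference error.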

FIG. 2: (color online). Numerical calculation of thermodynamic quantities in the three states model of enzyme reaction. We numerically show the non-negativity ds^2/dt^2 \ge 0 and ds^2/dt^2 = -\langle dF/dt \rangle + \langle d\Delta\sigma^{\rm bath}/dt \rangle in the graph. We also show the total entropy change rate \langle F \rangle \ge 0. We note that d\langle F \rangle/dt is not equal to \langle dF/dt \rangle.

FIG. 3: (color online). Numerical calculation of the thermodynamic uncertainty relationship in the three states model of enzyme reaction. We numerically show the geometric inequality \mathcal{L} \ge \mathcal{D}(r(0); r(\tau)), the thermodynamic uncertainty relationship \tau \ge \mathcal{L}^2/(2\mathcal{C}) \ge [\mathcal{D}(r(0); r(\tau))]^2/(2\mathcal{C}), and the efficiency \eta in the graph.

The equality holds if the speed of dynamics ds^2/dt^2 does not depend on time. By using the inequality (13), we also have a weaker bound

  \tau \ge \frac{[\mathcal{D}(r(0); r(\tau))]^2}{2\mathcal{C}}.   (22)

In a transition from r(0) to r(\tau) (\neq r(0)), the thermodynamic cost \mathcal{C} should be large if the transition time \tau is small. In the case of a near-equilibrium system, we have 2\mathcal{C} = \int \langle d\Delta\sigma^{\rm bath} \rangle (or 2\mathcal{C} = -\int \langle d(\beta\Delta H) \rangle), and then the inequality is similar to the quantum speed limit that is discussed in quantum mechanics [37]. We stress that this result is based on stochastic thermodynamics, not on quantum mechanics.

The inequality (22) gives the ratio between the time-averaged thermodynamic cost 2\mathcal{C}/\tau and the square of the velocity on the manifold, ([\mathcal{D}(r(0); r(\tau))]/\tau)^2. Then, this ratio

  \eta := \frac{[\mathcal{D}(r(0); r(\tau))]^2}{2\tau\mathcal{C}}   (23)

quantifies an efficiency for power to speed conversion. Due to the inequality (22) and its non-negativity, the efficiency \eta satisfies 0 \le \eta \le 1, where \eta = 1 (\eta = 0) implies high (low) efficiency.

Three states model of enzyme reaction.– We numerically illustrate the thermodynamic inequalities of information geometry by using a thermodynamic model of biochemical reaction. We here consider a three states model (see also SI) that represents a chemical reaction A + B ⇋ AB with enzyme X,

  A + X ⇋ AX,   (24)
  A + B ⇋ AB,   (25)
  AX + B ⇋ AB + X.   (26)

We here consider the probability distribution over the states x = A, AX, AB. We assume that the system is attached to a single heat bath (n_bath = 1) with inverse temperature \beta. The master equation is given by Eq. (1), where the transition rates are supposed to be

  W^{(1)}_{A \to AX} = k_{AX+}[X], \quad W^{(1)}_{AX \to A} = k_{AX+} e^{-\beta\Delta\mu_{AX}},
  W^{(1)}_{A \to AB} = k_{AB+}[B], \quad W^{(1)}_{AB \to A} = k_{AB+} e^{-\beta\Delta\mu_{AB}},
  W^{(1)}_{AX \to AB} = k_{+}[B], \quad W^{(1)}_{AB \to AX} = k_{+} e^{-\beta\Delta\mu}[X],   (27)

where [X] ([B]) is the concentration of X (B), k_{AX+}, k_{AB+}, and k_{+} are reaction rate constants, and \Delta\mu_{AX}, \Delta\mu_{AB}, and \Delta\mu are the chemical potential differences. In this model, the entropy change of the bath \Delta\sigma^{{\rm bath}(\nu)}_{x' \to x} is given by the chemical potential difference (see also SI) [64].

In a numerical simulation, we set k_{AX+} = k_{AB+} = k_{+} = 1, \beta\Delta\mu_{AX} = 1, \beta\Delta\mu_{AB} = 0.5, and \beta\Delta\mu = 2. We assume that the time evolution of the concentrations is given by [X] = \tan^{-1}(\omega_X t), [B] = \tan^{-1}(\omega_B t) with \omega_X = 1 and \omega_B = 2, which means that the concentrations [X] and [B] perform as control parameters. At time t = 0, we set the initial probability distribution as (p_A, p_{AX}, p_{AB}) = (0.9998, 0.0001, 0.0001).
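With the parameter values quoted above, the model can be simulated directly. The sketch below is a minimal implementation, assuming an Euler discretization with an arbitrarily chosen step size and final time (the integrator details are not specified in the letter), and checks ds^2/dt^2 \ge 0 together with the uncertainty relation (22):

```python
import numpy as np

# Three-states enzyme model of Eq. (27); states ordered (A, AX, AB).
beta_dmu_AX, beta_dmu_AB, beta_dmu = 1.0, 0.5, 2.0
k = 1.0   # k_AX+ = k_AB+ = k_+ = 1

def W_matrix(t):
    X = np.arctan(1.0 * t)   # [X] = arctan(omega_X t), omega_X = 1
    B = np.arctan(2.0 * t)   # [B] = arctan(omega_B t), omega_B = 2
    # W[x, x'] is the transition rate x' -> x
    W = np.array([
        [0.0,   k * np.exp(-beta_dmu_AX), k * np.exp(-beta_dmu_AB)],
        [k * X, 0.0,                      k * np.exp(-beta_dmu) * X],
        [k * B, k * B,                    0.0]])
    W[np.diag_indices(3)] = -W.sum(axis=0)   # Eq. (2)
    return W

dt, tau = 1e-4, 5.0          # step size and horizon are arbitrary choices
p = np.array([0.9998, 0.0001, 0.0001])
r0 = np.sqrt(p)
L = 0.0                      # statistical length, Eq. (12)
C = 0.0                      # action (thermodynamic cost), Eq. (20)
t = 0.0
while t < tau:
    dp = W_matrix(t) @ p
    ds2dt2 = np.sum(dp**2 / p)        # Eqs. (11), (14): always non-negative
    assert ds2dt2 >= 0.0
    L += np.sqrt(ds2dt2) * dt
    C += 0.5 * ds2dt2 * dt
    p = p + dt * dp
    t += dt

D = 2.0 * np.arccos(np.clip(np.dot(r0, np.sqrt(p)), -1.0, 1.0))
print(tau >= L**2 / (2 * C) >= D**2 / (2 * C))   # Eqs. (21), (22)
```

The first inequality in the last line is the Cauchy-Schwarz bound (21), which also holds exactly for the discrete sums; the second follows from \mathcal{L} \ge \mathcal{D}.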

In Fig. 2, we numerically show the inequality \langle d\Delta\sigma^{\rm bath}/dt \rangle \ge \langle dF/dt \rangle. We check that this inequality does not coincide with the second law of thermodynamics, \langle F \rangle \ge 0. We also check the thermodynamic uncertainty relationship \tau \ge \mathcal{L}^2/(2\mathcal{C}) in Fig. 3. Because the path from the initial distribution (p_A, p_{AX}, p_{AB}) = (0.9998, 0.0001, 0.0001) to the final distribution is close to the geodesic line, the thermodynamic uncertainty relationship gives a tight bound of the transition time \tau.

Conclusion.– In this letter, we reveal a link between stochastic thermodynamic quantities (J, F, \Delta\sigma^{\rm sys}, \Delta\sigma^{\rm bath}) and information geometric quantities (ds^2, \mathcal{L}, \mathcal{D}, \mathcal{C}). Because the theory of information geometry is applicable to various fields of science such as neuroscience, signal processing, machine learning and quantum mechanics, this link would help us to understand a thermodynamic aspect of such topics. The trade-off relationship between speed and thermodynamic cost, Eq. (22), would be helpful to understand biochemical reactions and gives a new insight into recent studies of the relationship between information and thermodynamics in biochemical processes [7, 42, 65-69].

ACKNOWLEDGEMENT

I am grateful to Shumpei Yamamoto for discussions of stochastic thermodynamics for the master equation, to Naoto Shiraishi, Keiji Saito, Hal Tasaki, and Shin-Ichi Sasa for discussions of thermodynamic uncertainty relationships, to Schuyler B. Nicholson for discussions of information geometry and thermodynamics, and to Pieter Rein ten Wolde for discussions of thermodynamics in a chemical reaction. I also thank Tamiki Komatsuzaki for acknowledging my major contribution to this work and allowing me to submit this manuscript alone. I mention that, after my submission of the first version of this manuscript on arXiv [70], I heard that Schuyler B. Nicholson independently discovered a similar result to Eq. (16) [71]. I thank Sebastian Goldt, Matteo Polettini, Taro Toyoizumi, and Hiroyasu Tajima for valuable comments on the manuscript. This research is supported by JSPS KAKENHI Grant No. JP16K17780.

[1] Parrondo, J. M., Horowitz, J. M., & Sagawa, T. Thermodynamics of information. Nature Physics, 11(2), 131-139 (2015).
[2] Leff, H. S., & Rex, A. F. (Eds.). Maxwell's demon: entropy, information, computing (Princeton University Press, 2014).
[3] Kawai, R., Parrondo, J. M. R., & Van den Broeck, C. Dissipation: The phase-space perspective. Physical Review Letters, 98(8), 080602 (2007).
[4] Sagawa, T., & Ueda, M. Generalized Jarzynski equality under nonequilibrium feedback control. Physical Review Letters, 104(9), 090602 (2010).
[5] Still, S., Sivak, D. A., Bell, A. J., & Crooks, G. E. Thermodynamics of prediction. Physical Review Letters, 109(12), 120604 (2012).
[6] Sagawa, T., & Ueda, M. Fluctuation theorem with information exchange: role of correlations in stochastic thermodynamics. Physical Review Letters, 109(18), 180602 (2012).
[7] Ito, S., & Sagawa, T. Information thermodynamics on causal networks. Physical Review Letters, 111(18), 180603 (2013).
[8] Hartich, D., Barato, A. C., & Seifert, U. Stochastic thermodynamics of bipartite systems: transfer entropy inequalities and a Maxwell's demon interpretation. Journal of Statistical Mechanics: Theory and Experiment, P02016 (2014).
[9] Hartich, D., Barato, A. C., & Seifert, U. Sensory capacity: An information theoretical measure of the performance of a sensor. Physical Review E, 93(2), 022116 (2016).
[10] Spinney, R. E., Lizier, J. T., & Prokopenko, M. Transfer entropy in physical systems and the arrow of time. Physical Review E, 94(2), 022135 (2016).
[11] Ito, S. Backward transfer entropy: Informational measure for detecting hidden Markov models and its interpretations in thermodynamics, gambling and causality. Scientific Reports, 6, 36831 (2016).
[12] Crooks, G. E., & Still, S. E. Marginal and conditional second laws of thermodynamics. arXiv preprint arXiv:1611.04628 (2016).
[13] Allahverdyan, A. E., Janzing, D., & Mahler, G. Thermodynamic efficiency of information and heat flow. Journal of Statistical Mechanics: Theory and Experiment, P09011 (2009).
[14] Horowitz, J. M., & Esposito, M. Thermodynamics with continuous information flow. Physical Review X, 4(3), 031015 (2014).
[15] Horowitz, J. M., & Sandberg, H. Second-law-like inequalities with information and their interpretations. New Journal of Physics, 16(12), 125007 (2014).
[16] Shiraishi, N., & Sagawa, T. Fluctuation theorem for partially masked nonequilibrium dynamics. Physical Review E, 91(1), 012130 (2015).
[17] Shiraishi, N., Ito, S., Kawaguchi, K., & Sagawa, T. Role of measurement-feedback separation in autonomous Maxwell's demons. New Journal of Physics, 17(4), 045012 (2015).
[18] Yamamoto, S., Ito, S., Shiraishi, N., & Sagawa, T. Linear irreversible thermodynamics and Onsager reciprocity for information-driven engines. Physical Review E, 94(5), 052121 (2016).
[19] Goldt, S., & Seifert, U. Stochastic thermodynamics of learning. Physical Review Letters, 118(1), 010601 (2017).
[20] Sekimoto, K. Stochastic energetics (Springer, 2010).
[21] Seifert, U. Stochastic thermodynamics, fluctuation theorems and molecular machines. Reports on Progress in Physics, 75(12), 126001 (2012).
[22] Crooks, G. E. Measuring thermodynamic length. Physical Review Letters, 99(10), 100602 (2007).
[23] Feng, E. H., & Crooks, G. E. Length of time's arrow. Physical Review Letters, 101(9), 090602 (2008).
[24] Polettini, M., & Esposito, M. Nonconvexity of the relative entropy for Markov dynamics: A Fisher information approach. Physical Review E, 88(1), 012112 (2013).
[25] Tajima, H., & Hayashi, M. Finite-size effect on optimal efficiency of heat engines. Physical Review E, 96(1), 012128 (2017); Tajima, H., & Hayashi, M. Refined Carnot's Theorem; Asymptotics of Thermodynamics with Finite-Size Heat Baths. arXiv:1405.6457v1 (2014).
[26] Shimazaki, H. Neural Engine Hypothesis. In Dynamic Neuroscience, 267-291 (Springer, Cham, 2018); Shimazaki, H. Neurons as an Information-theoretic Engine. arXiv:1512.07855v1 (2015).
[27] Nicholson, S. B., & Kim, E. J. Investigation of the statistical distance to reach stationary distributions. Physics Letters A, 379(3), 83-88 (2015).
[28] Lahiri, S., Sohl-Dickstein, J., & Ganguli, S. A universal tradeoff between power, precision and speed in physical communication. arXiv preprint arXiv:1603.07758 (2016).
[29] Weinhold, F. Metric geometry of equilibrium thermodynamics. The Journal of Chemical Physics, 63(6), 2479-2483 (1975).
[30] Ruppeiner, G. Thermodynamics: A Riemannian geometric model. Physical Review A, 20(4), 1608 (1979).
[31] Salamon, P., & Berry, R. S. Thermodynamic length and dissipated availability. Physical Review Letters, 51(13), 1127 (1983).
[32] Sivak, D. A., & Crooks, G. E. Thermodynamic metrics and optimal paths. Physical Review Letters, 108(19), 190602 (2012).
[33] Machta, B. B. Dissipation bound for thermodynamic control. Physical Review Letters, 115(26), 260603 (2015).
[34] Rotskoff, G. M., Crooks, G. E., & Vanden-Eijnden, E. Geometric approach to optimal nonequilibrium control: Minimizing dissipation in nanomagnetic spin systems. Physical Review E, 95(1), 012148 (2017).
[35] Amari, S. I., & Nagaoka, H. Methods of information geometry (American Mathematical Soc., 2007).
[36] Oizumi, M., Tsuchiya, N., & Amari, S. I. Unified framework for information integration based on information geometry. Proceedings of the National Academy of Sciences, 113(51), 14817-14822 (2016).
[37] Pires, D. P., Cianciaruso, M., Céleri, L. C., Adesso, G., & Soares-Pinto, D. O. Generalized geometric quantum speed limits. Physical Review X, 6(2), 021031 (2016).
[38] Amari, S. I. Information geometry and its applications (Springer Japan, 2016).
[39] Uffink, J., & van Lith, J. Thermodynamic uncertainty relations. Foundations of Physics, 29(5), 655-692 (1999).
[40] Lan, G., Sartori, P., Neumann, S., Sourjik, V., & Tu, Y. The energy-speed-accuracy trade-off in sensory adaptation. Nature Physics, 8(5), 422-428 (2012).
[41] Govern, C. C., & ten Wolde, P. R. Optimal resource allocation in cellular sensing systems. Proceedings of the National Academy of Sciences, 111(49), 17486-17491 (2014).
[42] Ito, S., & Sagawa, T. Maxwell's demon in biochemical signal transduction with feedback loop. Nature Communications, 6, 7498 (2015).
[43] Barato, A. C., & Seifert, U. Thermodynamic uncertainty relation for biomolecular processes. Physical Review Letters, 114(15), 158101 (2015).
[44] Gingrich, T. R., Horowitz, J. M., Perunov, N., & England, J. L. Dissipation bounds all steady-state current fluctuations. Physical Review Letters, 116(12), 120601 (2016).
[45] Shiraishi, N., Saito, K., & Tasaki, H. Universal trade-off relation between power and efficiency for heat engines. Physical Review Letters, 117(19), 190601 (2016).
[46] Pietzonka, P., Barato, A. C., & Seifert, U. Universal bounds on current fluctuations. Physical Review E, 93(5), 052145 (2016).
[47] Barato, A. C., & Seifert, U. Cost and precision of Brownian clocks. Physical Review X, 6(4), 041053 (2016).
[48] Horowitz, J. M., & Gingrich, T. R. Proof of the finite-time thermodynamic uncertainty relation for steady-state currents. Physical Review E, 96(2), 020103 (2017).
[49] Proesmans, K., & Van den Broeck, C. Discrete-time thermodynamic uncertainty relation. EPL (Europhysics Letters), 119(2), 20001 (2017).
[50] Maes, C. Frenetic bounds on the entropy production. Physical Review Letters, 119(16), 160601 (2017).
[51] Dechant, A., & Sasa, S. I. Current fluctuations and transport efficiency for general Langevin systems. arXiv preprint arXiv:1708.08653 (2017).
[52] Schnakenberg, J. Network theory of microscopic and macroscopic behavior of master equation systems. Reviews of Modern Physics, 48(4), 571 (1976).
[53] Andrieux, D., & Gaspard, P. Fluctuation theorem for currents and Schnakenberg network theory. Journal of Statistical Physics, 127(1), 107-131 (2007).
[54] Rao, C. R. Information and the accuracy attainable in the estimation of statistical parameters. In Breakthroughs in Statistics, 235-247 (Springer New York, 1992).
[55] Cover, T. M., & Thomas, J. A. Elements of information theory (John Wiley & Sons, 2012).
[56] Wootters, W. K. Statistical distance and Hilbert space. Physical Review D, 23(2), 357 (1981).
[57] Braunstein, S. L., & Caves, C. M. Statistical distance and the geometry of quantum states. Physical Review Letters, 72(22), 3439 (1994).
[58] Jarzynski, C. Nonequilibrium equality for free energy differences. Physical Review Letters, 78(14), 2690 (1997).
[59] Crooks, G. E. Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences. Physical Review E, 60(3), 2721 (1999).
[60] Evans, D. J., & Searles, D. J. The fluctuation theorem. Advances in Physics, 51(7), 1529-1585 (2002).
[61] Seifert, U. Entropy production along a stochastic trajectory and an integral fluctuation theorem. Physical Review Letters, 95(4), 040602 (2005).
[62] Esposito, M., & Van den Broeck, C. Three faces of the second law. I. Master equation formulation. Physical Review E, 82(1), 011143 (2010).
[63] Esposito, M., & Van den Broeck, C. Three detailed fluctuation theorems. Physical Review Letters, 104(9), 090601 (2010).
[64] Schmiedl, T., & Seifert, U. Stochastic thermodynamics of chemical reaction networks. The Journal of Chemical Physics, 126(4), 044101 (2007).
[65] Barato, A. C., Hartich, D., & Seifert, U. Efficiency of cellular information processing. New Journal of Physics, 16(10), 103024 (2014).
[66] Sartori, P., Granger, L., Lee, C. F., & Horowitz, J. M. Thermodynamic costs of information processing in sensory adaptation. PLoS Computational Biology, 10(12), e1003974 (2014).
[67] Bo, S., Del Giudice, M., & Celani, A. Thermodynamic limits to information harvesting by sensory systems. Journal of Statistical Mechanics: Theory and Experiment, P01014 (2015).
[68] Ouldridge, T. E., Govern, C. C., & ten Wolde, P. R. Thermodynamics of computational copying in biochemical systems. Physical Review X, 7(2), 021004 (2017).
[69] McGrath, T., Jones, N. S., ten Wolde, P. R., & Ouldridge, T. E. Biochemical machines for the interconversion of mutual information and work. Physical Review Letters, 118(2), 028101 (2017).
[70] Ito, S. Stochastic Thermodynamic Interpretation of Information Geometry. arXiv preprint arXiv:1712.04311v1 (2017).
[71] Nicholson, S. B. Uncertainty scars and the distance from equilibrium. arXiv preprint arXiv:1801.02242 (2018).

SUPPLEMENTARY INFORMATION

I. Intuitive proof of the fact that the manifold S2 gives the sphere surface of radius 2

We here intuitively show the fact that the manifold S_2 gives the sphere surface of radius 2. The set of probability p = (p_0, p_1, p_2) satisfies the normalization \sum_{x=0}^{2} p_x = 1. The square of the line element ds^2 is given by

  ds^2 = \sum_{x=0}^{2} \frac{(dp_x)^2}{p_x}.   (28)

We here introduce the polar coordinate system (\phi, \psi), where p_0 = (\cos\psi)^2, p_1 = (\sin\psi)^2 (\cos\phi)^2, p_2 = (\sin\psi)^2 (\sin\phi)^2. We can check that the normalization \sum_{x=0}^{2} p_x = 1 holds. By using the polar coordinate system, (dp_0, dp_1, dp_2) is given by dp_0 = -2(\cos\psi)(\sin\psi) d\psi, dp_1 = 2(\cos\psi)(\sin\psi)(\cos\phi)^2 d\psi - 2(\cos\phi)(\sin\phi)(\sin\psi)^2 d\phi, and dp_2 = 2(\cos\psi)(\sin\psi)(\sin\phi)^2 d\psi + 2(\cos\phi)(\sin\phi)(\sin\psi)^2 d\phi. From Eq. (28), we then obtain

  ds^2 = 4[(\sin\psi)^2 + (\cos\psi)^2(\cos\phi)^2 + (\cos\psi)^2(\sin\phi)^2](d\psi)^2 + 0 \times (d\phi)(d\psi) + 4[(\sin\phi)^2(\sin\psi)^2 + (\cos\phi)^2(\sin\psi)^2](d\phi)^2
       = 2^2 [(d\psi)^2 + (\sin\psi)^2 (d\phi)^2].   (29)

Because the metric of the sphere surface of radius R is given by ds^2 = R^2 [(d\psi)^2 + (\sin\psi)^2 (d\phi)^2], the manifold S_2 gives the sphere surface of radius R = 2.
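This change of coordinates can be verified numerically. The following sketch compares Eq. (28) with the sphere metric of Eq. (29) at randomly chosen points (the sampling ranges are arbitrary choices that avoid the coordinate poles):

```python
import numpy as np

def prob(psi, phi):
    """The polar parametrization of the 2-simplex used in Eqs. (28)-(29)."""
    return np.array([np.cos(psi)**2,
                     np.sin(psi)**2 * np.cos(phi)**2,
                     np.sin(psi)**2 * np.sin(phi)**2])

rng = np.random.default_rng(0)
for _ in range(100):
    psi, phi = rng.uniform(0.2, 1.2, size=2)   # stay away from coordinate poles
    dpsi, dphi = 1e-6 * rng.standard_normal(2)
    p = prob(psi, phi)
    dp = prob(psi + dpsi, phi + dphi) - p
    ds2 = np.sum(dp**2 / p)                                   # Eq. (28)
    ds2_sphere = 4.0 * (dpsi**2 + np.sin(psi)**2 * dphi**2)   # Eq. (29)
    assert np.isclose(ds2, ds2_sphere, rtol=1e-3, atol=0.0)
print("ok")
```

The small relative tolerance absorbs the higher-order terms of the finite displacement; in the limit of infinitesimal (d\psi, d\phi) the two expressions agree exactly.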

II. Detailed derivation of Eqs. (15) and (16) in the main text

We here discuss the detailed derivation of Eqs. (15) and (16) in the main text, and the relationship between the square of the line element and the Fisher information metric. By using the definition of the thermodynamic force, F^{(\nu)}_{x' \to x} := \ln[W^{(\nu)}_{x' \to x} p_{x'}] - \ln[W^{(\nu)}_{x \to x'} p_x], the master equation is given by

  \frac{d}{dt} p_x = \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} W^{(\nu)}_{x \to x'} p_x e^{-F^{(\nu)}_{x \to x'}}.   (30)

From Eqs. (28), (30) and \sum_{x=0}^{n} d^2 p_x/dt^2 = 0, we obtain an expression Eq. (15) in the main text,

  \frac{ds^2}{dt^2} = \sum_{x=0}^{n} \frac{1}{p_x} \left( \frac{dp_x}{dt} \right)^2
  = \sum_{x=0}^{n} p_x \frac{d}{dt}\left( -\frac{1}{p_x}\frac{dp_x}{dt} \right) + \sum_{x=0}^{n} \frac{d^2 p_x}{dt^2}
  = \sum_{x=0}^{n} p_x \frac{d}{dt}\left( -\frac{1}{p_x}\frac{dp_x}{dt} \right)
  = -\sum_{x=0}^{n} p_x \frac{d}{dt}\left( \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} W^{(\nu)}_{x \to x'} e^{-F^{(\nu)}_{x \to x'}} \right).   (31)

Let E[A] := \sum_{x=0}^{n} p_x A(x) be the expected value of any function A(x), and \mathcal{A}(x) := \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} W^{(\nu)}_{x \to x'} A^{(\nu)}_{x \to x'} be the rate-weighted expected value of any function of edge A^{(\nu)}_{x \to x'} with a fixed initial state x, respectively. By using these notations, the result (31) can be rewritten as

  \frac{ds^2}{dt^2} = -E\left[ \frac{d}{dt} \left( \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} W^{(\nu)}_{x \to x'} e^{-F^{(\nu)}_{x \to x'}} \right) \right].   (32)

We here mention that the parenthesis \langle \cdots \rangle in the main text is given by \langle A \rangle = E[\mathcal{A}] if A^{(\nu)}_{x \to x'} is an anti-symmetric function, A^{(\nu)}_{x \to x'} = -A^{(\nu)}_{x' \to x}. Because the thermodynamic force is an anti-symmetric function, F^{(\nu)}_{x \to x'} = -F^{(\nu)}_{x' \to x}, the total entropy production rate is given by \dot{\Sigma}^{\rm tot} = E[\mathcal{F}]. We also carefully mention that the expected value of e^{-F} gives

$E[e^{-F}] = \sum_{\nu=1}^{n_{\rm bath}} \sum_{x=0}^{n} \sum_{x'=0}^{n} p_{x'} W^{(\nu)}_{x' \to x} = 0$, compared to the integral fluctuation theorem $\langle e^{-F_{\rm traj}} \rangle_{\rm traj} = 1$ with the entropy production of trajectories $F_{\rm traj}$ and the ensemble average of trajectories $\langle \cdots \rangle_{\rm traj}$ [1, 2]. If the system is in a stationary state, i.e., $dp_x/dt = 0$ for any $x$, we have

\begin{equation}
\frac{d}{dt} E\left[ e^{-F} \right] = E\left[ \frac{d}{dt} e^{-F} \right] = 0. \tag{33}
\end{equation}
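The identity $E[e^{-F}] = 0$ holds for any rates and any distribution, which can be checked numerically. The sketch below uses randomly drawn rates (an illustrative setup, not taken from the source) and sums $p_x W^{(\nu)}_{x \to x'} e^{-F^{(\nu)}_{x \to x'}}$ over all states and baths, including the diagonal terms where $e^{-F} = 1$.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_bath = 3, 2

# Random transition rates W[nu, x, xp] for x -> xp; the diagonal is fixed by
# conservation of probability, sum_xp W[nu, x, xp] = 0.
W = rng.uniform(0.5, 2.0, size=(n_bath, n_states, n_states))
for nu in range(n_bath):
    np.fill_diagonal(W[nu], 0.0)
    np.fill_diagonal(W[nu], -W[nu].sum(axis=1))

p = rng.uniform(0.1, 1.0, n_states)
p /= p.sum()

# E[e^{-F}] = sum_x p_x sum_nu sum_x' W_{x->x'} e^{-F_{x->x'}}, with
# F_{x->x'} = ln(W_{x->x'} p_x) - ln(W_{x'->x} p_x') and F_{x->x} = 0.
total = 0.0
for nu in range(n_bath):
    for x in range(n_states):
        for xp in range(n_states):
            if x == xp:
                total += p[x] * W[nu, x, x]   # e^{-F} = 1 on the diagonal
            else:
                F = np.log(W[nu, x, xp] * p[x]) - np.log(W[nu, xp, x] * p[xp])
                total += p[x] * W[nu, x, xp] * np.exp(-F)

print(total)  # ~ 0, in contrast to the fluctuation theorem <e^{-F_traj}> = 1
```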

From Eq. (31), we also obtain

\begin{align}
\frac{ds^2}{dt^2} &= - \sum_{x=0}^{n} p_x \frac{d}{dt} \left( \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} W^{(\nu)}_{x \to x'} e^{-F^{(\nu)}_{x \to x'}} \right) \nonumber \\
&= - \sum_{x=0}^{n} p_x \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} W^{(\nu)}_{x \to x'} \left( - \frac{d}{dt} F^{(\nu)}_{x \to x'} \right) e^{-F^{(\nu)}_{x \to x'}} - \sum_{x=0}^{n} p_x \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} \left( \frac{d}{dt} W^{(\nu)}_{x \to x'} \right) e^{-F^{(\nu)}_{x \to x'}}. \tag{34}
\end{align}

The first term is calculated as follows

\begin{align}
& - \sum_{x=0}^{n} p_x \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} W^{(\nu)}_{x \to x'} \left( - \frac{d}{dt} F^{(\nu)}_{x \to x'} \right) e^{-F^{(\nu)}_{x \to x'}} \nonumber \\
&= - \sum_{\nu=1}^{n_{\rm bath}} \sum_{x=0}^{n} \sum_{x'=0}^{n} p_{x'} W^{(\nu)}_{x' \to x} \left( \frac{d}{dt} F^{(\nu)}_{x' \to x} \right) \nonumber \\
&= - \sum_{\nu=1}^{n_{\rm bath}} \sum_{x, x' | x > x'} p_{x'} W^{(\nu)}_{x' \to x} \left( \frac{d}{dt} F^{(\nu)}_{x' \to x} \right) - \sum_{\nu=1}^{n_{\rm bath}} \sum_{x, x' | x' > x} p_{x'} W^{(\nu)}_{x' \to x} \left( \frac{d}{dt} F^{(\nu)}_{x' \to x} \right) \nonumber \\
&= - \sum_{(x' \to x, \nu) \in E} J^{(\nu)}_{x' \to x} \left( \frac{d}{dt} F^{(\nu)}_{x' \to x} \right) = - \left\langle \frac{dF}{dt} \right\rangle, \tag{35}
\end{align}

where we used $F^{(\nu)}_{x' \to x} = - F^{(\nu)}_{x \to x'}$ and $F^{(\nu)}_{x' \to x'} = 0$. The second term is also calculated as follows

\begin{align}
& - \sum_{x=0}^{n} p_x \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} \left( \frac{d}{dt} W^{(\nu)}_{x \to x'} \right) e^{-F^{(\nu)}_{x \to x'}} \nonumber \\
&= - \sum_{\nu=1}^{n_{\rm bath}} \sum_{x=0}^{n} \sum_{x'=0}^{n} p_{x'} W^{(\nu)}_{x' \to x} \frac{1}{W^{(\nu)}_{x \to x'}} \left( \frac{d}{dt} W^{(\nu)}_{x \to x'} \right) \nonumber \\
&= - \sum_{\nu=1}^{n_{\rm bath}} \sum_{x, x' | x' \neq x} p_{x'} W^{(\nu)}_{x' \to x} \frac{1}{W^{(\nu)}_{x \to x'}} \left( \frac{d}{dt} W^{(\nu)}_{x \to x'} \right) - \sum_{\nu=1}^{n_{\rm bath}} \sum_{x=0}^{n} p_x W^{(\nu)}_{x \to x} \frac{1}{W^{(\nu)}_{x \to x}} \left( \frac{d}{dt} W^{(\nu)}_{x \to x} \right) \nonumber \\
&= - \sum_{\nu=1}^{n_{\rm bath}} \sum_{x, x' | x' \neq x} p_{x'} W^{(\nu)}_{x' \to x} \frac{1}{W^{(\nu)}_{x \to x'}} \left( \frac{d}{dt} W^{(\nu)}_{x \to x'} \right) + \sum_{\nu=1}^{n_{\rm bath}} \sum_{x=0}^{n} \sum_{x' \neq x} p_x \left( \frac{d}{dt} W^{(\nu)}_{x \to x'} \right) \nonumber \\
&= - \sum_{\nu=1}^{n_{\rm bath}} \sum_{x, x' | x' \neq x} p_{x'} W^{(\nu)}_{x' \to x} \frac{d}{dt} \ln\left( W^{(\nu)}_{x \to x'} \right) + \sum_{\nu=1}^{n_{\rm bath}} \sum_{x, x' | x' \neq x} p_{x'} W^{(\nu)}_{x' \to x} \frac{d}{dt} \ln\left( W^{(\nu)}_{x' \to x} \right) \nonumber \\
&= \sum_{\nu=1}^{n_{\rm bath}} \sum_{x, x' | x' \neq x} p_{x'} W^{(\nu)}_{x' \to x} \frac{d}{dt} \Delta\sigma^{{\rm bath}(\nu)}_{x' \to x} \nonumber \\
&= \sum_{\nu=1}^{n_{\rm bath}} \sum_{x, x' | x > x'} p_{x'} W^{(\nu)}_{x' \to x} \frac{d}{dt} \Delta\sigma^{{\rm bath}(\nu)}_{x' \to x} - \sum_{\nu=1}^{n_{\rm bath}} \sum_{x, x' | x > x'} p_x W^{(\nu)}_{x \to x'} \frac{d}{dt} \Delta\sigma^{{\rm bath}(\nu)}_{x' \to x} \nonumber \\
&= \sum_{(x' \to x, \nu) \in E} J^{(\nu)}_{x' \to x} \frac{d}{dt} \Delta\sigma^{{\rm bath}(\nu)}_{x' \to x} = \left\langle \frac{d \Delta\sigma^{\rm bath}}{dt} \right\rangle, \tag{36}
\end{align}

where we used $W^{(\nu)}_{x' \to x'} = - \sum_{x \neq x'} W^{(\nu)}_{x' \to x}$, $\Delta\sigma^{{\rm bath}(\nu)}_{x' \to x} = - \Delta\sigma^{{\rm bath}(\nu)}_{x \to x'}$ and $\Delta\sigma^{{\rm bath}(\nu)}_{x' \to x'} = 0$. By using $F^{(\nu)}_{x' \to x} = \Delta\sigma^{{\rm bath}(\nu)}_{x' \to x} + \Delta\sigma^{\rm sys}_{x' \to x}$, we obtain an expression

\begin{equation}
\frac{ds^2}{dt^2} = \left\langle \frac{d \Delta\sigma^{\rm bath}}{dt} \right\rangle - \left\langle \frac{dF}{dt} \right\rangle = - \left\langle \frac{d \Delta\sigma^{\rm sys}}{dt} \right\rangle. \tag{37}
\end{equation}
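Equation (37) can be verified numerically: for any rates and any distribution, the edge sum $-\langle d\Delta\sigma^{\rm sys}/dt \rangle$ equals $\sum_x \dot{p}_x^2 / p_x = ds^2/dt^2$. The following sketch uses randomly drawn rates (an illustrative setup, not from the source).

```python
import numpy as np

rng = np.random.default_rng(1)
n_states, n_bath = 4, 2

# Random rates W[nu, x, xp] for x -> xp, with rows summing to zero
W = rng.uniform(0.5, 2.0, size=(n_bath, n_states, n_states))
for nu in range(n_bath):
    np.fill_diagonal(W[nu], 0.0)
    np.fill_diagonal(W[nu], -W[nu].sum(axis=1))

p = rng.uniform(0.1, 1.0, n_states)
p /= p.sum()

# dp_x/dt = sum_nu sum_x' W_{x'->x} p_x'
pdot = np.einsum('nxy,x->y', W, p)

lhs = np.sum(pdot**2 / p)          # ds^2/dt^2 from Eq. (28)

# -<d Delta sigma^sys/dt> = sum_edges J_{x'->x} (pdot_x/p_x - pdot_x'/p_x')
rhs = 0.0
for nu in range(n_bath):
    for xp in range(n_states):
        for x in range(xp + 1, n_states):   # each edge (x' -> x, nu) once
            J = p[xp] * W[nu, xp, x] - p[x] * W[nu, x, xp]
            rhs += J * (pdot[x] / p[x] - pdot[xp] / p[xp])

print(lhs, rhs)   # the two expressions coincide, Eq. (37)
```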

Let $(\lambda_1, \ldots, \lambda_{n'})$ be the set of parameters such as control parameters. We also obtain the definition of the Fisher information metric [3],

\begin{equation}
g_{ij} = E\left[ \left( \frac{\partial \ln p}{\partial \lambda_i} \right) \left( \frac{\partial \ln p}{\partial \lambda_j} \right) \right] = \sum_{x=0}^{n} p_x \left( \frac{\partial \ln p_x}{\partial \lambda_i} \right) \left( \frac{\partial \ln p_x}{\partial \lambda_j} \right), \tag{38}
\end{equation}
from the result (37),

\begin{align}
\frac{ds^2}{dt^2} &= \left\langle \frac{d \Delta\sigma^{\rm bath}}{dt} \right\rangle - \left\langle \frac{dF}{dt} \right\rangle \nonumber \\
&= - \sum_{(x' \to x, \nu) \in E} J^{(\nu)}_{x' \to x} \left( \frac{1}{p_{x'}} \frac{dp_{x'}}{dt} - \frac{1}{p_x} \frac{dp_x}{dt} \right) \nonumber \\
&= - \sum_{\nu=1}^{n_{\rm bath}} \sum_{\nu'=1}^{n_{\rm bath}} \sum_{x=0}^{n} \sum_{x'=0}^{n} \sum_{x''=0}^{n} p_{x'} W^{(\nu)}_{x' \to x} \left( \frac{1}{p_{x'}} W^{(\nu')}_{x'' \to x'} p_{x''} - \frac{1}{p_x} W^{(\nu')}_{x'' \to x} p_{x''} \right) \nonumber \\
&= \sum_{x=0}^{n} p_x \left( \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} \frac{p_{x'} W^{(\nu)}_{x' \to x}}{p_x} \right) \left( \sum_{\nu'=1}^{n_{\rm bath}} \sum_{x''=0}^{n} \frac{p_{x''} W^{(\nu')}_{x'' \to x}}{p_x} \right) \nonumber \\
&= \sum_{x=0}^{n} p_x \left( \frac{d \ln p_x}{dt} \right)^2 \nonumber \\
&= \sum_{x=0}^{n} p_x \left( \sum_{i=1}^{n'} \frac{\partial \ln p_x}{\partial \lambda_i} \frac{d\lambda_i}{dt} \right)^2 \nonumber \\
&= \sum_{i=1}^{n'} \sum_{j=1}^{n'} \frac{d\lambda_i}{dt} g_{ij} \frac{d\lambda_j}{dt}, \tag{39}
\end{align}
where we used $\sum_{x=0}^{n} W^{(\nu)}_{x' \to x} = 0$ and the master equation $dp_x/dt = \sum_{\nu=1}^{n_{\rm bath}} \sum_{x'=0}^{n} p_{x'} W^{(\nu)}_{x' \to x}$. This result is consistent with the following calculation about the Fisher information metric,
\begin{equation}
ds^2 = \sum_{x=0}^{n} p_x (d \ln p_x)^2 = \sum_{x=0}^{n} p_x \left( \sum_{i=1}^{n'} \frac{\partial \ln p_x}{\partial \lambda_i} d\lambda_i \right)^2 = \sum_{i=1}^{n'} \sum_{j=1}^{n'} g_{ij} \, d\lambda_i d\lambda_j. \tag{40}
\end{equation}

III. Linear irreversible thermodynamic interpretation of information geometry

We here discuss a stochastic thermodynamic interpretation of information geometry in a near-equilibrium system, where the entropy production rate is given by the second order expansion in the thermodynamic flow (or the thermodynamic force). This second order expansion is well known as linear irreversible thermodynamics [4]. If we assume $F^{(\nu)}_{x' \to x} = 0$, we have $J^{(\nu)}_{x' \to x} = 0$. Thus, we have a linear expansion of the thermodynamic force $F^{(\nu)}_{x' \to x}$ in terms of the thermodynamic flow $J^{(\nu)}_{x' \to x}$ for a near-equilibrium condition (i.e., $F^{(\nu)}_{x' \to x} \simeq 0$ for any $x$ and $x'$),
\begin{align}
F^{(\nu)}_{x' \to x} &= \ln\left( 1 + \frac{J^{(\nu)}_{x' \to x}}{W^{(\nu)}_{x \to x'} p_x} \right) \nonumber \\
&= \alpha^{(\nu)}_{x' \to x} J^{(\nu)}_{x' \to x} + o(J^{(\nu)}_{x' \to x}), \tag{41}
\end{align}

\begin{equation}
\alpha^{(\nu)}_{x' \to x} := \left. \frac{1}{W^{(\nu)}_{x \to x'} p_x} \right|_{F^{(\nu)}_{x' \to x} = 0}. \tag{42}
\end{equation}
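The expansion (41) can be illustrated numerically. The sketch below constructs rates obeying detailed balance with respect to a chosen $p_{\rm eq}$ (a hypothetical example, not the model of the main text), perturbs the distribution slightly, and checks that $F^{(\nu)}_{x' \to x} \approx \alpha^{(\nu)}_{x' \to x} J^{(\nu)}_{x' \to x}$ on every edge.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

# Detailed balance with respect to p_eq: choose a symmetric s_{xx'} and set
# W_{x->x'} = s_{xx'}/p_eq[x], so that W_{x->x'} p_eq[x] is symmetric and all
# thermodynamic forces vanish at p = p_eq (hypothetical example rates).
p_eq = rng.uniform(0.2, 1.0, n); p_eq /= p_eq.sum()
s = rng.uniform(0.5, 2.0, (n, n)); s = 0.5 * (s + s.T)
W = s / p_eq[:, None]
np.fill_diagonal(W, 0.0)

eps = 1e-4                                  # small departure from equilibrium
delta = rng.normal(size=n); delta -= delta.mean()
p = p_eq + eps * delta                      # still normalized, sum(delta) = 0

max_err = 0.0
for xp in range(n):
    for x in range(xp + 1, n):
        J = p[xp] * W[xp, x] - p[x] * W[x, xp]       # thermodynamic flow
        F = np.log(W[xp, x] * p[xp]) - np.log(W[x, xp] * p[x])
        alpha = 1.0 / (W[x, xp] * p_eq[x])           # Onsager coefficient, Eq. (42)
        max_err = max(max_err, abs(F - alpha * J))
        print(F, alpha * J)                          # agree up to o(J)
```

The residual difference is of higher order in the departure from equilibrium, consistent with the $o(J)$ term in Eq. (41).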

We call this coefficient $\alpha^{(\nu)}_{x' \to x}$ the Onsager coefficient of the edge $(x' \to x, \nu)$. The symmetry of the coefficient $\alpha^{(\nu)}_{x' \to x} = \alpha^{(\nu)}_{x \to x'}$ holds due to the condition $F^{(\nu)}_{x' \to x} = 0$. If we consider Kirchhoff's current law in a stationary state, the linear combination of the coefficients $\alpha^{(\nu)}_{x \to x'}$ leads to the Onsager coefficient [4]. Let $\{ C_1, \ldots, C_m \}$ be the cycle basis of the Markov network for the master equation. The thermodynamic force of the cycle $F(C_i)$ is defined as

\begin{equation}
F(C_i) = \sum_{(x' \to x, \nu) \in E} S(\{x' \to x, \nu\}, C_i) F^{(\nu)}_{x' \to x}, \tag{43}
\end{equation}
where

\begin{equation}
S(\{x' \to x, \nu\}, C_i) = \begin{cases} 1 & (\{x' \to x, \nu\} \in C_i) \\ -1 & (\{x \to x', \nu\} \in C_i) \\ 0 & ({\rm otherwise}) \end{cases}. \tag{44}
\end{equation}



The thermodynamic flow of the cycle J(Ci) is defined as

\begin{equation}
J^{(\nu)}_{x' \to x} = \sum_{i=1}^{m} S(\{x' \to x, \nu\}, C_i) J(C_i). \tag{45}
\end{equation}

We then obtain the linear relationship $F(C_j) = \sum_{i=1}^{m} L_{ji} J(C_i)$ (or $J(C_j) = \sum_{i=1}^{m} L^{-1}_{ji} F(C_i)$) with the Onsager coefficient

\begin{equation}
L_{ij} = \sum_{(x' \to x, \nu) \in E} \alpha^{(\nu)}_{x' \to x} S(\{x' \to x, \nu\}, C_i) S(\{x' \to x, \nu\}, C_j), \tag{46}
\end{equation}
for a near-equilibrium condition, the second law of thermodynamics

\begin{align}
0 &\leq \dot{\Sigma}_{\rm tot} \nonumber \\
&= \sum_{(x' \to x, \nu) \in E} J^{(\nu)}_{x' \to x} F^{(\nu)}_{x' \to x} \nonumber \\
&= \sum_{(x' \to x, \nu) \in E} \sum_{i=1}^{m} S(\{x' \to x, \nu\}, C_i) J(C_i) F^{(\nu)}_{x' \to x} \nonumber \\
&= \sum_{i=1}^{m} J(C_i) F(C_i) \nonumber \\
&= \sum_{j=1}^{m} \sum_{i=1}^{m} L_{ij} J(C_i) J(C_j) \nonumber \\
&= \sum_{j=1}^{m} \sum_{i=1}^{m} L^{-1}_{ij} F(C_i) F(C_j), \tag{47}
\end{align}

and the Onsager reciprocal relationship $L_{ij} = L_{ji}$. This result gives the second order expansion of the entropy production rate $\dot{\Sigma}_{\rm tot}$ in the thermodynamic flow $J$ (or the thermodynamic force $F$) in a stationary state. For $m = 2$, the second law of thermodynamics $L_{11} F(C_1)^2 + L_{22} F(C_2)^2 + 2 L_{12} F(C_1) F(C_2) \geq 0$ is then given by $L_{11} \geq 0$, $L_{22} \geq 0$, and $L_{11} L_{22} - L_{12}^2 \geq 0$.

Here we newly consider the second order expansion of $ds^2$ in the thermodynamic flow $J$ (or the thermodynamic force $F$) in linear irreversible thermodynamics. In a near-equilibrium system, the square of the line element $ds$ is calculated as follows

\begin{align}
ds^2 &= - \left\langle \frac{d \Delta\sigma^{\rm sys}}{dt} \right\rangle dt^2 \nonumber \\
&= - \sum_{(x' \to x, \nu) \in E} J^{(\nu)}_{x' \to x} \left( \frac{1}{p_{x'}} \frac{dp_{x'}}{dt} - \frac{1}{p_x} \frac{dp_x}{dt} \right) dt^2 \nonumber \\
&= - \sum_{(x' \to x, \nu) \in E} J^{(\nu)}_{x' \to x} \left( \frac{1}{p_{x'}} \sum_{\nu'=1}^{n_{\rm bath}} \sum_{x''=0}^{n} J^{(\nu')}_{x'' \to x'} - \frac{1}{p_x} \sum_{\nu'=1}^{n_{\rm bath}} \sum_{x''=0}^{n} J^{(\nu')}_{x'' \to x} \right) dt^2 \nonumber \\
&= \sum_{(x' \to x, \nu) \in E} J^{(\nu)}_{x' \to x} \sum_{\nu'=1}^{n_{\rm bath}} \sum_{x''=0}^{n} \left( \frac{1}{p_x} J^{(\nu')}_{x'' \to x} - \frac{1}{p_{x'}} J^{(\nu')}_{x'' \to x'} \right) dt^2 \nonumber \\
&= \sum_{\nu=1}^{n_{\rm bath}} \sum_{\nu'=1}^{n_{\rm bath}} \sum_{x=0}^{n} \sum_{x'=0}^{n} \sum_{x''=0}^{n} \frac{J^{(\nu)}_{x' \to x} J^{(\nu')}_{x'' \to x}}{p_x} dt^2 \nonumber \\
&= \sum_{\nu=1}^{n_{\rm bath}} \sum_{\nu'=1}^{n_{\rm bath}} \sum_{x=0}^{n} \sum_{x'=0}^{n} \sum_{x''=0}^{n} \frac{F^{(\nu)}_{x' \to x} F^{(\nu')}_{x'' \to x}}{\alpha^{(\nu)}_{x' \to x} p_x \alpha^{(\nu')}_{x'' \to x}} dt^2. \tag{48}
\end{align}

We here consider the situation that the time evolution of the control parameters $\lambda_{(x', x, \nu_x)}$ is driven by the thermodynamic force $F^{(\nu_x)}_{x' \to x} = d\lambda_{(x', x, \nu_x)}/dt$. The square of the line element can then be written in terms of the following Fisher information metric,

\begin{align}
ds^2 &= \sum_{\nu_x=1}^{n_{\rm bath}} \sum_{x=0}^{n} \sum_{x'=0}^{n} \sum_{\nu_y=1}^{n_{\rm bath}} \sum_{y=0}^{n} \sum_{y'=0}^{n} g_{(x', x, \nu_x)(y', y, \nu_y)} \, d\lambda_{(x', x, \nu_x)} d\lambda_{(y', y, \nu_y)}, \tag{49} \\
g_{(x', x, \nu_x)(y', y, \nu_y)} &= \frac{\delta_{xy}}{\alpha^{(\nu_x)}_{x' \to x} p_x \alpha^{(\nu_y)}_{y' \to y}}. \tag{50}
\end{align}

This result implies that the Fisher information metric for the control parameters $\lambda_{(x', x, \nu_x)}$ driven by the thermodynamic force $F^{(\nu_x)}_{x' \to x} = d\lambda_{(x', x, \nu_x)}/dt$ is related to the Onsager coefficients of the edges $\alpha^{(\nu_x)}_{x' \to x}$ for a near-equilibrium condition. Because the Cram\'{e}r-Rao bound [3, 5] implies that the variance of an unbiased estimator is bounded by the inverse of this Fisher information metric, the Onsager coefficients of the edges $\alpha^{(\nu_x)}_{x' \to x}$ give a lower bound on the variance of unbiased estimators for control parameters driven by the thermodynamic forces in a near-equilibrium system.

IV. Details of the three-state model of enzyme reaction

Stochastic thermodynamics for the master equation is applicable to a model of chemical reaction [6]. We here discuss the thermodynamic details of the three-state model of enzyme reaction discussed in the main text. The master equation for Eq. (27) in the main text is given by

\begin{align}
\frac{dp_A}{dt} &= - (k_{AX+} [X] + k_{AB+} [B]) p_A + k_{AB-} p_{AB} + k_{AX-} p_{AX}, \nonumber \\
\frac{dp_{AB}}{dt} &= k_{AB+} [B] p_A - (k_{AB-} + k_- [X]) p_{AB} + k_+ [B] p_{AX}, \nonumber \\
\frac{dp_{AX}}{dt} &= k_{AX+} [X] p_A + k_- [X] p_{AB} - (k_{AX-} + k_+ [B]) p_{AX}, \tag{51}
\end{align}
where the ratios of the rate constants $k_\pm$, $k_{AB\pm}$ and $k_{AX\pm}$ are given by the chemical potential differences

\begin{align}
\ln \frac{k_{AX+}}{k_{AX-}} &= \beta \Delta\mu_{AX}, \nonumber \\
\ln \frac{k_{AB+}}{k_{AB-}} &= \beta \Delta\mu_{AB}, \nonumber \\
\ln \frac{k_+}{k_-} &= \beta \Delta\mu. \tag{52}
\end{align}

We here assume that the sum of the concentrations $[A] + [AB] + [AX] = n_A$ is constant. The probability distributions $p_A$, $p_{AB}$, and $p_{AX}$ correspond to the fractions $p_A = [A]/n_A$, $p_{AB} = [AB]/n_A$ and $p_{AX} = [AX]/n_A$, respectively. From the master equation (51), we obtain the rate equations of the enzyme reaction

\begin{align}
\frac{d[A]}{dt} &= - (k_{AX+} [X] + k_{AB+} [B]) [A] + k_{AB-} [AB] + k_{AX-} [AX], \nonumber \\
\frac{d[AB]}{dt} &= k_{AB+} [B] [A] - (k_{AB-} + k_- [X]) [AB] + k_+ [B] [AX], \nonumber \\
\frac{d[AX]}{dt} &= k_{AX+} [X] [A] + k_- [X] [AB] - (k_{AX-} + k_+ [B]) [AX], \tag{53}
\end{align}
which correspond to the following enzyme reaction

\begin{equation}
A + X \rightleftharpoons AX, \quad A + B \rightleftharpoons AB, \quad AX + B \rightleftharpoons AB + X, \tag{54}
\end{equation}
where $A$ is the substrate, $X$ is the enzyme, $AX$ is the enzyme-substrate complex, and $AB$ is the product. In this model, the stochastic entropy changes of the thermal bath are also calculated as

\begin{align}
\Delta\sigma^{{\rm bath}(1)}_{A \to AB} &= \beta \Delta\mu_{AB} + \ln [B], \nonumber \\
\Delta\sigma^{{\rm bath}(1)}_{AB \to AX} &= - \beta \Delta\mu - \ln [B] + \ln [X], \nonumber \\
\Delta\sigma^{{\rm bath}(1)}_{AX \to A} &= - \beta \Delta\mu_{AX} - \ln [X], \tag{55}
\end{align}
which are the conventional definitions of the stochastic entropy changes of a thermal bath. In this model, the cycle basis is given by one cycle $\{ C_1 = (A \to AB \to AX \to A) \}$. If the chemical potential change in the cycle $C_1$ has a non-zero value, i.e., $\Delta\mu_{\rm cyc} := \Delta\mu_{AB} - \Delta\mu - \Delta\mu_{AX} \neq 0$, the system in a stationary state is driven by the thermodynamic force of the cycle $F(C_1) = F^{(1)}_{A \to AB} + F^{(1)}_{AB \to AX} + F^{(1)}_{AX \to A} = \beta \Delta\mu_{\rm cyc}$. In a numerical calculation, we set $\beta \Delta\mu_{\rm cyc} = -2.5 \neq 0$. Then we consider non-equilibrium and non-stationary dynamics in a numerical calculation.
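As an illustration of the stationary driving, the following sketch builds the generator of Eq. (51) for an arbitrary choice of rate constants and concentrations satisfying $\beta \Delta\mu_{\rm cyc} = -2.5$ (the specific numbers below are assumptions for illustration, not the values used in the main text), solves for the stationary distribution, and checks that the cycle force equals $\beta \Delta\mu_{\rm cyc}$.

```python
import numpy as np

# Illustrative parameters, chosen only so that
# beta*dmu_cyc = beta*(dmu_AB - dmu - dmu_AX) = -2.5
beta_dmu_AX, beta_dmu_AB = 1.0, 0.5
beta_dmu = beta_dmu_AB - beta_dmu_AX + 2.5
X, B = 1.0, 1.5                       # enzyme and product concentrations

kAXm, kABm, km = 1.0, 1.0, 1.0        # backward rate constants
kAXp = kAXm * np.exp(beta_dmu_AX)     # forward rates from Eq. (52)
kABp = kABm * np.exp(beta_dmu_AB)
kp = km * np.exp(beta_dmu)

# Generator R for p = (p_A, p_AB, p_AX): dp/dt = R @ p, Eq. (51)
R = np.array([
    [-(kAXp * X + kABp * B), kABm,             kAXm],
    [kABp * B,               -(kABm + km * X), kp * B],
    [kAXp * X,               km * X,           -(kAXm + kp * B)],
])

# Stationary distribution: normalized null vector of R
w, v = np.linalg.eig(R)
p_ss = np.real(v[:, np.argmin(np.abs(w))])
p_ss /= p_ss.sum()

# Cycle force F(C1) = F_A->AB + F_AB->AX + F_AX->A; the p-dependent
# parts cancel around the cycle, leaving beta*dmu_cyc
F_cycle = (np.log(kABp * B * p_ss[0] / (kABm * p_ss[1]))
           + np.log(km * X * p_ss[1] / (kp * B * p_ss[2]))
           + np.log(kAXm * p_ss[2] / (kAXp * X * p_ss[0])))
print(p_ss, F_cycle)   # F_cycle = beta*dmu_cyc = -2.5
```

In this stationary state the cycle force, and hence the entropy production rate, is nonzero, so the system is genuinely driven rather than in equilibrium.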

[1] Seifert, U. Entropy production along a stochastic trajectory and an integral fluctuation theorem. Physical Review Letters 95(4), 040602 (2005).
[2] Esposito, M., & Van den Broeck, C. Three detailed fluctuation theorems. Physical Review Letters 104(9), 090601 (2010).
[3] Cover, T. M., & Thomas, J. A. Elements of Information Theory. (John Wiley & Sons, 2012).
[4] Schnakenberg, J. Network theory of microscopic and macroscopic behavior of master equation systems. Reviews of Modern Physics 48(4), 571 (1976).
[5] Rao, C. R. Information and the accuracy attainable in the estimation of statistical parameters. In Breakthroughs in Statistics (pp. 235-247). (Springer New York, 1992).
[6] Schmiedl, T., & Seifert, U. Stochastic thermodynamics of chemical reaction networks. The Journal of Chemical Physics 126(4), 044101 (2007).