
Preprints (www.preprints.org) | NOT PEER-REVIEWED | Posted: 5 March 2019 doi:10.20944/preprints201903.0053.v1

Entropic Description of Gravity by Relativistic Fluids and Information Theory

Roman Baudrimont
Entro-π Fluid Group
[email protected] | [email protected]

Abstract: The purpose of this paper is to show a new approach to unifying the theory of general relativity and quantum physics. For this, we rely on thermodynamics, fluid mechanics and information theory. We will then see that the Shannon, Boltzmann and von Neumann entropies can be the source of gravity, which would be an emergent force. To this end, we will first study what is missing for the unification of general relativity and quantum physics. Secondly, we will explain the concept of entropic gravity by introducing the corresponding calculations. Then we will explain the Boltzmann, Shannon and von Neumann entropies and the links between them. Next, we will modify Einstein's equations by rewriting the perfect-fluid stress-energy tensor in terms of entropy. Finally, we will link our theory to an experiment already proposed concerning the connection between gravity and quantum theory.

Keywords: Holography, Entropy, General Relativity

1 Difficulties in formalizing quantum gravity.

General relativity [1, 2] is a theory of gravitation. Written by Einstein between 1907 and 1915, it states that the gravitational attraction previously described by Newton's equation is actually a distortion of space-time caused by concentrations of mass and energy. In simplified form, it is described by this equation:

$$G_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu} \tag{1}$$

This is a tensor equation. $8\pi G/c^4$ is a constant, and the tensor $T_{\mu\nu}$ is the relativistic stress-energy tensor (a 4×4 matrix) derived from the stress tensor (a 3×3 matrix). We can also write the stress-energy tensor for a perfect fluid:

$$T_{\mu\nu} = \left(\frac{P}{c^2} + \rho\right) u_\mu u_\nu - P g_{\mu\nu} \tag{2}$$

where $P$ is the pressure, $\rho$ the density, $u_\mu u_\nu$ the product of four-velocities, and $g_{\mu\nu}$ the metric tensor. Ultimately, $G_{\mu\nu}$ defines the curvature that space takes, given certain precise constants and the stress-energy tensor.

Quantum physics [3] is radically different from classical physics and relativity. Indeed, it is probabilistic: the evolution of a physical system is defined by the wave function, computable from the Schrödinger equation. One of the fundamental principles of quantum physics is this: as long as a quantum system has not been measured, its state is undefined. The Schrödinger equation is as follows:

$$i\hbar \frac{\partial}{\partial t}\psi(\vec{r}, t) = \hat{H}\psi(\vec{r}, t) \tag{3}$$


© 2019 by the author(s). Distributed under a Creative Commons CC BY license.

With the Hamiltonian operator $\hat{H}$:

$$\hat{H} = -\frac{\hbar^2}{2m}\Delta + V(\vec{r}, t) \tag{4}$$

Note that the Hamiltonian gives the total energy of the system, with $\hbar$ the reduced Planck constant, $-\hbar^2\Delta/(2m)$ the kinetic-energy term, and $V(\vec{r}, t)$ the potential of the system. The delta ($\Delta$) is a Laplacian. This equation is first order in time, so if we know the state of the system at the initial time, we can know the state of the system at any time $t$.

Finally another fundamental principle of quantum physics is the uncertainty principle of Heisenberg:

$$\Delta x \, \Delta p \geq \frac{\hbar}{2} \tag{5}$$

This equation simply means that one cannot simultaneously define the position and the momentum with infinite precision.

The main issue is to understand the origin of gravitation. Indeed, infinities appear as soon as one tries to express gravitation in quantum equations. Quantum theories of gravity are therefore not renormalizable. So is there another formalism to incorporate gravity into quantum equations? Does gravitation exist at the quantum level? So... what is the origin of gravitation? Is it emergent?

This would in fact define a mechanics of "scale" and statistics. In fact, the goal is to move from a formalism at very small scale (call it the quantum level) to a macroscopic formalism (the thermodynamic scale, including the study of entropy), and finally to a large-scale formalism (the relativistic scale). For this, a particular formalism is needed. We will need basic formulas from information theory and thermodynamics.

The next part deals with the relationship between thermodynamics and gravitation. Another formalism of gravitation will thus be born.

2 Thermodynamics and gravitation: towards an entropic theory of gravity.

Eric Verlinde [4, 5] published in April 2011 a 29-page paper entitled "On the Origin of Gravity and the Laws of Newton". He was the first to say that gravity could be described in an entropic way. I will detail the calculations by taking the case of a gas contained in a closed box placed on a movable piston, knowing that the system is kept at a temperature $T$. Since the pressure is the force exerted on a surface, and assuming the gas is ideal, we can write:

$$F = \frac{NKT}{X} \tag{6}$$

with $X$ the height of the piston. The work associated with the variation $dX$ is $F\,dX$. It follows, in view of the first law of thermodynamics, that $dU = T\,dS - F\,dX$, with $S$ the entropy of the system. We can then write the equality $dS = dU/T + F\,dX/T$, finally giving (at constant $U$):

$$F = T\frac{\partial S}{\partial X} = \frac{NKT}{X} \tag{7}$$
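Equations (6) and (7) can be checked numerically: for an isothermal ideal gas the volume-dependent part of the entropy is $S(X) = NK\ln X + \text{const}$, so $T\,\partial S/\partial X$ should reproduce $NKT/X$. A minimal sketch, with all numeric values assumed purely for illustration:

```python
import math

K = 1.380649e-23   # Boltzmann constant, J/K
N = 1.0e22         # number of particles (assumed for illustration)
T = 300.0          # temperature, K (assumed)
X = 0.1            # piston height, m (assumed)

def entropy(x):
    # Volume-dependent part of the ideal-gas entropy: S = N*K*ln(x) + const.
    return N * K * math.log(x)

# Finite-difference estimate of dS/dX.
dX = 1e-8
dS_dX = (entropy(X + dX) - entropy(X - dX)) / (2 * dX)

force_entropic = T * dS_dX        # F = T * dS/dX, Eq. (7)
force_ideal_gas = N * K * T / X   # F = N*K*T/X, Eq. (6)

print(force_entropic, force_ideal_gas)
assert abs(force_entropic - force_ideal_gas) / force_ideal_gas < 1e-6
```

Both expressions agree to numerical precision, which is the content of Eq. (7).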



But where is the gravitational force? Well, gravity in general relativity is in fact a pressure force: a mass exerts a certain "pressure" on a surface, space-time, causing its curvature. And remember one of the properties of entropy: the entropy of a system of two particles is higher when the particles are close. This statement is analogous to the behavior of gravity: the closer two objects are, the stronger the gravitational force. We then deduce, with two distinct masses $m$ and $M$, and $R$ the distance separating these two masses, the following formula:

$$T\frac{\partial S}{\partial X} = G\frac{mM}{R^2} = \frac{NKT}{X} \tag{8}$$

There is another way to get the same result starting from the equation of the average kinetic energy of ideal gas molecules:

$$\frac{1}{2}mv^2 = \frac{3}{2}KT \tag{9}$$

If we write $v^2 = 2gh$, we obtain the potential energy of gravity on Earth, i.e.:

$$mgh = \frac{3}{2}KT \tag{10}$$

In the end, substituting $g = GM/R^2$ and moving $h$ to the other side, we have:

$$G\frac{mM}{R^2} = \frac{3KT}{2h} \tag{11}$$

This is very similar to the result of Eric Verlinde.

What is amazing in Verlinde's equation is that, by giving an entropic origin to gravity, the macroscopic gravitational force can be deduced without saying anything about what happens at the microscopic scale. The publication of this paper caused a stir among scientists and drew criticism, both positive and negative.

3 Entropy and information.

But we will go further than that. Indeed, we will see what entropy really is, and later integrate it into the Einstein equations. There are various equational forms of entropy, which we will now review. The first is the entropy used below, the Boltzmann entropy [6], which is written:

$$S(\Omega) = K \ln \Omega \tag{12}$$

This equation defines the microcanonical entropy of a physical system at macroscopic equilibrium, but left free to evolve on a microscopic scale between $\Omega$ different micro-states (also called the number of complexions, or the number of configurations of the system). Its unit is the Joule per Kelvin (J/K). Entropy is the key point of the second law of thermodynamics, which states: "Any transformation of a thermodynamic system is performed with an increase in the overall entropy, including the entropy of the system and the external environment. We then say that there is creation of entropy."; "The entropy of an isolated system can only increase or remain constant."
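As a toy illustration of Eq. (12), one can count microstates explicitly. The system below (four distinguishable two-state particles, hence $\Omega = 2^4$) is an assumption made purely for the example:

```python
import math

K = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(omega):
    # S = K * ln(Omega), Eq. (12).
    return K * math.log(omega)

# Four distinguishable two-state particles: Omega = 2**4 = 16 microstates.
S = boltzmann_entropy(2 ** 4)
print(S)  # K * ln(16), a few 1e-23 J/K

# ln(2**8) = 2 * ln(2**4): doubling the particle count doubles the entropy,
# illustrating that the Boltzmann entropy is extensive.
assert abs(boltzmann_entropy(2 ** 8) - 2 * S) < 1e-35
```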



There is also the Shannon formula [7]. The Shannon entropy, due to Claude Shannon, is a mathematical function corresponding to the amount of information contained in or delivered by a source of information. The more redundant the source, the less information it contains. Entropy is maximal for a source whose symbols are all equally likely. The Shannon entropy can be seen as measuring the amount of uncertainty of a random event, or more precisely of its distribution. Generally, the log is taken in base 2 (binary). Its formula is:

$$S(p) = \sum_i p_i \log \frac{1}{p_i} \tag{13}$$

However, one can also define an entropy in quantum theory [10], used particularly in quantum cryptography (with the properties of entanglement), called the von Neumann entropy, written:

$$H(\rho) = -\mathrm{tr}(\rho \log \rho) = \sum_i p_i \log \frac{1}{p_i} \tag{14}$$

with the density matrix $\rho$ expressed in the orthonormal basis $|a_i\rangle$:

$$\rho = \sum_i p_i |a_i\rangle\langle a_i| \tag{15}$$

The von Neumann entropy is identical to that of Shannon, except that it uses the variable $\rho$, a density matrix. As written by Serge Haroche, this equation can be used to calculate the degree of entanglement of two particles: if two particles are not entangled, the subsystem entropy is zero. Conversely, if the entanglement between two particles is maximal, the entropy is maximal, given that we have access only to one subsystem. In classical mechanics, zero entropy means that the outcome is certain (only one possibility), while in quantum mechanics it means that the density matrix describes a pure state $\psi$. But in quantum physics, measurements are generally unpredictable, because the probability distribution depends on the wave function and the observable. This is also expressed by the Heisenberg uncertainty principle: indeed, if for example we have more information (so less entropy) on the momentum of a particle, there is less information on its position (more entropy). This implies that quantum physics is always immersed in entropy, even when the entropy is low.
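The statement above can be made concrete with a short sketch: the reduced state of one qubit from an unentangled product state has zero von Neumann entropy, while the reduced state from a maximally entangled Bell pair is maximally mixed and carries one bit. The matrices below are standard textbook examples, not taken from this paper's references:

```python
import numpy as np

def von_neumann_entropy(rho):
    # H(rho) = -tr(rho log2 rho) = sum_i p_i log2(1/p_i), Eq. (14),
    # evaluated from the eigenvalues p_i of the density matrix.
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]                       # 0 * log(0) = 0 by convention
    return float(np.sum(p * np.log2(1.0 / p)))

# One qubit of an unentangled product state |0>|0>: a pure reduced state.
rho_pure = np.array([[1.0, 0.0],
                     [0.0, 0.0]])
print(von_neumann_entropy(rho_pure))   # 0.0

# One qubit of a maximally entangled Bell pair: the maximally mixed state.
rho_mixed = np.array([[0.5, 0.0],
                      [0.0, 0.5]])
print(von_neumann_entropy(rho_mixed))  # 1.0
```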

Now that we know the Boltzmann entropy and the Shannon entropy, we can merge the two, giving the Boltzmann-Shannon entropy, or statistical entropy [8]. If we consider a thermodynamic system that can be in several microscopic states $i$ with probabilities $p_i$, the statistical entropy is then:

$$S(p) = K \sum_i p_i \log \frac{1}{p_i} \tag{16}$$

The Boltzmann-von Neumann entropy is equivalent to the above equation:

$$S(\rho) = -K\,\mathrm{tr}(\rho \log \rho) \tag{17}$$

This function is paramount, and it will be used constantly in our theory of entropic gravity. Its units combine the bit and the Joule per Kelvin.



Imagine [9] two compartments (or rooms, as you prefer) next to each other, in which particles are present. Let's call the first compartment A and the other B. If all the particles are in A, or all the particles are in B, mathematically we note that the entropy is minimal and the uncertainty about the information is zero. If, on the other hand, the particles are divided equally between A and B (as many particles in A as in B), the entropy is maximal (great disorder), so the uncertainty about the information is maximal.

With this interpretation, we note that the more dispersed the particles (probabilities less than 1), the higher the entropy. This is the case for our particles separated equally between A and B ($p = 1/2$, $S = 2 \times 1/2 \times \log_2 2 = 1$ bit). In the opposite case (all the particles in A or in B), the information is certain, the probability is 1, so the entropy is zero.
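The two-compartment example can be sketched directly from Eq. (13); the probabilities below are exactly the ones just discussed:

```python
import math

def shannon_entropy(probs):
    # S(p) = sum_i p_i * log2(1/p_i), Eq. (13); a term with p = 0 contributes 0.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# All particles in compartment A: the outcome is certain (p_A = 1, p_B = 0).
print(shannon_entropy([1.0, 0.0]))  # 0.0

# Equal split between A and B: maximal uncertainty,
# S = 2 * (1/2) * log2(2) = 1 bit.
print(shannon_entropy([0.5, 0.5]))  # 1.0
```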

In the end, the Shannon entropy and the Boltzmann entropy are the same concept.

To better understand the need to link information to thermodynamics, we must understand the scope of influence of information theory in physics and the problems it could solve. A strong example is the information paradox: from the point of view of general relativity, information can completely disappear in a black hole. This is irreversible. However, in quantum physics, information must always be preserved and is reversible (previous states can be recovered).

4 The modified Einstein equation: a proposal for a thermodynamic form given entropic gravity and information theory.

We wrote the simplified Einstein equation above. In this section, we will work out the Einstein equation with the perfect-fluid stress-energy tensor. For this, we will calculate two things: the pressure and the energy density.

The pressure can be written:

$$P = \frac{nRT}{V} = \frac{NKT}{V} \tag{18}$$

The energy, according to Erik Verlinde [4], equals $NKT/2$, so we can easily calculate the energy density:

$$\rho c^2 = \frac{mc^2}{V} = \frac{E}{V} = \frac{NKT}{2V} \tag{19}$$

The bracketed part of the stress-energy tensor of an ideal fluid then becomes easy to evaluate:

$$\frac{P}{c^2} + \rho = \frac{P + \rho c^2}{c^2} = \frac{\frac{NKT}{V} + \frac{NKT}{2V}}{c^2} = \frac{3NKT}{2Vc^2} \tag{20}$$

The stress energy tensor becomes:

$$T_{\mu\nu} = \frac{3NKT}{2Vc^2}\, u_\mu u_\nu - \frac{NKT}{V} g_{\mu\nu} \tag{21}$$



It only remains to factor:

$$T_{\mu\nu} = \frac{NKT}{V}\left(\frac{3u_\mu u_\nu}{2c^2} - g_{\mu\nu}\right) \tag{22}$$

The stress-energy tensor depends on the metric tensor $g_{\mu\nu}$ and the four-velocity product $u_\mu u_\nu$, but above all on the volume $V$, the particle number $N$ and the temperature $T$. Indeed, when the temperature increases, the energy increases, due in particular to the increase in the speed of each particle. In addition, when the number of particles is high and they are confined in a smaller volume, the pressure exerted on the walls of that volume (for example a box) increases.
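A quick numeric check of Eqs. (18)-(20), with all sample values assumed purely for illustration, confirms the factor $3NKT/(2Vc^2)$:

```python
K = 1.380649e-23      # Boltzmann constant, J/K
c = 2.99792458e8      # speed of light, m/s
N, T, V = 1.0e23, 300.0, 1.0e-3   # particle count, temperature, volume (assumed)

P = N * K * T / V                 # pressure, Eq. (18)
rho = N * K * T / (2 * V * c**2)  # mass density from E = N*K*T/2, Eq. (19)

lhs = P / c**2 + rho              # bracketed factor of the fluid tensor
rhs = 3 * N * K * T / (2 * V * c**2)   # Eq. (20)
print(lhs, rhs)
assert abs(lhs - rhs) / rhs < 1e-12
```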

We can then formalize the Einstein equation as follows:

$$G_{\mu\nu} = \frac{8\pi G}{c^4}\,\frac{NKT}{V}\left(\frac{3u_\mu u_\nu}{2c^2} - g_{\mu\nu}\right) \tag{23}$$

Now we can rewrite this according to the Boltzmann-Shannon statistical entropy (16), which gives:

$$G_{\mu\nu} = \frac{8\pi G}{c^4}\,\frac{NT}{V}\,\frac{S(p)}{-\sum_i p_i \log p_i}\left(\frac{3u_\mu u_\nu}{2c^2} - g_{\mu\nu}\right) \tag{24}$$

Isolating the entropy:

$$S(p) = \frac{c^4 G_{\mu\nu} V}{8\pi G N T}\,\sum_i p_i \log\frac{1}{p_i}\,\left(\frac{3u_\mu u_\nu}{2c^2} - g_{\mu\nu}\right)^{-1} \tag{25}$$

What we notice is that this time, Einstein's equation is described directly and statistically by the laws of thermodynamics. Gravity becomes a force emerging from the macroscopic scale. Indeed, if one focuses on the role of $G_{\mu\nu}$ in $S(p)$, we notice that when the curvature increases, the entropy increases, and vice versa. It is therefore possible that entropy affects the curvature of space-time, and even that the curvature of space-time affects entropy. We can also formalize the Einstein equation in a quantum way: simply use the von Neumann entropy.

$$S(\rho) = -\frac{c^4 G_{\mu\nu} V}{8\pi G N T}\left(\frac{3u_\mu u_\nu}{2c^2} - g_{\mu\nu}\right)^{-1}\mathrm{tr}(\rho \log \rho) \tag{26}$$

The finding is the same as before: the curvature of space influences the entropy. However, the entropy here is quantum. This means that the smallest particle influences the curvature of space.

To show this, let us strip the equation down. Take the case of pure matter [11]. The stress-energy tensor of a continuous medium is written:

$$T_{\mu\nu} = \rho\, u_\mu u_\nu \tag{27}$$

We then have an equivalence with the energy over a volume: the pseudo-norm of the four-velocity product $u_\mu u^\mu$ is $c^2$, so we can write the scalar form of the stress-energy tensor:

$$T = \frac{NKT}{2V} \tag{28}$$



Einstein's equation then takes a scalar form:

$$G = \frac{8\pi G}{c^4}\,\frac{NKT}{2V} \tag{29}$$

Or, with $G$ expressed according to the entropy:

$$G = \frac{4\pi G}{c^4}\,\frac{NT}{V}\,\frac{S(p)}{-\sum_i p_i \log p_i} \tag{30}$$

We can then write:

$$S(p) = \frac{c^4 G V}{4\pi G N T}\,\sum_i p_i \log\frac{1}{p_i} \tag{31}$$

If we assume that the entropy term is maximal, i.e. $\sum_i p_i \log(1/p_i) = 1$, we have:

$$S = \frac{c^4 G V}{4\pi G N T} \tag{32}$$

Considering that there is only one particle and removing the constants, we have:

$$S = \frac{GV}{T} \tag{33}$$

$$G = \frac{ST}{V} \tag{34}$$

It is then found that the main variables defining the entropy are the curvature, the temperature and the volume. Similarly, the main variables defining the curvature are the entropy, the temperature and the volume. Take for example a box, numbered 1, of size $l$ (with $V_1 = l^3$), and another box, numbered 2, of size $L$ (with $V_2 = L^3$). If we deposit $x$ particles, all having the same speed $v$, in box 1, we can calculate its entropy, which we will denote $S_1$. Doing the same with box 2, we would have an entropy denoted $S_2$. We then note that if $l < L$, then $S_1 < S_2$. In conclusion, the smaller the volume of a box, the lower its entropy. This is quite normal: the more room the particles have to move, the greater the entropy, and so the harder it is to obtain information. The uncertainty is high. If all the particles are concentrated in a smaller volume, the entropy will be much lower. The volume thus plays a central role in the calculation of entropy. Finally, temperature plays an equally central role: the higher the temperature, the higher the thermal energy. The energy increases, so the curvature of space increases.
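The two-box comparison can be sketched with the volume-dependent part of the ideal-gas entropy, $S = NK\ln V + \text{const}$ (the full Sackur-Tetrode formula adds temperature terms); the box sizes and particle count below are assumed purely for illustration:

```python
import math

K = 1.380649e-23  # Boltzmann constant, J/K

def volume_entropy(n, volume):
    # Volume-dependent part of the ideal-gas entropy: S = n*K*ln(V) + const.
    return n * K * math.log(volume)

N = 1.0e20                          # particle count (assumed)
S1 = volume_entropy(N, 1.0e-6)      # box 1: V1 = l**3 = 1e-6 m^3
S2 = volume_entropy(N, 1.0e-3)      # box 2: V2 = L**3 = 1e-3 m^3

# l < L implies S1 < S2: the smaller box has the lower entropy.
print(S1, S2)
assert S1 < S2
```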

Now suppose that the temperature and the volume are constant. Proceeding as before, removing each constant, we would have:

$$G = S \tag{35}$$

This means that the entropy has a direct effect on the curvature, and vice versa. Obviously, this equation is "false", since we have simplified it to the maximum. However, it is useful for understanding the importance of entropy in space-time.



Finally, if the curvature were equal to 1 (a flat surface) and the temperature were constant, we would have:

$$S = V \tag{36}$$

This formula simply states that entropy increases or decreases when the volume increases or decreases, as we explained above.

Let us take again the previous equation for the maximum entropy in the scalar form of the perfect-fluid stress-energy tensor (32), and write it in this form:

$$S = \frac{G c}{\pi N T} \times \frac{c^3 V}{4G} \;\rightarrow\; S \equiv V \tag{37}$$

This equation is very similar to the one found in holographic theory, defining a correspondence between black holes and entropy [12, 13, 14]:

$$S = \frac{K c^3 A}{4 G \hbar} = \frac{K}{\hbar} \times \frac{c^3 A}{4G} \;\rightarrow\; S \equiv A \tag{38}$$

It can be assumed that the information of a system is contained/stored in a volume, as we have noted, and not on a surface as described by the equations of holographic theory [15]! The information is contained in a volume $V$ of space-time.
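For comparison, the holographic area law $S = Kc^3A/(4G\hbar)$ can be evaluated numerically for a Schwarzschild black hole; the solar-mass value below is a standard illustration, not a result from this paper:

```python
import math

K = 1.380649e-23        # Boltzmann constant, J/K
c = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # Newton's constant, m^3 kg^-1 s^-2
hbar = 1.054571817e-34  # reduced Planck constant, J s

def bekenstein_hawking_entropy(mass):
    # S = K * c^3 * A / (4 * G * hbar), with A the horizon area
    # of a Schwarzschild black hole of the given mass.
    r_s = 2 * G * mass / c**2      # Schwarzschild radius
    area = 4 * math.pi * r_s**2    # horizon area
    return K * c**3 * area / (4 * G * hbar)

M_sun = 1.989e30  # solar mass, kg
print(bekenstein_hawking_entropy(M_sun))  # on the order of 1e54 J/K

# S scales with the horizon area, i.e. with M^2:
# doubling the mass quadruples the entropy.
assert abs(bekenstein_hawking_entropy(2 * M_sun)
           / bekenstein_hawking_entropy(M_sun) - 4.0) < 1e-9
```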

5 Towards quantum gravity.

A study by David Edward Bruschi and his team suggested that there is indeed a link between gravity and entanglement [16, 17].

David Edward Bruschi, a physicist at the Institute for Quantum Physics in Jerusalem, sought to demonstrate that entanglement affects the gravitational field. Disturbances of the gravitational field are proportional to the intensity of the entanglement between two particles. Depending on the distance between the two particles, their energy, their state of coherence, and the strength of the quantum correlation, perturbative effects on the space-time metric emerge and alter the gravitational field through small disturbances. The proposed experiment involves Bose-Einstein condensates in which two particles are entangled, on different orbits and at different speeds. The authors sought to know whether the strength of entanglement could be altered by variations in the intensity of the gravitational field. For this, both microsatellites must first be placed on the same orbit; then one of them receives a sudden thrust which forces it onto a second orbit, undergoing a sudden change of speed and gravity. According to calculations and simulations, the physicists expect the entanglement between the condensates to lose 20% of its effectiveness.

What does "intensity of entanglement" mean? In fact, it is the degree of correlation between the two particles, and this degree can be calculated by the von Neumann entropy. According to the properties written by Haroche [18], I quote: "There is more information (less ignorance) in a correlated pair than in the sum of its parts (equal if A and B are uncorrelated)" and "the degree of entanglement of an AB system appears as the measure of the increase of our ignorance when we lose the ability to make measurements on the system as a whole and have local access to only one of the two subsystems."



It turns out that Verlinde's equation explains that entropy (the uncertainty of the information) is proportional to Newtonian gravitation! Moreover, after integrating the entropy, we find an equality between entropy and the potential energy of gravitation.

It then follows that an abrupt variation in the intensity of gravity is accompanied by an increase in entropy, as stipulated by Boltzmann's thermodynamics. This increase, in the sense of Shannon (or even Boltzmann-von Neumann), is seen as an increase in disorder and also in the uncertainty of the information.

6 General conclusion.

Gravitation is an emergent and informational force. It is defined according to the Boltzmann-Shannon entropy and its quantum version, the Boltzmann-von Neumann entropy. Gravity is stronger when the uncertainty of the information in the system is high.

If we look at Einstein's equation by solving the stress-energy tensor of perfect fluids, taking into account thermodynamics and information theory, we note that the entropy depends mainly on the temperature of the system, the curvature of space-time and the volume. It is then assumed that the information is recorded in a volume of space-time.

Holographic theory says that we would be a projection into 3 dimensions of a 2-dimensional space (a surface). I would say, in the sense of my equation, that we would be a projection into 3 dimensions of a volume of space-time, no matter how many dimensions it has. This is simpler and also more general: in the end, we may very well be a 3D projection of a space with 2, 3, 4 or more dimensions.

7 Bibliography.

[1] Richard Taillet (2013), équation d’Einstein, Podcast de l’université de Grenoble, relativité générale, épisode 11

[2] Eric Gourgoulhon (2013-2014), Relativité générale, CNRS, Observatoire de Paris

[3] Pierre Labastie (2010-11), MECANIQUE QUANTIQUE, L3 physique fondamentale, premier semestre 2010-11

[4] Eric Verlinde (2010), On the Origin of Gravity and the Laws of Newton, arXiv:1001.0785v1

[5] David Lapeyre (2011), La gravité, une force émergeante d’origine entropique, Science Etonnante

[6] Boltzmann (1902), Leçons sur la théorie des gaz, trad. fr. A. Galloti, Paris, Gabay, 1987

[7] C. E. Shannon (1948), A mathematical theory of communication, Bell System Tech. J. 27, p. 379-423, 623-656

[8] José-Philippe Perez (1998), L’entropie de Boltzmann et l’entropie de Shannon, même concept ?, Bulletin de l’union des physiciens, vol 92, p4

[9] Ricks Bradford (2008), Entropy and Its Inequalities, Formulation of Quantum Mechanics QM6



[10] Michel Le Bellac (2003), Introduction à l’information quantique, prétirage INLN 2003/08, institut non linéaire de nice UMR 6638

[11] André Lichnerowicz (1966), Étude mathématique des fluides thermodynamiques relativistes, Collège de France, Commun. Math. Phys. 1, 328-373

[12] J.D. Bekenstein (1973), Black holes and entropy, Phys. Rev. D 7, 2333

[13] J.D. Bekenstein (1972), Black Holes And The Second Law, Lett. Nuovo Cim. 4, 737

[14] J.D. Bekenstein (1974), Generalized second law of thermodynamics in black hole physics, Phys. Rev. D 9, 3292

[15] Leonard Susskind (1995), The World as a hologram, J. Math. Phys. 36, 6377, arXiv:hep-th/9409089

[16] G. Vallone, D. Bacco, D. Dequal, S. Gaiarin, V. Luceri, G. Bianco, P. Villoresi (2014), Experimental Satellite Quantum Communications, arXiv:1406.4051v1

[17] David Edward Bruschi (2015), On the weight of entanglement, arXiv:1412.4007v2

[18] Serge Haroche (2002), Mesure de l’intrication : Entropie de Shannon et Von Neumann, Chaire de physique quantique, 5ème leçon, année 2001-2002
