Preprints (www.preprints.org) | NOT PEER-REVIEWED | Posted: 5 March 2019 doi:10.20944/preprints201903.0053.v1
Entropic Description of Gravity via Thermodynamics, Relativistic Fluids and Information Theory
Roman Baudrimont
Entro-π Fluid Group
[email protected] | [email protected]
Abstract: The purpose of this paper is to present a new approach to unifying general relativity and quantum physics, drawing on thermodynamics, fluid mechanics and information theory. We will see that the Shannon, Boltzmann and von Neumann entropies can be a source of gravity, which would then be an emergent phenomenon. We first examine what stands in the way of unifying general relativity and quantum physics. Second, we explain the concept of entropic gravity by introducing Erik Verlinde's calculations. We then present the Boltzmann, Shannon and von Neumann entropies and the links between them. Next, we modify Einstein's equations by rewriting the perfect-fluid stress-energy tensor in terms of entropy. Finally, we relate our theory to experiments already carried out on the link between gravity and quantum theory.
Keywords: Holography, Quantum Gravity, General Relativity
1 Difficulties in formalizing quantum gravity.
General relativity [1, 2] is a theory of gravitation. Written by Einstein between 1907 and 1915, it states that the gravitational attraction previously described by Newton's equation is in fact a distortion of space-time caused by concentrations of energy. It is described, in simplified form, by this equation:
G_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu} \qquad (1)
This is a tensor equation. 8πG/c⁴ is a constant, and the tensor T_{μν} is the relativistic stress-energy tensor (a 4×4 matrix) derived from the stress tensor (a 3×3 matrix). We can also write the stress-energy tensor of a perfect fluid:
T_{\mu\nu} = \left(\frac{P}{c^2} + \rho\right) u_\mu u_\nu - P g_{\mu\nu} \qquad (2)
where P is the pressure, ρ the density, u_μ u_ν the product of four-velocities, and g_{μν} the metric tensor. Ultimately G_{μν} defines the curvature that space takes for a given constant and stress-energy tensor.
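As a quick numerical illustration, the perfect-fluid tensor of eq. (2) can be evaluated for a fluid at rest in flat space-time. The Minkowski metric with signature (+,−,−,−), natural units (c = 1) and the sample values below are assumptions of this sketch, not part of the theory:

```python
import numpy as np

# Perfect-fluid stress-energy tensor T_mn = (P/c^2 + rho) u_m u_n - P g_mn,
# evaluated in flat Minkowski space with signature (+,-,-,-).
c = 1.0                                  # natural units (assumption)
g = np.diag([1.0, -1.0, -1.0, -1.0])     # metric tensor g_mn
rho, P = 2.0, 0.5                        # arbitrary density and pressure
u = np.array([c, 0.0, 0.0, 0.0])         # four-velocity of a fluid at rest

T = (P / c**2 + rho) * np.outer(u, u) - P * g

# For a comoving observer the tensor is diagonal: energy density, then
# isotropic pressure along each spatial axis.
print(np.diag(T))   # [rho*c^2, P, P, P]
```

This recovers the familiar comoving form diag(ρc², P, P, P) of the perfect-fluid tensor.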
Quantum physics [3] is radically different from classical physics and relativity. Indeed, it is probabilistic: the evolution of a physical system is described by a wave function computable from the Schrödinger equation. One of the fundamental principles of quantum physics is this: as long as a quantum system has not been measured, its state is undefined. The Schrödinger equation is as follows:
i\hbar \frac{\partial}{\partial t}\psi(\vec{r}, t) = \hat{H}\,\psi(\vec{r}, t) \qquad (3)
© 2019 by the author(s). Distributed under a Creative Commons CC BY license.
With the Hamiltonian operator Ĥ:

\hat{H} = -\frac{\hbar^2}{2m}\Delta + V(\vec{r}, t) \qquad (4)
Note that the Hamiltonian gives the total energy of the system, with ℏ the reduced Planck constant. The term −ℏ²Δ/(2m) corresponds to the kinetic energy, and V(r⃗, t) to the potential energy of the system. The delta (Δ) is the Laplacian. This equation is first order with respect to time, so if we know the state of the system at the initial time, we can know the state of the system at any time t.
Finally, another fundamental principle of quantum physics is the Heisenberg uncertainty principle:
\Delta x\,\Delta p \ge \frac{\hbar}{2} \qquad (5)
This equation simply means that one cannot simultaneously determine the position and the momentum of a particle with infinite precision.
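The bound (5) can be checked numerically for a Gaussian wave packet, which saturates it exactly. The grid, the packet width and the natural units (ℏ = 1) below are assumptions of this sketch:

```python
import numpy as np

# Heisenberg bound check: a Gaussian wave packet saturates dx * dp = hbar/2.
hbar = 1.0                                       # natural units (assumption)
sigma = 0.7                                      # arbitrary packet width
x = np.linspace(-20, 20, 4001)
h = x[1] - x[0]
psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * h)       # normalize in position space

prob = np.abs(psi)**2
delta_x = np.sqrt(np.sum(x**2 * prob) * h)       # <x> = 0 by symmetry

# Momentum-space density via the Fourier transform of psi
p = 2 * np.pi * hbar * np.fft.fftfreq(x.size, d=h)
phi = np.fft.fft(psi)
dp_grid = abs(p[1] - p[0])
phi /= np.sqrt(np.sum(np.abs(phi)**2) * dp_grid)  # normalize in momentum space
delta_p = np.sqrt(np.sum(p**2 * np.abs(phi)**2) * dp_grid)

print(delta_x * delta_p)   # ~ hbar/2 = 0.5
```

Any non-Gaussian packet would give a strictly larger product, consistent with eq. (5).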
The main issue is to understand the origin of gravitation. Indeed, infinities appear as soon as one tries to express gravitation within quantum equations: quantum theories of gravity are not renormalizable. Is there, then, another formalism that would accommodate gravity in quantum equations? Does gravitation even exist at the quantum level? So... what is the origin of gravitation? Is it emergent?
This would in fact require a mechanics of "scale" and statistics. The goal is to move from a formalism at very small scale (call it the quantum level) to a macroscopic formalism (the thermodynamic scale, including the study of ideal gases), and finally to a large-scale formalism (the relativistic scale). For this, a particular formalism is needed: we will need basic formulas from information theory and thermodynamics.
The next part deals with the relationship between thermodynamics and gravitation. Another formalism of gravitation will thus be born.
2 Thermodynamics and gravitation: towards an entropic force of gravity.
Erik Verlinde [4, 5] published in April 2011 a 29-page paper entitled "On the Origin of Gravity and the Laws of Newton". He was the first to propose that gravity could be described in an entropic way. I will detail the calculations by taking the case of a gas contained in a closed box fitted with a movable piston, the system being kept at a temperature T. Since the pressure is the force exerted on a surface, and assuming the gas is ideal, we can write:
F = \frac{N k_B T}{X} \qquad (6)
with X the height of the piston. The work associated with the variation dX is F dX. It then follows from the first law of thermodynamics that dU = T dS − F dX, with S the entropy of the system. We can then write dS = dU/T + F dX/T, which at constant internal energy finally gives:
F = T\,\frac{\partial S}{\partial X} = \frac{N k_B T}{X} \qquad (7)
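The piston result can be verified numerically: assuming the isothermal ideal-gas entropy S(X) = N k_B ln X + const (the volume being proportional to the piston height X), the entropic force T ∂S/∂X reproduces the ideal-gas force of eq. (6). The numerical values below are illustrative assumptions:

```python
import numpy as np

# Entropic-force check for the piston: F = T * dS/dX with
# S(X) = N * k_B * ln(X) + const reproduces F = N * k_B * T / X.
k_B = 1.380649e-23                 # Boltzmann constant, J/K
N, T, X = 1e23, 300.0, 0.5         # illustrative values (assumptions)

def S(x):
    return N * k_B * np.log(x)     # isothermal ideal-gas entropy (up to a constant)

eps = 1e-8
dS_dX = (S(X + eps) - S(X - eps)) / (2 * eps)   # numerical derivative
F_entropic = T * dS_dX
F_ideal = N * k_B * T / X

print(F_entropic, F_ideal)   # the two agree
```

The force arises here purely from the statistical tendency of the gas to maximize its entropy, which is exactly the mechanism Verlinde proposes for gravity.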
But where is the gravitational force? Gravity in general relativity is in fact akin to a pressure force: a mass exerts a kind of "pressure" on a surface, space-time, causing its curvature. And recall one property of entropy: the entropy of a system of two particles is higher the closer the particles are. This statement parallels the force of gravity: the closer two objects are, the stronger the gravitational force. With two distinct masses m and M, and R the distance separating them, we then deduce the following formula:
T\,\frac{\partial S}{\partial X} = G\,\frac{mM}{R^2} = \frac{N k_B T}{X} \qquad (8)
There is another way to get the same result, starting from the equation for the average kinetic energy of ideal-gas molecules:
\frac{1}{2} m v^2 = \frac{3}{2} k_B T \qquad (9)
If we write v² = 2gh, we obtain the potential energy of gravity on Earth, i.e.:
mgh = \frac{3}{2} k_B T \qquad (10)
Finally, substituting g = GM/R² and moving h to the other side, we have:
G\,\frac{mM}{R^2} = \frac{3 k_B T}{2h} \qquad (11)
This is very similar to the result of Erik Verlinde.
What is striking in Verlinde's equation is that, by giving an entropic origin to gravity, the macroscopic gravitational force can be deduced without saying anything about what happens at the microscopic scale. The publication of this paper caused a stir among scientists and drew criticism, both positive and negative.
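The algebra leading from eq. (9) to eq. (11) can be checked symbolically, for instance with Python's sympy; the symbol names below simply mirror the text:

```python
import sympy as sp

# Symbolic check of eqs. (9) -> (11): start from the mean kinetic energy of
# an ideal-gas molecule, substitute v^2 = 2*g*h and g = G*M/R^2, then
# isolate the Newtonian force term G*m*M/R^2.
m, v, k_B, T, g, h, G, M, R = sp.symbols('m v k_B T g h G M R', positive=True)

kinetic = sp.Eq(sp.Rational(1, 2) * m * v**2, sp.Rational(3, 2) * k_B * T)
step1 = kinetic.subs(v**2, 2 * g * h)        # (1/2) m (2gh) = (3/2) k_B T
step2 = step1.subs(g, G * M / R**2)          # m G M h / R^2 = (3/2) k_B T

G_sol = sp.solve(step2, G)[0]                # solve for G
force = sp.simplify(G_sol * M * m / R**2)    # rebuild G m M / R^2

print(force)   # -> 3 k_B T / (2 h), i.e. eq. (11)
```

The symbolic result matches the right-hand side of eq. (11) exactly.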
3 Entropy and information.
But we will go further than that. Indeed, we will see what entropy really is, and later integrate it into the Einstein equations. There are various equational forms of entropy, which we will now review. The first is the Boltzmann entropy [6], which is written:
S(\Omega) = k_B \ln \Omega \qquad (12)
This equation defines the microcanonical entropy of a physical system at macroscopic equilibrium, but left free to evolve on a microscopic scale between Ω different micro-states (also called the number of complexions, or the number of configurations of the system). Its unit is the joule per kelvin (J/K). Entropy is the key point of the second law of thermodynamics, which states that "Any transformation of a thermodynamic system is performed with an increase of the overall entropy, including the entropy of the system and of the external environment. We then say that there is creation of entropy."; "The entropy of an isolated system can only increase or remain constant."
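Equation (12) can be evaluated directly for a simple microcanonical count. A minimal sketch, assuming N distinguishable particles split between two compartments, where the number of micro-states with n particles in the first compartment is the binomial coefficient Ω(n) = C(N, n):

```python
import math

# Boltzmann entropy S = k_B * ln(Omega) for N distinguishable particles
# split between two compartments: Omega(n) = C(N, n) micro-states with
# n particles in the first compartment.  Entropy is minimal (zero) when
# all particles sit on one side and maximal for the even split.
k_B = 1.380649e-23   # J/K
N = 100

def S(n):
    return k_B * math.log(math.comb(N, n))

print(S(0), S(N), S(N // 2))   # S(0) = S(N) = 0 < S(N/2)
```

This counting anticipates the two-compartment example discussed later in this section.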
There is also the Shannon formula [7]. The Shannon entropy, due to Claude Shannon, is a mathematical function that corresponds to the amount of information contained in or delivered by an information source. The more redundant the source, the less information it contains; entropy is maximal for a source whose symbols are all equally likely. The Shannon entropy can be seen as measuring the amount of uncertainty of a random event, or more precisely of its distribution. Generally, the logarithm is taken in base 2 (binary). Its formula is:
S(p) = \sum_i p_i \log \frac{1}{p_i} \qquad (13)
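Equation (13) is straightforward to compute. A minimal sketch in base 2, so the result is in bits:

```python
import math

# Shannon entropy in bits: S(p) = sum_i p_i * log2(1/p_i).
# Terms with p_i = 0 contribute nothing (x * log(1/x) -> 0 as x -> 0).
def shannon_entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit: fair coin, maximal uncertainty
print(shannon_entropy([1.0]))         # 0.0 bits: certain outcome
print(shannon_entropy([0.25] * 4))    # 2.0 bits: four equally likely symbols
```

As the text says, equiprobable symbols maximize the entropy, while a certain outcome carries no information.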
However, one can also define an entropy in quantum theory [10], particularly used in quantum cryptography (via the properties of entanglement), called the von Neumann entropy:
H(\rho) = -\mathrm{tr}(\rho \log \rho) = \sum_i p_i \log \frac{1}{p_i} \qquad (14)
with the density matrix ρ expressed in an orthonormal basis |aᵢ⟩:

\rho = \sum_i p_i\,|a_i\rangle\langle a_i| \qquad (15)
The von Neumann entropy is identical to that of Shannon, except that it uses the variable ρ, a density matrix. As written by Serge Haroche, this equation can be used to measure the degree of entanglement of two particles: if two particles are not entangled, the entropy of a subsystem is zero. Conversely, if the entanglement between the two particles is maximal, the subsystem entropy is maximal, since we no longer have full access to the subsystem. In classical mechanics zero entropy means that the events are certain (only one possibility), while in quantum mechanics it means that the density matrix describes a pure state |ψ⟩. But in quantum physics measurements are generally unpredictable, because the probability distribution depends on the wave function and on the observable. This is also expressed by the Heisenberg uncertainty principle: if, for example, we have more information (so less entropy) about the momentum of a particle, we have less information about its position (more entropy). This implies that quantum physics is always immersed in entropy, even when the entropy is low.
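A minimal numerical sketch of eqs. (14)–(15), assuming a Bell state as the example of maximal entanglement: the total state is pure (entropy 0), while each subsystem, obtained by a partial trace, is maximally mixed (entropy 1 bit):

```python
import numpy as np

# Von Neumann entropy H(rho) = -tr(rho log2 rho), computed from the
# eigenvalues of the density matrix (base 2, so the result is in bits).
def von_neumann_entropy(rho):
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # 0 * log(0) = 0 by convention
    h = -np.sum(evals * np.log2(evals))
    return float(max(h, 0.0))

# Bell state |phi+> = (|00> + |11>) / sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(phi, phi)                  # pure-state density matrix

# Partial trace over the second qubit -> reduced density matrix of qubit 1
rho4 = rho.reshape(2, 2, 2, 2)
rho_A = np.einsum('ikjk->ij', rho4)

print(von_neumann_entropy(rho))    # 0.0 : the total state is pure
print(von_neumann_entropy(rho_A))  # 1.0 : maximally mixed subsystem
```

The one-bit subsystem entropy quantifies exactly the inaccessibility of the subsystem described in the text.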
Now that we know the Boltzmann entropy and the Shannon entropy, we can merge the two, giving the Boltzmann–Shannon entropy, or statistical entropy [8]. If we consider a thermodynamic system that can be in several microscopic states i with probabilities pᵢ, the statistical entropy is then:
S(p) = k_B \sum_i p_i \log \frac{1}{p_i} \qquad (16)
The Boltzmann–von Neumann entropy, equivalent to the above equation, is:
S(\rho) = -k_B\,\mathrm{tr}(\rho \log \rho) \qquad (17)
This function is paramount, and it will be used constantly in our theory of entropic gravity. Its units are the bit (binary) and the joule per kelvin.
Imagine [9] two compartments (or rooms, as you prefer) next to each other, in which particles are present. Let us call the first compartment A and the other B. If all the particles are in A, or all the particles are in B, mathematically the entropy is minimal and the uncertainty about the information is zero. If, on the other hand, the particles are shared equally between A and B (as many particles in A as in B), the entropy is maximal (great disorder), so the uncertainty about the information is maximal.
With this interpretation, we note that the more dispersed the particles (probabilities less than 1), the higher the entropy. This is the case for our particles separated equally between A and B (pᵢ = 1/2, S = 2 × ½ × log₂ 2 = 1 bit). In the opposite case (all the particles in A, or all in B), the information is certain, the probability is 1, so the entropy is zero.
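The compartment example can be made quantitative: writing p for the fraction of particles in A, a short sketch confirms that the binary entropy peaks at 1 bit for the even split p = 1/2 and vanishes at p = 0 or p = 1:

```python
import math

# Entropy of the two-compartment split as a function of the fraction p of
# particles in A: S(p) = p*log2(1/p) + (1-p)*log2(1/(1-p)).
def split_entropy(p):
    if p in (0.0, 1.0):
        return 0.0                       # certain configuration: zero entropy
    return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

samples = [i / 10 for i in range(11)]
best = max(samples, key=split_entropy)
print(best, split_entropy(best))              # 0.5 1.0 : maximum at the even split
print(split_entropy(0.0), split_entropy(1.0))  # 0.0 0.0 : all particles on one side
```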
In the end, the Shannon entropy and the Boltzmann entropy express the same concept.
To better understand the need to link information to thermodynamics, we must understand the scope of influence of information theory in physics and the problems it could solve. A strong example is the information paradox: from the point of view of general relativity, information can completely disappear into a black hole; this is irreversible. In quantum physics, however, information must always be preserved, and evolution is reversible (previous states can be recovered).
4 The modified Einstein equation: a proposal for a thermodynamic form combining entropic gravity and information theory.
We wrote the simplified Einstein equation above. In this section, we will write the Einstein equation with the perfect-fluid stress-energy tensor. For this, we will compute two things: the pressure and the energy density.
The pressure can be written:
P = \frac{nRT}{V} = \frac{N k_B T}{V} \qquad (18)
[4] According to Erik Verlinde, the energy equals N k_B T/2, so we can easily compute the energy density: