Causality Is an Effect, II
Lawrence S. Schulman
Physics Department, Clarkson University, Potsdam, NY 13699-5820, USA; [email protected]

Abstract: Causality follows the thermodynamic arrow of time, where the latter is defined by the direction of entropy increase. After a brief review of an earlier version of this article, rooted in classical mechanics, we give a quantum generalization of the results. The quantum proofs are limited to a gas of Gaussian wave packets.

Keywords: causality; arrow of time; entropy increase; quantum generalization; Gaussian wave packets

1. Introduction

The history of multiple time boundary conditions goes back, as far as I know, to Schottky [1,2], who, in 1921, considered a single slice of time inadequate for prediction or retrodiction (see Appendix A). There was later work of Watanabe [3] concerned with prediction and retrodiction. Then Schulman [4] used this as a conceptual way to eliminate "initial conditions" prejudice from Gold's [5] rationale for the arrow of time, and Wheeler [6,7] discussed two-time boundary conditions. Gell-Mann and Hartle also contributed to this subject [8] and include a review of some previous work. Finally, Aharonov et al. [9,10] proposed that this could solve the measurement problem of quantum mechanics, although this is disputed [11]. See also Appendix B.

This formalism has also allowed me to come to a conclusion: effect follows cause in the direction of entropy increase. This result is not unanticipated; most likely everything having to do with arrows of time has been anticipated. What is unusual, however, is the ability to prove the thesis mathematically.

The proof given in [12] only applies to classical mechanics. The present work extends it to quantum mechanics. Moreover, since the previous paper was buried in conference proceedings, and the present work uses similar arguments, I will repeat some parts of that proof. Note, though, that the present work is extremely limited: it applies to particles in a gas with Gaussian wave functions. I think it should apply more generally, but that is not what I prove.

A remark is in order on the arrow of time. Often the sequence of cause and effect is taken as primary [13], but in the present context this is shown not to hold. Our argument depends on two conditions: the nature of the perturbation and two-time boundary conditions. Both will be elaborated on. A future condition on quantum problems may make measurement deterministic, but it is hollow in that it does not lead (to my knowledge) to testable claims. There is the possibility of advanced effects [6,14], but so far this has not proved measurable.

2. Perturbation
A "cause" can be a change (or intervention, in the dynamics or in the initial conditions) or it can be a precursor ("pre" in the usual meaning of the term). In a change, you can compare the outcome of two similar but not identical "causes." The case of "precursor" is different. You can say one thing leads to another, A is a precursor of B, but you only consider A, not some variant, say A′. I will adopt the meaning "change." This allows the study of "effects," whereas I do not know how to examine ordinary time evolution. One should bear in mind, however, that in ordinary language both meanings of "cause" can be used. This is also discussed in Appendix B of [12], where a rationale for this choice is given.

3. Review of Previous Work

The argument for causality to follow the direction of entropy increase depends on the dynamics. The principal assumption on the dynamics is that there is a relaxation time; call it t (this is aside from the cosmological assumptions, discussed below). Let the dynamics be f, so that in one unit of time a subset $\mathcal{E}$ of the phase space goes from $\mathcal{E}$ to $f\mathcal{E}$. Then, by giving two times separated by more than $2t$ and low entropy at both the beginning and the end, one gets an increase, then a constant stretch, then a decline of the entropy. Thus, the set of points satisfying both boundary conditions ($\mathcal{E}_0$ at 0 and $\mathcal{E}_T$ at $T$) is, at time 0,

$$
\mathcal{E} \;=\; \mathcal{E}_0 \,\cap\, f^{-T}\mathcal{E}_T \,. \tag{1}
$$

Note that both the beginning and the end are forced to have low entropy (by definition, assuming $\mathcal{E}_0$ and $\mathcal{E}_T$ are small). This is illustrated for the "cat map" in the first graph of Figure 1. See Appendix C for information on the "cat map" and on our procedure.

Figure 1. The entropy as a function of time for the "cat map" is plotted. In all three figures the unperturbed entropy is shown. In the second figure the perturbation (with circles around values) takes place at time t = 3, while in the third figure the perturbation is at time t = 13. In the latter two cases the change in macroscopic behavior is subsequent in the sense of increasing entropy. (The graphs are inverted from the usual definition of entropy.)

Then we do something different! Two times for the perturbation are contemplated: the first during the rise of the entropy, the second during its fall. By detailed calculation (not reproduced here, but see below, Section 5) it is found that causality, the macroscopic change in behavior, follows the direction of entropy increase; that is, in the falling-entropy case the macroscopic behavior changes earlier in the time parameter (which is neutral, i.e., has two different arrows and in between does not have an arrow).

Intuitively, this is simple. Let the perturbation be at $t_0$ with $0 < t_0 < t < T - t < T$. Between 0 and $t_0$ (before the perturbation) the perturbed and unperturbed systems have the same boundary conditions; hence their macroscopic behavior is similar (but the microscopic behavior is, in general, different). After the perturbation there is no constraint, i.e., no effective boundary condition: both systems go to equilibrium, from which there is adequate time (recall $T > 2t$) to reach (whether perturbed or not) a specific region within the unit square. There was, however, a perturbation, and there are different microscopic and macroscopic paths. Similarly, for $T - t < t_0 < T$ the boundary conditions at $\mathcal{E}_T$ and $t_0$ (just after the perturbation in the neutral parameter $t$) are the same for the perturbed and unperturbed systems; hence the macroscopic behavior is the same. However, they both go (in $-t$) to equilibrium; hence the two systems (perturbed and unperturbed) have different macroscopic paths (N.b., distinguish carefully the words "macroscopic" and "microscopic"). This is illustrated in the case of the cat map, both for $0 < t_0 < t$ and $T - t < t_0 < T$. See Figure 1, second and third images.
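Although the detailed cat-map procedure is deferred to Appendix C, the construction in Equation (1) is easy to try numerically. The following is a minimal sketch of mine (not the code behind Figure 1): it samples the low-entropy region $\mathcal{E}_0$, keeps only those points whose $T$-step image under the cat map lands in $\mathcal{E}_T$, and follows the coarse-grained entropy of the conditioned ensemble. The grain size, the two small square regions, $T$, and the sample size are choices made for this illustration, not the paper's parameters.

```python
import numpy as np

# Arnold cat map on the unit square: (x, y) -> (2x + y, x + y) mod 1.
CAT = np.array([[2, 1], [1, 1]])

def evolve(pts, steps=1):
    """Apply the cat map `steps` times to an array of points of shape (N, 2)."""
    for _ in range(steps):
        pts = (pts @ CAT.T) % 1.0
    return pts

def coarse_entropy(pts, grains=16):
    """Coarse-grained entropy -sum p ln p over a grains x grains partition of the square."""
    ix = np.floor(pts * grains).astype(int).clip(0, grains - 1)
    counts = np.bincount(ix[:, 0] * grains + ix[:, 1], minlength=grains * grains)
    p = counts[counts > 0] / len(pts)
    return float(-np.sum(p * np.log(p)))

def in_ET(p):
    """Membership test for the final low-entropy region E_T = (0.9, 1)^2."""
    return (p[:, 0] > 0.9) & (p[:, 1] > 0.9)

rng = np.random.default_rng(0)
T = 20                                         # total time, well beyond twice the relaxation time
sample_E0 = rng.random((2_000_000, 2)) * 0.1   # uniform sample of E_0 = [0, 0.1)^2

# Equation (1): E = E_0 ∩ f^{-T} E_T, realized by keeping only those sampled
# points of E_0 whose image after T steps lies in E_T.
conditioned = sample_E0[in_ET(evolve(sample_E0, steps=T))]

# Entropy as a function of time: low, rising, flat near equilibrium, falling, low.
pts, history = conditioned, []
for _ in range(T + 1):
    history.append(coarse_entropy(pts))
    pts = evolve(pts)
print(np.round(history, 2))
```

Reproducing the second and third panels of Figure 1 would additionally require the perturbation step discussed in Section 5; the sketch above only produces the unperturbed curve.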
4. Quantum Version

The first step is to show that with low-entropy conditions at both (distantly separated) times, the entropy increases in between. To calculate entropy using subspaces of Hilbert space, presence or absence in a given subspace must be defined and the number of states counted. This leads to using regions of 6-dimensional x-p space (to mimic the classical phase space), and that is possible. What we do is take a region of phase space and select a basis of states whose maximum value lies in this region. Coherent states will do the job. For convenience, the value of the spread used to define those states might be the fixed point of [15], but that is not necessary. Finally, the dimension of the Hilbert subspace will be the number of states in that region. What this means is that one can go back to the classical method of using (the logarithm of) volume in phase space as a measure of entropy.

Therefore, as was done previously, we take coarse grains that are regions of phase space (coordinates and momenta). The density matrix involves mainly the diagonal elements. Thus, even for identical particles, the separation of locale as well as the separation of momenta makes the density matrix nearly diagonal. For more on this theme, see [16]. Even for identical particles this leads to cancellation. In one dimension the wave function for a pair of Gaussians is (not normalized)

$$
\exp\!\left\{-\left[\frac{(x_1-x_a)^2}{4\sigma_a^2}+\frac{(x_2-x_b)^2}{4\sigma_b^2}\right]+ik_a x_1+ik_b x_2\right\}
\;\pm\;
\exp\!\left\{-\left[\frac{(x_2-x_a)^2}{4\sigma_a^2}+\frac{(x_1-x_b)^2}{4\sigma_b^2}\right]+ik_a x_2+ik_b x_1\right\}. \tag{2}
$$

(The variables are $x_1$ and $x_2$; all the others are constants.) The diagonal elements of the density matrix (calculated from Equation (2)) already show signs of cancellation, as follows:

$$
\rho(x_1,x_2;x_1,x_2)=
\exp\!\left[-\frac{(x_1-x_a)^2}{2\sigma_a^2}-\frac{(x_2-x_b)^2}{2\sigma_b^2}\right]
+\exp\!\left[-\frac{(x_2-x_a)^2}{2\sigma_a^2}-\frac{(x_1-x_b)^2}{2\sigma_b^2}\right]
$$
$$
\qquad
+\exp\!\left[-\frac{(x_1-x_a)^2}{4\sigma_a^2}-\frac{(x_2-x_b)^2}{4\sigma_b^2}
-\frac{(x_2-x_a)^2}{4\sigma_a^2}-\frac{(x_1-x_b)^2}{4\sigma_b^2}\right]
\times 2\left\{\begin{matrix}\cos\\ i\sin\end{matrix}\right\}\!\big((k_a-k_b)[x_1-x_2]\big). \tag{3}
$$

As is evident, if $x_a$ is significantly different from $x_b$, or $k_a$ from $k_b$, then there is already cancellation or rapid oscillation. With more particles the effect is stronger.
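To see the cancellation concretely, one can evaluate the (un-normalized) $|\psi|^2$ of Equation (2) numerically and isolate the interference (cross) term appearing in Equation (3). The sketch below is my own illustration, not code from the paper; the function names (`packet`, `rho_diag`, `interference`) and all parameter values are assumptions made for the example, and the symmetric (+) combination is used.

```python
import numpy as np

def packet(x1, x2, xa, xb, ka, kb, sa, sb):
    """One term of Eq. (2): particle 1 in the Gaussian at (xa, ka), particle 2 at (xb, kb)."""
    return np.exp(-(x1 - xa)**2 / (4 * sa**2) - (x2 - xb)**2 / (4 * sb**2)
                  + 1j * (ka * x1 + kb * x2))

def rho_diag(x1, x2, sign=+1, **kw):
    """Diagonal density-matrix element rho(x1, x2; x1, x2) = |psi|^2 for psi of Eq. (2)."""
    a = packet(x1, x2, **kw)          # direct assignment of particles to packets
    b = packet(x2, x1, **kw)          # exchanged assignment
    return np.abs(a + sign * b)**2

def interference(x1, x2, sign=+1, **kw):
    """Cross term: |psi|^2 minus the incoherent sum |a|^2 + |b|^2."""
    a = packet(x1, x2, **kw)
    b = packet(x2, x1, **kw)
    return np.abs(a + sign * b)**2 - (np.abs(a)**2 + np.abs(b)**2)

overlapping    = dict(xa=0.0, xb=0.5,  ka=2.0, kb=-2.0, sa=1.0, sb=1.0)
well_separated = dict(xa=0.0, xb=10.0, ka=2.0, kb=-2.0, sa=1.0, sb=1.0)

x = np.linspace(-5.0, 15.0, 401)
X1, X2 = np.meshgrid(x, x)
for label, params in (("overlapping", overlapping), ("well separated", well_separated)):
    ratio = np.max(np.abs(interference(X1, X2, **params))) / np.max(rho_diag(X1, X2, **params))
    print(f"{label:>15s}: max |cross term| / max rho = {ratio:.2e}")
```

When the packets are well separated in position (or differ greatly in momentum, so that the cosine oscillates rapidly across any coarse grain), the interference contribution is negligible and the density matrix is effectively that of distinguishable particles, which is the point made above.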