Hidden Determinism, Probability, and Time's Arrow
Total pages: 16
File type: PDF, size: 1020 KB
Recommended publications
- Measure-Theoretic Probability I
Measure-Theoretic Probability I. Steven P. Lalley, Winter 2017. 1 Measure Theory. 1.1 Why Measure Theory? There are two different views – not necessarily exclusive – on what “probability” means: the subjectivist view and the frequentist view. To the subjectivist, probability is a system of laws that should govern a rational person’s behavior in situations where a bet must be placed (not necessarily just in a casino, but in situations where a decision must be made about how to proceed when only imperfect information about the outcome of the decision is available, for instance, should I allow Dr. Scissorhands to replace my arthritic knee by a plastic joint?). To the frequentist, the laws of probability describe the long-run relative frequencies of different events in “experiments” that can be repeated under roughly identical conditions, for instance, rolling a pair of dice. For the frequentist interpretation, it is imperative that probability spaces be large enough to allow a description of an experiment, like dice-rolling, that is repeated infinitely many times, and that the mathematical laws should permit easy handling of limits, so that one can make sense of things like “the probability that the long-run fraction of dice rolls where the two dice sum to 7 is 1/6”. But even for the subjectivist, the laws of probability should allow for description of situations where there might be a continuum of possible outcomes, or possible actions to be taken. Once one is reconciled to the need for such flexibility, it soon becomes apparent that measure theory (the theory of countably additive, as opposed to merely finitely additive, measures) is the only way to go.
- An Introduction to Quantum Field Theory
AN INTRODUCTION TO QUANTUM FIELD THEORY. By Dr M Dasgupta, University of Manchester. Lecture presented at the School for Experimental High Energy Physics Students, Somerville College, Oxford, September 2009. Contents: 0 Prologue; 1 Introduction; 1.1 Lagrangian formalism in classical mechanics; 1.2 Quantum mechanics; 1.3 The Schrödinger picture; 1.4 The Heisenberg picture; 1.5 The quantum mechanical harmonic oscillator; Problems; 2 Classical Field Theory; 2.1 From N-point mechanics to field theory; 2.2 Relativistic field theory; 2.3 Action for a scalar field; 2.4 Plane wave solution to the Klein-Gordon equation ...
- Probability and Statistics Lecture Notes
Probability and Statistics Lecture Notes. Antonio Jiménez-Martínez. Chapter 1: Probability spaces. In this chapter we introduce the theoretical structures that will allow us to assign probabilities in a wide range of probability problems. 1.1. Examples of random phenomena. Science attempts to formulate general laws on the basis of observation and experiment. The simplest and most used scheme of such laws is: if a set of conditions B is satisfied ⟹ event A occurs. Examples of such laws are the law of gravity, the law of conservation of mass, and many other instances in chemistry, physics, biology... If event A occurs inevitably whenever the set of conditions B is satisfied, we say that A is certain or sure (under the set of conditions B). If A can never occur whenever B is satisfied, we say that A is impossible (under the set of conditions B). If A may or may not occur whenever B is satisfied, then A is said to be a random phenomenon. Random phenomena are our subject matter. Unlike certain and impossible events, the presence of randomness implies that the set of conditions B does not reflect all the necessary and sufficient conditions for the event A to occur. It might seem, then, impossible to make any worthwhile statements about random phenomena. However, experience has shown that many random phenomena exhibit a statistical regularity that makes them subject to study. For such random phenomena it is possible to estimate the chance of occurrence of the random event. This estimate can be obtained from laws, called probabilistic or stochastic, with the form: if a set of conditions B is satisfied repeatedly n times ⟹ event A occurs m times out of the n repetitions.
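To make the frequency form of such stochastic laws concrete, here is a minimal Python sketch, added here as an illustration and not part of Jiménez-Martínez's notes: it repeats the "set of conditions B" (rolling a fair die) n times and reports the fraction m/n of repetitions in which the event "a six shows" occurs, which settles near 1/6 as n grows. The function name relative_frequency and the choice of event are assumptions made for this example.

```python
import random

random.seed(0)

def relative_frequency(n_repetitions: int) -> float:
    """Fraction of n repetitions in which the event 'the die shows a six' occurs."""
    m = sum(1 for _ in range(n_repetitions) if random.randint(1, 6) == 6)
    return m / n_repetitions

for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))  # the ratio m/n settles near 1/6 ≈ 0.1667 as n grows
```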
- On the Time Evolution of Wavefunctions in Quantum Mechanics
On the Time Evolution of Wavefunctions in Quantum Mechanics. C. David Sherrill, School of Chemistry and Biochemistry, Georgia Institute of Technology, October 1999. 1 Introduction. The purpose of these notes is to help you appreciate the connection between eigenfunctions of the Hamiltonian and classical normal modes, and to help you understand the time propagator. 2 The Classical Coupled Mass Problem. Here we will review the results of the coupled mass problem, Example 1.8.6 from Shankar. This is an example from classical physics which nevertheless demonstrates some of the essential features of coupled degrees of freedom in quantum mechanical problems and a general approach for removing such coupling. The problem involves two objects of equal mass, connected to two different walls and also to each other by springs. Using F = ma and Hooke’s law (F = −kx) for the springs, and denoting the displacements of the two masses as x1 and x2, it is straightforward to deduce equations for the acceleration (second derivative in time, ẍ1 and ẍ2): ẍ1 = −(2k/m) x1 + (k/m) x2 (1), and ẍ2 = (k/m) x1 − (2k/m) x2 (2). The goal of the problem is to solve these second-order differential equations to obtain the functions x1(t) and x2(t) describing the motion of the two masses at any given time. Since they are second-order differential equations, we need two initial conditions for each variable, i.e., x1(0), ẋ1(0), x2(0), and ẋ2(0). Our two differential equations are clearly coupled, since ẍ1 depends not only on x1, but also on x2 (and likewise for ẍ2).
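The coupled equations above can be decoupled by diagonalizing the coefficient matrix, which is the "general approach for removing such coupling" the notes refer to. The following Python sketch is not from Sherrill's notes; the values k = m = 1 and all variable names are illustrative assumptions. It finds the classical normal-mode frequencies numerically.

```python
import numpy as np

k, m = 1.0, 1.0                            # illustrative spring constant and mass
A = np.array([[-2 * k / m,      k / m],
              [      k / m, -2 * k / m]])  # x'' = A x, with x = (x1, x2)

eigvals, eigvecs = np.linalg.eigh(A)       # A is real symmetric, so eigh applies
omegas = np.sqrt(-eigvals)                 # trial solution x ~ e^{i w t} gives w^2 = -eigenvalue

print(omegas)   # sqrt(3k/m) and sqrt(k/m): the two normal-mode frequencies
print(eigvecs)  # columns ~ (1,-1)/sqrt(2) (out-of-phase) and (1,1)/sqrt(2) (in-phase) mode shapes
```

In the normal-mode coordinates the two equations separate into independent oscillators, which is the classical analogue of diagonalizing the Hamiltonian discussed later in the notes.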
- An S-Matrix for Massless Particles (arXiv:1911.06821v2 [hep-th])
An S-Matrix for Massless Particles. Holmfridur Hannesdottir and Matthew D. Schwartz, Department of Physics, Harvard University, Cambridge, MA 02138, USA. Abstract: The traditional S-matrix does not exist for theories with massless particles, such as quantum electrodynamics. The difficulty in isolating asymptotic states manifests itself as infrared divergences at each order in perturbation theory. Building on insights from the literature on coherent states and factorization, we construct an S-matrix that is free of singularities order-by-order in perturbation theory. Factorization guarantees that the asymptotic evolution in gauge theories is universal, i.e. independent of the hard process. Although the hard S-matrix element is computed between well-defined few-particle Fock states, dressed/coherent states can be seen to form as intermediate states in the calculation of hard S-matrix elements. We present a framework for the perturbative calculation of hard S-matrix elements combining Lorentz-covariant Feynman rules for the dressed-state scattering with time-ordered perturbation theory for the asymptotic evolution. With hard cutoffs on the asymptotic Hamiltonian, the cancellation of divergences can be seen explicitly. In dimensional regularization, where the hard cutoffs are replaced by a renormalization scale, the contribution from the asymptotic evolution produces scaleless integrals that vanish. A number of illustrative examples are given in QED, QCD, and N = 4 super Yang-Mills theory. (arXiv:1911.06821v2 [hep-th], 24 Aug 2020.) Contents: 1 Introduction; 2 The hard S-matrix; 2.1 S_H and dressed states; 2.2 Computing observables using S_H; 2.3 Soft Wilson lines; 3 Computing the hard S-matrix; 3.1 Asymptotic region Feynman rules ...
- An Introduction to Mathematical Modelling
An Introduction to Mathematical Modelling. Glenn Marion, Bioinformatics and Statistics Scotland. Given 2008 by Daniel Lawson and Glenn Marion. Contents: 1 Introduction; 1.1 What is mathematical modelling?; 1.2 What objectives can modelling achieve?; 1.3 Classifications of models; 1.4 Stages of modelling; 2 Building models; 2.1 Getting started; 2.2 Systems analysis; 2.2.1 Making assumptions; 2.2.2 Flow diagrams; 2.3 Choosing mathematical equations; 2.3.1 Equations from the literature; 2.3.2 Analogies from physics; 2.3.3 Data exploration; 2.4 Solving equations; 2.4.1 Analytically; 2.4.2 Numerically; 3 Studying models; 3.1 Dimensionless form; 3.2 Asymptotic behaviour; 3.3 Sensitivity analysis; 3.4 Modelling model output; 4 Testing models; 4.1 Testing the assumptions; 4.2 Model structure; 4.3 Prediction of previously unused data; 4.3.1 Reasons for prediction errors; 4.4 Estimating model parameters; 4.5 Comparing two models for the same system ...
- The Probability Set-Up.pdf
CHAPTER 2. The probability set-up. 2.1. Basic theory of probability. We will have a sample space, denoted by S (sometimes Ω), that consists of all possible outcomes. For example, if we roll two dice, the sample space would be all possible pairs made up of the numbers one through six. An event is a subset of S. Another example is to toss a coin 2 times, and let S = {HH, HT, TH, TT}; or to let S be the possible orders in which 5 horses finish in a horse race; or S the possible prices of some stock at closing time today; or S = [0, ∞), the age at which someone dies; or S the points in a circle, the possible places a dart can hit. We should also keep in mind that the same setting can be described using different sample sets. For example, in two solutions in Example 1.30 we used two different sample sets. 2.1.1. Sets. We start by describing elementary operations on sets. By a set we mean a collection of distinct objects called elements of the set, and we consider a set as an object in its own right. Set operations: Suppose S is a set. We say that A ⊂ S, that is, A is a subset of S, if every element in A is contained in S; A ∪ B is the union of sets A ⊂ S and B ⊂ S and denotes the points of S that are in A or B or both; A ∩ B is the intersection of sets A ⊂ S and B ⊂ S and is the set of points that are in both A and B; ∅ denotes the empty set; Aᶜ is the complement of A, that is, the points in S that are not in A.
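As a concrete illustration of these set operations, added here and not part of the chapter, the coin-tossing sample space S = {HH, HT, TH, TT} mentioned above can be manipulated directly with Python sets; the events A and B below are assumptions chosen for the example.

```python
S = {"HH", "HT", "TH", "TT"}   # toss a coin 2 times
A = {"HH", "HT"}               # event: first toss is heads
B = {"HH", "TH"}               # event: second toss is heads

print(A | B)                   # union: points of S in A or B or both -> {'HH', 'HT', 'TH'}
print(A & B)                   # intersection: points in both A and B -> {'HH'}
print(S - A)                   # complement A^c: points of S that are not in A -> {'TH', 'TT'}
print(A & (S - A) == set())    # A and its complement are disjoint -> True
```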
- Random Variable = A Real-Valued Function of an Outcome, X = f(outcome)
Random Variables (Chapter 2). Random variable = a real-valued function of an outcome, X = f(outcome). Domain of X: the sample space of the experiment. Ex: Consider an experiment consisting of 3 Bernoulli trials. Bernoulli trial = only two possible outcomes – success (S) or failure (F). • “IF” statement: if … then “S” else “F”. • Examine each component: S = “acceptable”, F = “defective”. • Transmit binary digits through a communication channel: S = “digit received correctly”, F = “digit received incorrectly”. Suppose the trials are independent and each trial has a probability ½ of success. X = # successes observed in the experiment. Possible values: 0, 1, 2, 3. Outcome → value of X: (SSS) → 3, (SSF) → 2, (SFS) → 2, …, (FFF) → 0. Random variable: • Assigns a real number to each outcome in S. • Denoted by X, Y, Z, etc., and its values by x, y, z, etc. • Its value depends on chance. • Its value becomes available once the experiment is completed and the outcome is known. • Probabilities of its values are determined by the probabilities of the outcomes in the sample space. Probability distribution of X = a table, formula, or graph that summarizes how the total probability of one is distributed over all the possible values of X. In the Bernoulli trials example, what is the distribution of X? Two types of random variables: Discrete rv = takes a finite or countable number of values • Number of jobs in a queue • Number of errors • Number of successes, etc. Continuous rv = takes all values in an interval – i.e., it has an uncountable number of values. • Execution time • Waiting time • Miles per gallon • Distance traveled, etc. Discrete random variables: X = a discrete rv.
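For the Bernoulli-trials question above, the distribution of X can be obtained by brute-force enumeration of the eight equally likely outcomes. The Python sketch below is an added illustration, not part of the original handout; the variable names are assumptions.

```python
from collections import Counter
from itertools import product

outcomes = list(product("SF", repeat=3))   # ('S','S','S'), ('S','S','F'), ..., ('F','F','F')
p_outcome = 1 / len(outcomes)              # independent trials, P(success) = 1/2 -> each outcome has probability 1/8

dist = Counter()
for outcome in outcomes:
    x = outcome.count("S")                 # value of X = number of successes for this outcome
    dist[x] += p_outcome

print(dict(sorted(dist.items())))          # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```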
- The Second Law of Thermodynamics Forbids Time Travel
Cosmology, 2014, Vol. 18, 212-222. Cosmology.com, 2014. The Second Law of Thermodynamics Forbids Time Travel. Marko Popovic, Department of Chemistry and Biochemistry, Brigham Young University, Provo, UT 84602, USA. Abstract: Four space-time coordinates define one thermodynamic parameter – volume. Cell/organism growth is one of the most fundamental properties of living creatures. The growth is characterized by an irreversible change of the volume of the cell/organism. This irreversible change of volume (growth of the cell/organism) makes the system irreversibly change its thermodynamic state. Irreversible change of the system's thermodynamic state makes return to the previous state, characterized by its state parameters, impossible. The impossibility of returning the system to the previous state leads to the conclusion that even if the arrow of time (as a parameter) were artificially turned back, it would not be possible to turn back the state of the organism and its surroundings. Irreversible change of the thermodynamic state of the organism is also a consequence of the accumulation of entropy during life. So even if we find a way to turn back the time arrow, it is impossible to turn back the state of the thermodynamic system (including the organism) because of the irreversibility of the thermodynamic/physiologic processes in it. Keywords: Time travel, Entropy, Living Organism, Surroundings, Irreversibility. 1. Introduction. The idea of time travel has fascinated humanity since ancient times and can be found in texts as old as the Mahabharata and the Talmud. Later it continued to be developed in literature (e.g. Dickens' “A Christmas Carol”, or Twain's “A Connecticut Yankee in King Arthur's Court”…).
- 1 Probabilities
1 Probabilities. 1.1 Experiments with randomness. We will use the term experiment in a very general way to refer to some process that produces a random outcome. Examples: (ask class for some first) Here are some discrete examples: • roll a die • flip a coin • flip a coin until we get heads. And here are some continuous examples: • height of a U of A student • random number in [0, 1] • the time it takes until a radioactive substance undergoes a decay. These examples share the following common features: There is a procedure or natural phenomenon called the experiment. It has a set of possible outcomes. There is a way to assign probabilities to sets of possible outcomes. We will call this a probability measure. 1.2 Outcomes and events. Definition 1. An experiment is a well-defined procedure or sequence of procedures that produces an outcome. The set of possible outcomes is called the sample space. We will typically denote an individual outcome by ω and the sample space by Ω. Definition 2. An event is a subset of the sample space. This definition will be changed when we come to the definition of a σ-field. The next thing to define is a probability measure. Before we can do this properly we need some more structure, so for now we just make an informal definition. A probability measure is a function on the collection of events that assigns a number between 0 and 1 to each event and satisfies certain properties. NB: A probability measure is not a function on Ω.
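The closing remark that a probability measure is a function on events rather than on Ω can be illustrated with the "flip a coin until we get heads" example above. In the sketch below, added here and not part of the notes, Ω is the number of flips needed, each outcome k has probability (1/2)^k, and the measure P acts on subsets of Ω; the function and event names are illustrative assumptions, and the infinite event is truncated for computation.

```python
def P(event) -> float:
    """Probability measure: defined on events (sets of outcomes), not on Omega itself."""
    return sum(0.5 ** k for k in event)    # P({k}) = (1/2)**k: first head on the k-th flip

first_head_within_three_flips = {1, 2, 3}
first_head_on_an_even_flip = set(range(2, 60, 2))   # truncated version of an infinite event; the tail is negligible

print(P(first_head_within_three_flips))   # 0.875
print(P(first_head_on_an_even_flip))      # ~0.3333, i.e. close to 1/3
```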
- Section J: Basic Probability Concepts
Section J. Basic Probability Concepts. Before we can begin to discuss inferential statistics, we need to discuss probability. Recall, inferential statistics deals with analyzing a sample from the population to draw conclusions about the population; therefore, since the data came from a sample, we can never be 100% certain the conclusion is correct. Therefore, probability is an integral part of inferential statistics and needs to be studied before starting the discussion on inferential statistics. The theoretical probability of an event is the proportion of times the event occurs in the long run, as a probability experiment is repeated over and over again. The Law of Large Numbers says that as a probability experiment is repeated again and again, the proportion of times that a given event occurs will approach its probability. A sample space contains all possible outcomes of a probability experiment. An event is an outcome or a collection of outcomes from a sample space. A probability model for a probability experiment consists of a sample space, along with a probability for each event. Note: If A denotes an event, then the probability of the event A is denoted P(A). Probability models with equally likely outcomes: If a sample space has n equally likely outcomes, and an event A has k outcomes, then P(A) = (number of outcomes in A) / (number of outcomes in the sample space) = k/n. The probability of an event is always between 0 and 1, inclusive. Important probability characteristics: 1) For any event A, 0 ≤ P(A) ≤ 1. 2) If A cannot occur, then P(A) = 0.
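As a worked instance of the equally-likely-outcomes formula, added here as an illustration rather than taken from Section J, consider the 36 equally likely outcomes of rolling two dice and the event A that the sum is 7; then P(A) = k/n = 6/36 = 1/6.

```python
sample_space = [(i, j) for i in range(1, 7) for j in range(1, 7)]   # 36 equally likely outcomes
A = [(i, j) for (i, j) in sample_space if i + j == 7]               # event: the two dice sum to 7

p_A = len(A) / len(sample_space)   # number of outcomes in A / number of outcomes in the sample space
print(p_A)                         # 0.1666..., i.e. k/n = 6/36 = 1/6, and 0 <= P(A) <= 1 as required
```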
- The Arrow of Time
The arrow of time. Paul Davies, Beyond Center for Fundamental Concepts in Science, Arizona State University, P.O. Box 871504, Tempe, AZ 85287-1504, USA. Volume 7, Summer 2014. Journal homepage: www.euresisjournal.org. Abstract: The arrow of time is often conflated with the popular but hopelessly muddled concept of the “flow” or “passage” of time. I argue that the latter is at best an illusion with its roots in neuroscience, at worst a meaningless concept. However, what is beyond dispute is that physical states of the universe evolve in time with an objective and readily-observable directionality. The ultimate origin of this asymmetry in time, which is most famously captured by the second law of thermodynamics and the irreversible rise of entropy, rests with cosmology and the state of the universe at its origin. I trace the various physical processes that contribute to the growth of entropy, and conclude that gravitation holds the key to providing a comprehensive explanation of the elusive arrow. 1. Time's arrow versus the flow of time. The subject of time's arrow is bedeviled by ambiguous or poor terminology and the conflation of concepts. Therefore I shall begin my essay by carefully defining terms. First, an uncontentious statement: the states of the physical universe are observed to be distributed asymmetrically with respect to the time dimension (see, for example, Refs. [1, 2, 3, 4]). A simple example is provided by an earthquake: the ground shakes and buildings fall down. We would not expect to see the reverse sequence, in which shaking ground results in the assembly of a building from a heap of rubble.