
Chapter 3

Introduction to stochastic processes

In this chapter we review the basic concepts of what a stochastic process is. Our aim is not to be rigorous on the mathematical side but rather to focus on the physical insights behind the concepts. The name "stochastic process" is usually associated with a trajectory in phase space which is random enough to demand a probabilistic description. A paradigmatic example is that of Brownian motion.

3.1 Brownian motion

The botanist Robert Brown discovered in 1827 that particles in pollen grains in suspension execute random movements, which he even interpreted initially as some sort of life. It is not so well known that L. Boltzmann knew as early as 1896 the reason for this erratic movement, when he wrote "... very small particles in a gas execute motions which result from the fact that the pressure on the surface of the particles may fluctuate". However, it was A. Einstein in 1905 who successfully introduced the first mathematical treatment of the erratic movement of the Brownian particles. Rather than focusing on the (complicated) trajectory of a single particle, Einstein introduced a probabilistic description valid for an ensemble of Brownian particles.

First, Einstein introduced the concept of a coarse-grained description, defined by a time $\tau$ such that different parts of the trajectory separated by a time $\tau$ or larger can be considered independent. No attempt is made to characterize the dynamics at a time scale smaller than this coarse-graining time $\tau$. Instead, we consider snapshots of the system taken at time intervals $\tau$ (see Figure 3.1).

The second concept, probabilistic in nature, introduced by Einstein is that of the probability density function $f(\vec{\Delta})$ for the three-dimensional displacement $\vec{\Delta} = (\Delta_x, \Delta_y, \Delta_z)$ traveled by the Brownian particle in a time interval $\tau$. One could assume, for instance, a Gaussian form for $f(\vec{\Delta})$, but this is not necessary.
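As a numerical illustration of this second concept (a sketch of ours, not from the text), we can sample the displacement $\vec{\Delta}$ between two snapshots from a particular symmetric pdf, here a Gaussian of assumed width $\sigma$, and check the two ensemble properties used below: zero mean displacement and a finite mean square displacement $\langle \vec{\Delta}^2 \rangle$.

```python
import numpy as np

# Einstein's coarse-grained picture: observe a Brownian trajectory only
# at snapshots separated by tau, and describe the 3d displacement
# Delta between snapshots by a pdf f(Delta).  We sample a Gaussian f
# (the text stresses this particular choice is not necessary; the width
# sigma is our assumption) and estimate <Delta> and <Delta^2>.

rng = np.random.default_rng(0)
n_particles = 100_000
sigma = 0.3                                  # width of f, assumed value

delta = rng.normal(0.0, sigma, size=(n_particles, 3))   # (Δx, Δy, Δz)

mean_disp = delta.mean(axis=0)               # close to (0, 0, 0): no preferred direction
mean_sq = (delta**2).sum(axis=1).mean()      # close to 3*sigma**2 = 0.27

print(mean_disp, mean_sq)
```

Any other symmetric, normalized choice of $f$ would do equally well; only the symmetry condition and the finiteness of $\langle \vec{\Delta}^2 \rangle$ matter for what follows.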
In fact, the only assumption one needs about the function $f(\vec{\Delta})$ (besides the general conditions of non-negativity and normalization) comes from the fact that the collisions of the fluid molecules with the Brownian particle occur with the same probability in any direction¹. The absence of preferred directions translates into a symmetry condition for $f(\vec{\Delta})$:

    $f(-\vec{\Delta}) = f(\vec{\Delta})$.    (3.1)

[Figure 3.1: Schematic representation of the movement of a Brownian particle.]

The third step in this description is to consider an ensemble of $N$ Brownian particles in a large enough system. We also focus on spatial scales which are much larger than the size of a Brownian particle, so that we can define a density of particles $n(\vec{x}, t)$ such that $n(\vec{x}, t)\,d\vec{x}$ is the number of particles in the interval $(\vec{x}, \vec{x} + d\vec{x})$ at time $t$. From the assumption that the parts of the trajectories separated by a time interval $\tau$ are statistically independent, it follows that the number of particles at location $\vec{x}$ at time $t + \tau$ is given by the number of particles at location $\vec{x} - \vec{\Delta}$ at time $t$, multiplied by the probability $f(\vec{\Delta})$ that the particle jumps from $\vec{x} - \vec{\Delta}$ to $\vec{x}$, integrated over all possible values of $\vec{\Delta}$:

    $n(\vec{x}, t + \tau) = \int_{\mathbb{R}^3} n(\vec{x} - \vec{\Delta}, t)\, f(\vec{\Delta})\, d\vec{\Delta}$.    (3.2)

This is the basic evolution equation for the number density $n(\vec{x}, t)$. From the physical point of view it is a continuity equation, whose interpretation is that Brownian particles can neither be created nor destroyed by the collisions with the fluid molecules. By Taylor expanding the above expression and making use of the symmetry relation (3.1), one gets the diffusion equation:

    $\frac{\partial n}{\partial t} = D \nabla^2 n$,    (3.3)

¹ We also disregard the effect of gravity on the Brownian particle, which would lead to a preferred direction in the movement.
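The diffusive character of the evolution equation (3.2) can be checked numerically. The following sketch (our own illustration, in one dimension for simplicity) iterates (3.2) as a discrete convolution of the density with a symmetric jump pdf and verifies that the variance of the density grows linearly, by $\langle \Delta^2 \rangle$ per step; the particular uniform choice of $f$ is an assumption.

```python
import numpy as np

# 1d check of the evolution equation (3.2): iterating
#   n(x, t+tau) = ∫ n(x-Δ, t) f(Δ) dΔ
# spreads an initially localized density diffusively, the variance
# growing by <Δ²> per step (i.e. 2·D·tau, with D = <Δ²>/(2·tau)).

x = np.linspace(-10, 10, 2001)            # odd point count keeps convolutions centered
dx = x[1] - x[0]

# symmetric jump pdf f(Δ): uniform on [-a, a] (any symmetric choice works)
a = 0.2
f = np.where(np.abs(x) <= a + dx / 2, 1.0, 0.0)
f /= np.sum(f) * dx                       # enforce discrete normalization
var_jump = np.sum(x**2 * f) * dx          # discrete <Δ²>

n = np.zeros_like(x)                      # all particles start at the origin:
n[len(x) // 2] = 1.0 / dx                 # discrete approximation of δ(x)

n_steps = 100
for step in range(n_steps):
    n = np.convolve(n, f, mode="same") * dx   # apply (3.2) once

var = np.sum(x**2 * n) * dx               # variance of the final density
print(var, n_steps * var_jump)            # the two numbers should agree
```

Because the variance of a convolution of zero-mean distributions is additive, the agreement is essentially exact; the same linear growth underlies the identification of the diffusion constant in (3.4) below.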
where the diffusion constant $D$ is given in terms of the pdf $f(\vec{\Delta})$ by:

    $D = \frac{1}{2\tau} \int_{-\infty}^{\infty} \vec{\Delta}^2 f(\vec{\Delta})\, d\vec{\Delta} = \frac{\langle \vec{\Delta}^2 \rangle}{2\tau}$.    (3.4)

If the initial condition is that all $N$ particles are located at the origin, $n(\vec{x}, t = 0) = N\delta(\vec{x})$, the solution of the diffusion equation is:

    $n(\vec{x}, t) = \frac{N}{(4\pi D t)^{3/2}}\, e^{-\vec{x}^2/4Dt}$,    (3.5)

from which it follows that the average position of the Brownian particle is $\langle \vec{x}(t) \rangle = 0$ and that the average square position increases linearly with time, namely:

    $\langle \vec{x}(t)^2 \rangle = 6Dt$.    (3.6)

This prediction has been successfully confirmed in experiments and contributed to the acceptance of the atomic theory.

However successful Einstein's approach was, it is very phenomenological and it cannot yield, for instance, an explicit expression that allows the calculation of the diffusion constant in terms of microscopic quantities. Langevin (1908) initiated a different approach which, in some way, can be considered complementary to the previous one. In his approach, Langevin focused on the trajectory of a single Brownian particle and wrote down Newton's equation, force = mass × acceleration. The trajectory of the Brownian particle is highly erratic and therefore its description demands a peculiar kind of force. Langevin considered two types of forces acting on the Brownian particle: the usual friction force which, according to Stokes' law, is proportional to the velocity, and a sort of "fluctuating" force $\vec{\xi}(t)$ which represents the erratic force coming from the action of the fluid molecules on the Brownian particle. The equation of motion then becomes:

    $m \frac{d\vec{v}}{dt} = -6\pi\eta a \vec{v} + \vec{\xi}$,    (3.7)

where $\eta$ is the viscosity coefficient and $a$ is the radius of the Brownian particle (which is assumed to be spherical). Multiplying both sides of eq. (3.7) by $\vec{x}$ one gets:

    $\frac{m}{2} \frac{d^2 \vec{x}^2}{dt^2} = m \left( \frac{d\vec{x}}{dt} \right)^2 - 3\pi a \eta \frac{d \vec{x}^2}{dt} + \vec{x} \cdot \vec{\xi}$.    (3.8)

Langevin made two assumptions about the fluctuating force $\vec{\xi}(t)$: that it has mean 0 (collisions do not push the Brownian particle in any preferred direction) and that it is uncorrelated with the actual position of the Brownian particle (the action of the fluid molecules on the Brownian particle is the same no matter the location of the Brownian particle):

    $\langle \vec{\xi}(t) \rangle = 0$,
    $\langle \vec{x} \cdot \vec{\xi} \rangle = \langle \vec{x} \rangle \cdot \langle \vec{\xi} \rangle = 0$.    (3.9)

Taking averages in eq. (3.8) with respect to all realizations of the random force $\vec{\xi}(t)$ and using the previous conditions on $\vec{\xi}(t)$ one gets:

    $\frac{m}{2} \frac{d^2 \langle \vec{x}^2 \rangle}{dt^2} = m \langle \vec{v}^2 \rangle - 3\pi a \eta \frac{d \langle \vec{x}^2 \rangle}{dt}$,    (3.10)

which is an equation for the average square position of the Brownian particle. Langevin assumed a regime in which thermal equilibrium between the Brownian particle and the surrounding fluid has been reached. In particular, this implies that, according to the equipartition theorem, the average kinetic energy of the Brownian particle is $\langle m\vec{v}^2/2 \rangle = 3kT/2$ ($k$ is Boltzmann's constant and $T$ is the fluid temperature). One can now solve eq. (3.10) and find that, after some transient time, the asymptotic mean square displacement is given by:

    $\langle \vec{x}^2 \rangle = \frac{kT}{\pi \eta a}\, t$.    (3.11)

This is nothing but Einstein's diffusion law, but now we have an explicit expression for the diffusion coefficient in terms of other macroscopic variables:

    $D = \frac{kT}{6\pi\eta a}$.    (3.12)

Langevin's random force $\vec{\xi}(t)$ is an example of a stochastic process. It is time to proceed to a more precise definition of what a stochastic process is. The natural machinery is that of probability theory.

3.2 Stochastic Processes

In Chapter 1 we introduced the concept of a random variable $\hat{x}$ resulting from a probabilistic experiment. We now define a stochastic process as a family $\hat{x}(t)$ of random variables depending on some continuous real parameter $t$.
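To get a feeling for the orders of magnitude in the Stokes-Einstein result (3.12), one can evaluate it for an assumed micron-sized bead in water at room temperature (the parameter values below are ours, chosen for illustration, not taken from the text):

```python
import math

# Numerical illustration of the Stokes-Einstein relation (3.12),
#   D = kT / (6*pi*eta*a),
# for an assumed 1-micron-diameter sphere in water at room temperature.

k = 1.380649e-23     # Boltzmann constant, J/K
T = 298.0            # temperature, K (assumed)
eta = 1.0e-3         # viscosity of water, Pa*s (assumed)
a = 0.5e-6           # particle radius, m (assumed)

D = k * T / (6 * math.pi * eta * a)      # diffusion coefficient, m^2/s
rms = math.sqrt(6 * D * 1.0)             # rms displacement after t = 1 s, from (3.6)

print(f"D   = {D:.3e} m^2/s")    # ~4.4e-13 m^2/s
print(f"rms = {rms:.3e} m")      # ~1.6e-6 m
```

A micron-sized particle thus wanders about one particle diameter per second, which is exactly the slow, jittery motion Brown could resolve under a microscope.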
In most applications, $t$ is a physical time and the stochastic process can be thought of as performing multiple probabilistic experiments, one at each time instant. The trajectory followed by the system depends on the outcome of each probabilistic experiment. As a consequence, the knowledge of the initial condition $x_0$ at time $t_0$ is not enough to determine the position in phase space of the system at a later time $t_1$. Instead, the trajectory, and therefore the final state of the system, acquires a probabilistic nature. To fully determine the final state, it is also necessary to know the outcome of all the successive probabilistic experiments between the initial time $t_0$ and the final time $t_1$. In fact, each possible set of successive outcomes determines a possible trajectory for the system, all starting at $x_0$ at time $t_0$ but ending at different locations at time $t_1$. The stochastic process can be seen as the collection of all these possible trajectories.

Probably the most well-known example of a stochastic process is that of the random walk. The probabilistic experiment is now a series of binary results representing, for instance, the outcome of repeatedly tossing a coin:

    (0, 0, 1, 1, 0, 1, 0, 0, 0, 1, 1, 1, 1, 0, 0, 1, ...)

[Figure 3.2: Example of a random walk trajectory. We have taken $\tau = a = 1$ and plotted the resulting trajectory after a large number, $10^5$, of steps. In the inset, we see the fine detail with the discrete jumps occurring at times multiples of $\tau$.]
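A trajectory like the one in Figure 3.2 can be generated in a few lines. The sketch below assumes the natural reading of the construction: at each time step of duration $\tau$ the walker jumps up by $a$ if the coin shows 1 and down by $a$ if it shows 0, with $\tau = a = 1$ as in the figure.

```python
import random

# A minimal sketch of the random walk behind Figure 3.2: at each step
# of duration tau the walker moves +a (coin shows 1) or -a (coin shows
# 0).  We take tau = a = 1; the specific update rule is our assumption
# about the construction described in the text.

random.seed(1)            # fixed seed, so the trajectory is reproducible
a, n_steps = 1, 10**5

x = 0
trajectory = [x]
for _ in range(n_steps):
    coin = random.randint(0, 1)        # the binary probabilistic experiment
    x += a if coin == 1 else -a        # jump up or down by a
    trajectory.append(x)

print(trajectory[-1], len(trajectory))
```

Rerunning with a different seed gives a different member of the ensemble of trajectories; all start at $x_0 = 0$ but end at different locations, which is precisely the picture of a stochastic process described above.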