Poisson Line Processes and Spatial Transportation Networks


Wilfrid S. Kendall ([email protected])
Department of Statistics, University of Warwick
21st-22nd June 2018
Support: Turing, EPSRC K031066, K013939, INI "Random Geometry".

Syllabus of lectures

Part 1. Stochastic geometry: a brief introduction. The basic building blocks of stochastic geometry.
Part 2. Poisson line processes and network efficiency. Curious results arising from careful investigation of measures of effectiveness of networks.
Part 3. Traffic flow in a Poissonian city. Using Poisson line processes to model traffic intensity in a random city.
Part 4. Scale-invariant random spatial networks (SIRSN). Novel random metric spaces arising from the methods used to analyze effectiveness of networks.
Part 5. Random flights on the SIRSN. How can we move about on a SIRSN modelled on Poisson line processes?

Useful references for stochastic geometry

Kingman (1993): concise and elegant introduction to Poisson processes.
Stoyan, WSK, and Mecke (1995): now in an extended 3rd edition (Chiu, Stoyan, WSK, and Mecke, 2013).
Schneider and Weil (2008): magisterial introduction to the links between stochastic and integral geometry.
WSK and Molchanov (2010): collection of essays on developments in stochastic geometry.
Last and Penrose (2017): more on Poisson point processes.
Molchanov (1996b): Boolean models, a way to build random sets from Poisson point processes.

A short chronology of stochastic geometry (the study of random pattern)

1. Georges-Louis Leclerc, Comte de Buffon (7 September 1707 – 16 April 1788);
2. Władysław Hugo Dionizy Steinhaus (14 January 1887 – 25 February 1972);
3. Luís Antoni Santaló Sors (9 October 1911 – 22 November 2001);
4. Rollo Davidson (8 October 1944 – 29 July 1970);
5. D.G. Kendall and K. Krickeberg coined the phrase "stochastic geometry" while preparing an Oberwolfach workshop in 1969 (see also Frisch and Hammersley, 1963);
6. mathematical morphology (G. Matheron and J. Serra);
7. point processes arising from queueing theory (J. Kerstan, K. Matthes, ...);
8. combinatorial geometry (R.V. Ambartzumian).

Lecture I: Poisson Point Processes and Friends

Outline: a quiz about random pattern; definitions of Poisson point processes; Palm theory for Poisson point processes; examples of Poisson point processes; marked Poisson point processes; the Boolean model.

Poisson point processes

Humans have evolved to recognize patterns ... we are less good at recognizing complete randomness.

Puzzle

Using $10^6$ years of accumulated evolutionary programming ... which of these patterns fail to be completely random?

Constructive definition of Poisson point process

Definition: unit intensity Poisson point pattern in a unit square.
1. Draw a Poisson(1) random variable N;
2. Conditional on N = n, plant n independent random points uniformly in [0, 1]².

This can be viewed as a binomial point process of N points, with the total number of points N being Poisson-randomized.

Replace the unit square [0, 1]² by a region R, and Poisson(1) by Poisson(area(R)), to produce a unit intensity Poisson point pattern in a more general planar region R.
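A minimal simulation sketch of this constructive definition, assuming Python with NumPy (the function and variable names are illustrative, not from the lectures); the intensity argument lam anticipates the generalization to intensity λ described just below.

```python
import numpy as np

def poisson_point_pattern(width, height, lam=1.0, rng=None):
    """Constructive definition: draw N ~ Poisson(lam * area), then place N
    independent uniformly distributed points in the rectangle
    [0, width] x [0, height] (a Poisson-randomized binomial point process)."""
    rng = np.random.default_rng() if rng is None else rng
    n = rng.poisson(lam * width * height)     # total number of points N
    xs = rng.uniform(0.0, width, size=n)      # uniform x-coordinates
    ys = rng.uniform(0.0, height, size=n)     # uniform y-coordinates
    return np.column_stack([xs, ys])

# Unit intensity Poisson point pattern in the unit square:
points = poisson_point_pattern(1.0, 1.0)
```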
If R is of infinite area, break it up into disjoint sub-regions of finite area and work independently in each sub-region. Replace Poisson(area(R)) by Poisson(λ × area(R)) to produce a Poisson point pattern of intensity λ in a general region R.

Axiomatic definition of Poisson point process

Definition: Poisson point process (alternative abstract definition). A (stationary) point pattern is Poisson ("truly random") if:
1. the number of points in any prescribed region has a Poisson distribution, with mean given by an intensity measure ν (NB: ν is diffuse), e.g. ν(region) = λ × (area of region);
2. the numbers of points in disjoint regions are independent.

Basic construction: each very small region independently has a very small chance of holding a point.

Remark (a powerful theoretical result): properties 1 and 2 are both automatically implied if the void probability, the probability that a measurable region fails to hold any points at all, equals exp(−ν(region)); this is a consequence of a result of Choquet.

Computations: $P[N(A) = k] = \frac{(\nu(A))^k}{k!} e^{-\nu(A)}$ and $E[N(A)] = \nu(A)$.

General point processes

1. We can think of a point process Φ as a random measure taking non-negative integer values (attention focussed on Φ(A), the number of points in a Borel set A);
2. or (equivalently) as a random closed set which is locally finite (attention focussed on whether the set hits compact test sets K, that is, whether Φ(K) > 0).
3. The variety of general point processes is huge. Nevertheless, the distribution of a point process is determined by its system of void probabilities $P[\Phi(K) = 0]$ for all compact K.
4. Indeed this holds more generally for random closed sets Ξ: the distribution of Ξ is determined by the system of void probabilities $P[\Xi \cap K = \emptyset]$ for all compact K.

Much more theory: Kingman (1993), Chiu et al. (2013).

Palm distributions (I)

Consider a general point process Φ on R² with (σ-finite) intensity measure ν. How can we condition on a specified point x ∈ Φ?
1. Intensity measure: $\nu(B) = E[\#\{x \in B \cap \Phi\}] = E\left[\sum_{x \in \Phi} I[x \in B]\right]$;
2. let $\mathcal{Y}$ be some (measurable) set of possible point patterns φ (e.g. $\mathcal{Y}_K = \{\varphi : \varphi(K) = 0\}$ for a given compact K); the Campbell measure is $\mathcal{C}(B \times \mathcal{Y}) = E\left[\sum_{x \in B \cap \Phi} I[\Phi \in \mathcal{Y}]\right]$;
3. Radon-Nikodym theorem: $\mathcal{C}(B \times \mathcal{Y}) = \int_B P_x(\mathcal{Y}) \, \nu(\mathrm{d}x)$;
4. using the theory of regular conditional probabilities, $P_x(\cdot)$ can be made into a probability measure;
5. we interpret $P_x(\cdot)$ as the Palm distribution at x;
6. the reduced Palm distribution is obtained by deleting the point x from the Palm distribution.

Palm distributions (II)

Definition (Palm distribution of a general point process). If Φ is a general point process on R² with σ-finite intensity measure ν, then its Palm distribution $P_x$ at x ∈ R² satisfies
$$E\left[\sum_{x \in \Phi} h(x, \Phi)\right] = \int_{\mathbb{R}^2} \int h(x, \varphi) \, P_x(\mathrm{d}\varphi) \, \nu(\mathrm{d}x)$$
for all bounded measurable h(x, φ), where x ∈ R² and φ ranges over locally finite point patterns on R².

Palm distribution of a Poisson process

Theorem (Slivnyak). The Palm distribution of a planar Poisson point process Φ at x ∈ R² is the superposition $\delta_x * P(\mathrm{d}\varphi)$: namely, the distribution of Φ with an added point at x.

Proof. Consider an arbitrary compact set K ⊂ R² and a bounded Borel set B ⊂ R². Let $V_K$ be the set of patterns placing no points in K, and consider $\mathcal{C}(B \times V_K) = E\left[\sum_{x \in B \cap \Phi} I[\Phi \cap K = \emptyset]\right]$. Then $\mathcal{C}((B \cap K) \times V_K) = 0$, while $\mathcal{C}((B \setminus K) \times V_K) = \int_{B \setminus K} P(V_K) \, \nu(\mathrm{d}x)$, where P is the original Poisson distribution (by independence of the Poisson process in disjoint regions). This agrees with the result obtained by replacing $\mathcal{C}(\mathrm{d}x, \mathrm{d}\xi)$ by $\nu(\mathrm{d}x) \, (\delta_x * P)(\mathrm{d}\xi)$. Choquet's capacitability theorem then implies that the two measures agree.
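One concrete consequence of Slivnyak's theorem: for a Poisson process of intensity λ, the nearest-neighbour distance distribution D(r), seen from a typical point of the pattern, coincides with the first-contact distribution H(r) = 1 − exp(−λπr²), seen from a fixed location. The following rough Monte Carlo sketch illustrates this (assuming Python with NumPy; the window size, intensity, test radius, and the central sub-window used to avoid boundary effects are illustrative choices, and the simple estimators ignore the usual edge corrections).

```python
import numpy as np

rng = np.random.default_rng(1)
lam, width, height = 5.0, 10.0, 10.0          # intensity and window size (illustrative)
centre = np.array([width / 2.0, height / 2.0])
r = 0.3                                        # test radius
n_rep = 500

H_hits = 0                # replicates with a point within distance r of the fixed centre
D_hits, D_count = 0, 0    # interior points having another point within distance r
for _ in range(n_rep):
    n = rng.poisson(lam * width * height)
    pts = np.column_stack([rng.uniform(0, width, n), rng.uniform(0, height, n)])
    if n < 2:
        continue
    # Empirical H(r): first-contact distance from the fixed centre location.
    H_hits += np.min(np.linalg.norm(pts - centre, axis=1)) <= r
    # Empirical D(r): nearest-neighbour distances of points in a central sub-window,
    # so that true nearest neighbours are not hidden beyond the window boundary.
    interior = pts[np.all(np.abs(pts - centre) <= 2.0, axis=1)]
    for p in interior:
        d = np.linalg.norm(pts - p, axis=1)
        D_hits += np.partition(d, 1)[1] <= r   # second-smallest distance = nearest *other* point
        D_count += 1

print("empirical H(r):             ", H_hits / n_rep)
print("empirical D(r):             ", D_hits / D_count)
print("theory 1 - exp(-lam*pi*r^2):", 1.0 - np.exp(-lam * np.pi * r**2))
```

For patterns exhibiting clustering or inhibition the two empirical estimates separate, which is exactly what the H(r) and D(r) tests described below exploit.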
Application: when is a point process Poisson?

Is the randomness of a pattern "unstructured"? Use statistical tests for the Poisson distribution of point counts, and compare Ripley (1977): (a) the first-contact distribution H(r) of the distance from a fixed location to the nearest point, and (b) the nearest-neighbour distribution D(r). Deciding whether a pattern is completely random is relatively easy (the answer is usually no!); deciding what sort of pattern it is can be hard.

[Figure: earthquake locations in Western Australia.]

Example: alpha particles

Centres of traces of α-particles on a detector in a 60 × 60 square (unit of 2 µm).

[Figure: locations of centres of traces (Stoyan et al., 1995, Figure 2.5ff).]

Mephistopheles and the bus company

Classic bus paradox: for Poisson bus arrivals, the mean length of the interval between buses that contains a given fixed time is twice the mean length of a typical interval between bus arrivals. This follows immediately from Slivnyak's theorem! For renewal point processes the picture can be stranger.
1. The aspirant devil Mephistopheles is forced to take a job with the Hades Regional Bus Company.
2. He is responsible for operation of the No. 12 bus service between the city centre and the university campus.
3. The contract requires him to deliver a mean time between bus arrivals of 10 minutes.
4. How bad can he make the service while still (statistically speaking) fulfilling the contract?
In mean-value terms, Mephistopheles can make matters infinitely bad ...

Marked point processes

Many constructions and applications use the notion that points of the pattern carry "marks".
1. In applications the points of a pattern are often accompanied by extra measurements (diameters of trees, yields of ore deposits, ...).
2. A marked point process on R² can be viewed as a point process on R² × M, where M is the mark space.
3. Under stationarity we can factorize the intensity measure as the product of λ × (Lebesgue measure on R²) with a probability distribution on the mark space M.

Palm distribution for marks (sketch)

Consider an independently marked Poisson point process Ψ, with typical marked point [x, m] and independent mark distribution Uniform[0, 1]. We want to impose (Palm) conditioning by requiring a point to lie at x, but without conditioning the corresponding mark m. The distribution of Ψ can be obtained by:
1. constructing the un-marked point process Φ from Ψ;
2. attaching to each point of Φ an independent Uniform[0, 1] mark.
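A small simulation sketch of this independently marked Poisson process Ψ, assuming Python with NumPy (function and parameter names are illustrative); the closing remark about thinning is a standard property of independent marking rather than something stated on the slides.

```python
import numpy as np

def marked_poisson_pattern(width, height, lam, rng=None):
    """Independently marked Poisson point process on [0, width] x [0, height]:
    each point of a Poisson pattern of intensity lam carries an independent
    Uniform[0, 1] mark, giving marked points [x, m] in R^2 x [0, 1]."""
    rng = np.random.default_rng() if rng is None else rng
    n = rng.poisson(lam * width * height)
    xy = np.column_stack([rng.uniform(0, width, n), rng.uniform(0, height, n)])
    marks = rng.uniform(0.0, 1.0, size=n)   # independent Uniform[0, 1] marks
    return xy, marks

xy, marks = marked_poisson_pattern(10.0, 10.0, lam=0.5, rng=np.random.default_rng(0))
phi = xy                      # the un-marked point process Phi: simply discard the marks
kept = xy[marks < 0.3]        # thinning by marks: retaining points with mark < p
                              # yields a Poisson process of intensity p * lam
```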