The Pennsylvania State University, The Graduate School, Eberly College of Science


STUDIES ON THE LOCAL TIMES OF DISCRETE-TIME STOCHASTIC PROCESSES

A Dissertation in Mathematics by Xiaofei Zheng, © 2017 Xiaofei Zheng. Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy, August 2017.

The dissertation of Xiaofei Zheng was reviewed and approved* by the following:

Manfred Denker, Professor of Mathematics, Dissertation Adviser, Chair of Committee
Alexei Novikov, Professor of Mathematics
Anna Mazzucato, Professor of Mathematics
Zhibiao Zhao, Associate Professor of Statistics
Svetlana Katok, Professor of Mathematics, Director of Graduate Studies

*Signatures are on file in the Graduate School.

Abstract

This dissertation investigates the limit behaviors of the local times $\ell(n,x)$ of the partial sums $\{S_n\}$ of stationary processes $\{\phi \circ T^n\}$, where
$$\ell(n,x) = \sum_{i=1}^{n} \mathbf{1}_{\{S_i = x\}}.$$
Under the conditional local limit theorem assumption
$$B_n \, P\bigl(S_n = k_n \mid T^n(\cdot) = \omega\bigr) \to g(\kappa) \quad \text{if } \frac{k_n}{B_n} \to \kappa, \qquad P\text{-a.s.},$$
we show that the limiting distribution of the local time is the Mittag-Leffler distribution when the state space of the stationary process is $\mathbb{Z}$. The method comes from the infinite ergodic theory of dynamical systems. We also prove that discrete-time fractional Brownian motion (dfBm) admits a conditional local limit theorem, and that the local time of dfBm is closely related to, but different from, the Mittag-Leffler distribution. Finally, we prove that the local time of certain stationary processes satisfies an almost sure central limit theorem (ASCLT) under the additional assumption that the characteristic operator has a spectral gap.
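As a quick numerical illustration of the discrete local time defined in the abstract (this sketch is not part of the dissertation; the walk length, number of paths, and random seed are arbitrary choices), the following Python snippet computes $\ell(n,0)$ for a simple symmetric random walk on $\mathbb{Z}$. In this classical special case $\ell(n,0)/\sqrt{n}$ converges in law to $|N(0,1)|$, which coincides, up to a scale constant, with the Mittag-Leffler law of index $1/2$.

```python
# Illustrative sketch (not from the dissertation): the discrete local time
# l(n, x) = sum_{i=1}^n 1{S_i = x} of a simple symmetric random walk on Z.
# Classical fact: l(n, 0)/sqrt(n) converges in law to |N(0,1)|, which equals
# the Mittag-Leffler law of index 1/2 up to a scale constant.
import numpy as np

rng = np.random.default_rng(0)
n_steps, n_paths = 10_000, 1_000          # illustrative sizes

steps = rng.choice([-1, 1], size=(n_paths, n_steps))
S = np.cumsum(steps, axis=1)              # S_1, ..., S_n for each path

local_time_0 = (S == 0).sum(axis=1)       # l(n, 0) = number of visits to 0
scaled = local_time_0 / np.sqrt(n_steps)

print("sample mean of l(n,0)/sqrt(n):", round(scaled.mean(), 3))
print("E|N(0,1)| = sqrt(2/pi)       :", round(np.sqrt(2 / np.pi), 3))
```

With these sizes the printed sample mean typically lands within a few percent of $\sqrt{2/\pi} \approx 0.798$.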
Table of Contents

Acknowledgments

Chapter 1 Introduction and Overview
  1.0.1 Brownian local time
  1.0.2 Local time of discrete-time stochastic processes
  1.0.3 Connection between the Brownian local time and the local times of discrete-time processes

Chapter 2 Local Limit Theorems
  2.1 Motivation
  2.2 Local limit theorems for independent and identically distributed random variables
  2.3 Local limit theorems for Markov chains
  2.4 Conditional local limit theorems for stationary processes
  2.5 Conditional local limit theorem for discrete-time fractional Brownian motion
    2.5.1 Proof of the conditional local limit theorem
    2.5.2 Estimate of the variance
    2.5.3 Estimate of the mean

Chapter 3 Limiting Distributions of Local Times
  3.1 Local times of random walks
  3.2 Occupation times of Markov chains
  3.3 Ergodic sums of infinite measure preserving transformations
  3.4 Asymptotic distribution of the local times $\ell_n$ of stationary processes with conditional local limit theorems
  3.5 Limit theorems of local times of discrete-time fractional Brownian motion
    3.5.1 Occupation times of discrete-time fractional Brownian motions
    3.5.2 Occupation times of continuous fractional Brownian motions

Chapter 4 Almost Sure Central Limit Theorems
  4.1 Almost sure central limit theorems for local times of random walks
  4.2 Almost sure central limit theorem for stationary processes
  4.3 Proof of almost sure central limit theorem (ASCLT)
    4.3.1 Proof of Theorem 4.2.2
    4.3.2 Proof of Proposition 4.3.1
  4.4 Transfer operators
  4.5 Bounds of local times of stationary processes

Chapter 5 Conclusion and Open Questions

Bibliography

Acknowledgments

Over the past five years I have received support and encouragement from a great number of individuals. I must first thank my adviser, Professor Manfred Denker, for his continuous guidance, endless encouragement and generous help during my graduate study and research. This dissertation could not have been finished without his advice and support. His guidance and friendship have made my graduate study a thoughtful and rewarding journey. I would also like to thank my dissertation committee members, Alexei Novikov, Anna Mazzucato and Zhibiao Zhao, for generously offering their time, insightful comments and support. I learned fractional Brownian motion from Professor Novikov and benefited a lot from his precious guidance and endless patience. I owe my thanks to Professor Mazzucato for her invaluable advice as my mentor when I first came to Penn State. I am grateful to Professor Zhao for his valuable time. I offer my thanks to Professor Svetlana Katok for providing me the opportunity to study in the Ph.D. program. I also thank the staff of the Department of Mathematics for their kind assistance. I am most grateful and indebted to my parents and the rest of my family for their unconditional love. Lastly, I must thank Changguang Dong for his unwavering love, patience and support. There are certainly many others who deserve mentioning, to whom I offer a simple message: Thank You!

Chapter 1 Introduction and Overview

The local time of a continuous-time process is a stochastic process associated with an underlying stochastic process, such as a Brownian motion, a Markov process or a diffusion process, that characterizes the amount of time a particle has spent at a given level. It provides a very fine description of the sample paths of the underlying process. The local time of a discrete-time process, in contrast, measures how often a state is visited and is a refinement of the notion of recurrence. The local times in these two cases show many similar properties [2] and are closely connected by the invariance principle.

1.0.1 Brownian local time

The notion of the local time of a Brownian motion was first introduced by Lévy in 1948 [40]. His contributions to the deep properties of the local time of Brownian motion laid the foundation of the theory of local times of stochastic processes. The theory was later developed further by Trotter, Knight, Ray, Itô, McKean and others.

Let $\{W(s), s \geq 0\}$ be a one-dimensional Brownian motion. The occupation time of a set $A \subset \mathbb{R}$ at time $t$ is defined to be
$$\mu_t(A, \omega) = \int_0^t \mathbf{1}_{\{W(s,\omega) \in A\}}\, ds,$$
which is a random measure on $(\mathbb{R}, \mathcal{B}(\mathbb{R}))$. Lévy (1948, [40]) proved that for almost all $\omega$ and for any $t \geq 0$, $\mu_t$ is absolutely continuous with respect to Lebesgue measure, so the Radon-Nikodym derivative $L_{t,\omega}$ exists and is Lebesgue-almost everywhere unique:
$$\mu_t(A, \omega) = \int_A L_{t,\omega}(x)\, dx.$$
Trotter (1958, [61]) proved that for almost all $\omega \in \Omega$ there exists a function $L(t, x, \omega)$, continuous in $(t, x) \in [0, \infty) \times \mathbb{R}$, such that
$$\mu_t(A, \omega) = \int_A L(t, x, \omega)\, dx.$$
From now on we use $L(t, x)$, the jointly continuous version of the Brownian local time, and we make the following remarks on notation:

1. $\{L(t, x)\}_{t \geq 0,\, x \in \mathbb{R}}$ is called the Brownian local time.
2. For fixed $x$, $\{L(t, x)\}_{t \geq 0}$ is called the Brownian local time at level $x$.
3. For fixed $t$, $\{L(t, x)\}_{x \in \mathbb{R}}$ is the Brownian local time at time $t$.
4. For fixed $x$ and $t$, $L(t, x)$ is a random variable.
5. When $x = 0$, we write $L(t)$ for $L(t, 0)$ for short.
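To make the occupation-measure definition above concrete, here is a small simulation sketch (again illustrative rather than part of the dissertation; the step size, the bandwidth $h$, and the comparison with the standard value $E[L(t,0)] = \sqrt{2t/\pi}$ are choices and facts added for illustration). It approximates $L(t,0)$ by $\mu_t((-h,h))/(2h)$ for a small $h$.

```python
# Illustrative sketch: the occupation measure mu_t(A) = int_0^t 1{W(s) in A} ds
# of a discretized Brownian path, and the local time at 0 as its density,
# approximated by L(t,0) ~ mu_t((-h, h)) / (2h) for a small bandwidth h.
# The Monte Carlo mean is compared with the standard value E[L(t,0)] = sqrt(2t/pi).
import numpy as np

rng = np.random.default_rng(1)
t, n_steps, n_paths, h = 1.0, 10_000, 1_000, 0.02   # illustrative choices
dt = t / n_steps

# Brownian paths via cumulative sums of independent N(0, dt) increments.
W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)

occupation = (np.abs(W) < h).sum(axis=1) * dt       # mu_t((-h, h)), path by path
L_estimate = occupation / (2 * h)                   # density estimate at x = 0

print("Monte Carlo mean of the L(t,0) estimate:", round(L_estimate.mean(), 3))
print("E[L(t,0)] = sqrt(2t/pi)                :", round(np.sqrt(2 * t / np.pi), 3))
```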
So $L(t, x)$ can be studied as a function of $t$ or of $x$. Interesting questions about $L(t, x)$, such as its exact distribution, its limiting distribution as $t \to \infty$ when $x$ is fixed, and the fluctuations of the Brownian local time, are well studied. We recall some striking results related to the problems we deal with in this dissertation.

As a function of $t$, the distribution of the Brownian local time at level $0$ is given by the following theorem.

Theorem 1.0.1 (Lévy identity, 1948). The processes $\{(|W(t)|, L(t, 0)) : t \geq 0\}$ and $\{(M(t) - W(t), M(t)) : t \geq 0\}$ have the same distribution, where $M(t) = \max\{W(s) : s \in [0, t]\}$.

Lévy (1948) proved the theorem by showing that $\{M(t) - W(t) : t \geq 0\}$ is a reflected Brownian motion. In [46], the theorem is proved by first defining the Brownian local time through the number of downcrossings of a Brownian motion and then using random walks embedded into Brownian motions. The local time from this definition can be shown to be the density of the occupation measure.

The importance of the Brownian local time also lies in its deep connection with Itô's formula.

Theorem 1.0.2 (Tanaka's formula).
$$W(t)^+ = \int_0^t \mathbf{1}_{\{W(s) > 0\}}\, dW(s) + \frac{1}{2} L(t)$$
and
$$|W(t)| = \int_0^t \operatorname{sgn}(W(s))\, dW(s) + L(t),$$
where $\operatorname{sgn}$ denotes the sign function
$$\operatorname{sgn}(x) = \begin{cases} +1, & x > 0, \\ -1, & x \leq 0. \end{cases}$$
The formula remains true when $W(t)$ is replaced by a continuous semimartingale.

Tanaka's formula is the explicit Doob-Meyer decomposition of the submartingale $|W(t)|$ into a martingale part and a continuous increasing process (the local time). Tanaka's formula is generalized by the Itô-Tanaka formula, which is also an extension of Itô's formula to convex functions.

Theorem 1.0.3 (Itô-Tanaka formula). If $f$ is the difference of two convex functions, then
$$f(W(t)) = f(W(0)) + \int_0^t f'_-(W(s))\, dW(s) + \frac{1}{2} \int_{\mathbb{R}} L(t, x)\, f''(dx).$$
Recall that if $f$ is convex, its second derivative $f''$ in the sense of distributions is a positive measure. The Itô-Tanaka formula holds for any continuous semimartingale.
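Returning to Theorem 1.0.1, a quick Monte Carlo sanity check of the Lévy identity is sketched below (illustrative only; it reuses the small-window approximation of $L(t,0)$ from the previous sketch, and discretization error is ignored). The empirical quantiles of $|W(t)|$ should match those of $M(t) - W(t)$, and those of the $L(t,0)$ estimate should be close to those of $M(t)$.

```python
# Illustrative Monte Carlo check of the Levy identity: (|W(t)|, L(t,0)) and
# (M(t) - W(t), M(t)) have the same law, with M(t) = max_{0<=s<=t} W(s).
# L(t,0) is approximated by the occupation time of (-h, h) divided by 2h.
import numpy as np

rng = np.random.default_rng(2)
t, n_steps, n_paths, h = 1.0, 10_000, 1_000, 0.02   # illustrative choices
dt = t / n_steps

W = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)
W_t = W[:, -1]
M_t = np.maximum(W.max(axis=1), 0.0)                # running maximum (W(0) = 0)
L_t = (np.abs(W) < h).sum(axis=1) * dt / (2 * h)    # local time estimate

q = [0.25, 0.50, 0.75]
print("|W(t)| quantiles     :", np.round(np.quantile(np.abs(W_t), q), 3))
print("M(t) - W(t) quantiles:", np.round(np.quantile(M_t - W_t, q), 3))
print("M(t) quantiles       :", np.round(np.quantile(M_t, q), 3))
print("L(t,0) est. quantiles:", np.round(np.quantile(L_t, q), 3))
```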