Modern Discrete Probability: An Essential Toolkit


Modern Discrete Probability: An Essential Toolkit

Sébastien Roch

Manuscript version of November 20, 2014
Copyright © 2014 Sébastien Roch

Contents

Preface

I Introduction

1 Models
  1.1 Preliminaries
    1.1.1 Review of graph theory
    1.1.2 Review of Markov chain theory
  1.2 Percolation
  1.3 Random graphs
  1.4 Markov random fields
  1.5 Random walks on networks
  1.6 Interacting particles on finite graphs
  1.7 Other random structures

II Basic tools: moments, martingales, couplings

2 Moments and tails
  2.1 First moment method
    2.1.1 The probabilistic method
    2.1.2 Markov's inequality
    2.1.3 Random permutations: longest increasing subsequence
    2.1.4 Constraint satisfaction: bound on random k-SAT threshold
    2.1.5 Percolation on Z^d: existence of a phase transition
  2.2 Second moment method
    2.2.1 Chebyshev's and Paley–Zygmund inequalities
    2.2.2 Erdős–Rényi graphs: small subgraph containment
    2.2.3 Erdős–Rényi graphs: connectivity threshold
    2.2.4 Percolation on trees: branching number, weighted second moment method, and application to the critical value
  2.3 Chernoff–Cramér method
    2.3.1 Tail bounds via the moment-generating function
    2.3.2 Randomized algorithms: Johnson–Lindenstrauss, ε-nets, and application to compressed sensing
    2.3.3 Information theory: Shannon's theorem
    2.3.4 Markov chains: Varopoulos–Carne, and a bound on mixing
    2.3.5 Hoeffding's and Bennett's inequalities
    2.3.6 Knapsack: probabilistic analysis
  2.4 Matrix tail bounds
    2.4.1 Ahlswede–Winter inequality
    2.4.2 Randomized algorithms: low-rank approximations

3 Stopping times, martingales, and harmonic functions
  3.1 Background
    3.1.1 Stopping times
    3.1.2 Markov chains: exponential tail of hitting times, and some cover time bounds
    3.1.3 Martingales
    3.1.4 Percolation on trees: critical regime
  3.2 Concentration for martingales
    3.2.1 Azuma–Hoeffding inequality
    3.2.2 Method of bounded differences
    3.2.3 Erdős–Rényi graphs: exposure martingales, and application to the chromatic number
    3.2.4 Hypercube: concentration of measure
    3.2.5 Preferential attachment graphs: degree sequence
  3.3 Electrical networks
    3.3.1 Martingales and the Dirichlet problem
    3.3.2 Basic electrical network theory
    3.3.3 Bounding the effective resistance
    3.3.4 Random walk in a random environment: supercritical percolation clusters
    3.3.5 Uniform spanning trees: Wilson's method
    3.3.6 Ising model on trees: the reconstruction problem

4 Coupling
  4.1 Background
    4.1.1 Erdős–Rényi graphs: degree sequence
    4.1.2 Harmonic functions: lattices and trees
  4.2 Stochastic domination and correlation inequalities
    4.2.1 Definitions
    4.2.2 Ising model on Z^d: extremal measures
    4.2.3 Random walk on trees: speed
    4.2.4 FKG and Holley's inequalities
    4.2.5 Erdős–Rényi graphs: Janson's inequality, and application to the containment problem
    4.2.6 Percolation on Z^2: RSW theory, and a proof of Harris' theorem
  4.3 Couplings of Markov chains
    4.3.1 Bounding the mixing time via coupling
    4.3.2 Markov chains: mixing on cycles, hypercubes, and trees
    4.3.3 Path coupling
    4.3.4 Ising model: Glauber dynamics at high temperature
    4.3.5 Colorings: from approximate sampling to approximate counting
  4.4 Duality
    4.4.1 Graphical representations and coupling of initial configurations
    4.4.2 Interacting particles: voter model on the complete graph and on the line

III Further techniques

5 Branching processes
  5.1 Background
    5.1.1 Random walk in a random environment: Galton–Watson trees
  5.2 Comparison to branching processes
    5.2.1 Percolation on trees: branching process approach
    5.2.2 Randomized algorithms: height of random binary search tree
    5.2.3 Erdős–Rényi graphs: the phase transition
  5.3 Further applications
    5.3.1 Random trees: local limit of uniform random trees

6 Spectral techniques
  6.1 Bounding the mixing time via the spectral gap
    6.1.1 Markov chains: random walk on cycles and hypercubes
  6.2 Eigenvalues and isoperimetry
    6.2.1 Edge and vertex expansion
    6.2.2 Percolation on Z^d: uniqueness of the infinite cluster, and extension to transitive amenable graphs
    6.2.3 Random regular graphs: existence of expander graphs
    6.2.4 Bounding the spectral gap via the expansion constant
    6.2.5 Markov chains: random walk on cycles, hypercubes, and trees revisited
    6.2.6 Ising model: Glauber dynamics on the complete graph and expander graphs
  6.3 Further applications
    6.3.1 Markov random fields: more on the reconstruction problem
    6.3.2 Stochastic blockmodel: spectral partitioning

7 Correlation
  7.1 Correlation decay
    7.1.1 Definitions
    7.1.2 Percolation on Z^d: BK inequality, and exponential decay of the correlation length
    7.1.3 Ising model on Z^2: uniqueness
    7.1.4 Ising model on trees: non-reconstruction
    7.1.5 Weitz's computation tree
  7.2 Local dependence
    7.2.1 Lovász local lemma
    7.2.2 Chen–Stein method
    7.2.3 Occupancy problems: birthday paradox, and variants
  7.3 Poissonization
    7.3.1 Occupancy problems: species sampling
  7.4 Exchangeability
    7.4.1 Preferential attachment graphs: connectivity via Pólya urns
    7.4.2 Random partitions: size-biasing and stick-breaking, and application to cycles of random permutations

8 More on concentration and isoperimetry
  8.1 Variance bounds
    8.1.1 Efron–Stein and Poincaré inequalities
    8.1.2 Random permutations: variance of longest increasing subsequence
    8.1.3 Canonical paths and comparison
  8.2 Talagrand's inequality
    8.2.1 Proof via a log-Sobolev inequality
    8.2.2 Random permutations: concentration of longest increasing subsequence
    8.2.3 Random structures: stochastic traveling salesman
  8.3 Influence
    8.3.1 Definitions
    8.3.2 Russo's formula
    8.3.3 Ising model on trees: recursive majority
    8.3.4 Percolation on Z^2: Kesten's theorem
    8.3.5 Hypercontractivity
    8.3.6 First-passage percolation: anomalous fluctuations
  8.4 Beyond the union bound
    8.4.1 ε-nets and chaining
    8.4.2 VC-dimension

Preface

These lecture notes form the basis for a one-semester introduction to modern discrete probability, with an emphasis on essential techniques used throughout the field. The material is borrowed mostly from the following excellent texts. (Consult the bibliography for complete citations.)

[AF] David Aldous and James Allen Fill. Reversible Markov Chains and Random Walks on Graphs.
[AS11] N. Alon and J. H. Spencer. The Probabilistic Method.
[BLM13] S. Boucheron, G. Lugosi, and P. Massart. Concentration Inequalities: A Nonasymptotic Theory of Independence.
[Gri10b] G. R. Grimmett. Percolation.
[JLR11] S. Janson, T. Łuczak, and A. Ruciński. Random Graphs.
[LP] R. Lyons with Y. Peres. Probability on Trees and Networks.
[LPW06] David A. Levin, Yuval Peres, and Elizabeth L. Wilmer. Markov Chains and Mixing Times.
[MU05] Michael Mitzenmacher and Eli Upfal. Probability and Computing: Randomized Algorithms and Probabilistic Analysis.
[RAS] Firas Rassoul-Agha and Timo Seppäläinen. A Course on Large Deviations with an Introduction to Gibbs Measures.
[Ste] J. E. Steif. A Mini Course on Percolation Theory.
[vdH10] Remco van der Hofstad. Percolation and Random Graphs.
Recommended publications
  • Information Theory
    Information Theory. Gurkirat, Harsha, Parth, Puneet, Rahul, Tushant. November 8, 2015.

    1 Introduction [1]
    Information theory is a branch of applied mathematics, electrical engineering, and computer science involving the quantification of information. Firstly, it provides a general way to deal with any information, be it language, music, binary codes or pictures, by introducing a measure of information, entropy, as the average number of bits (binary, for our convenience) needed to store one symbol in a message. This abstraction of information through entropy allows us to deal with all kinds of information mathematically and apply the same principles without worrying about its type or form! It is remarkable to note that the concept of entropy was introduced by Ludwig Boltzmann in 1896 as a measure of disorder in a thermodynamic system, but serendipitously found its way to becoming a basic concept in information theory, introduced by Shannon in 1948. Information theory was developed as a way to find fundamental limits on the transfer, storage and compression of data, and has found its way into many applications including data compression and channel coding.

    2 Basic definitions
    2.1 Intuition behind "surprise" or "information" [2]
    Suppose someone tells you that the sun will come up tomorrow. This probably isn't surprising. You already knew the sun will come up tomorrow, so they didn't really give you much information. Now suppose someone tells you that aliens have just landed on Earth and have picked you to be the spokesperson for the human race. This is probably pretty surprising, and they've given you a lot of information. The rarer something is, the more you've learned if you discover that it happened.
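The entropy described above, the average number of bits per symbol, is directly computable. A minimal sketch (the function name is ours), assuming the distribution is given as a list of probabilities summing to 1:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping p = 0 terms."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per flip; a biased coin carries less,
# matching the intuition that rarer outcomes are more informative on average.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ≈ 0.469
```

The biased coin needs fewer bits per symbol on average, which is exactly the compression limit the excerpt alludes to.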
  • Perfect Filtering and Double Disjointness
    ANNALES DE L'I. H. P., SECTION B
    HILLEL FURSTENBERG, YUVAL PERES, BENJAMIN WEISS
    Perfect filtering and double disjointness
    Annales de l'I. H. P., section B, tome 31, no. 3 (1995), p. 453-465
    <http://www.numdam.org/item?id=AIHPB_1995__31_3_453_0>
    © Gauthier-Villars, 1995. Digitized as part of the program "Numérisation de documents anciens mathématiques" (http://www.numdam.org/).

    ABSTRACT. Suppose a stationary process {U_n} is used to select from several stationary processes, i.e., if U_n = i then we observe Y_n, which is the n-th variable in the i-th process. When can we recover the selecting sequence {U_n} from the output sequence {Y_n}?

    1. INTRODUCTION
    Suppose a discrete-time stationary stochastic signal {U_n}, taking integer values, is transmitted over a noisy channel.
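The selection model in the abstract is easy to simulate. A minimal sketch (all names ours), using two sources with disjoint alphabets so that the selecting sequence is trivially recoverable; the paper's question is when recovery is possible without such an obvious separation:

```python
import random

def filtered_output(select_seq, processes):
    """Observation model: at time n we see Y_n, the n-th variable of the
    process indexed by U_n."""
    return [processes[u][n] for n, u in enumerate(select_seq)]

random.seed(0)
n = 10
U = [random.randint(0, 1) for _ in range(n)]        # selecting sequence {U_n}
procs = [[random.choice("ab") for _ in range(n)],   # process 0
         [random.choice("cd") for _ in range(n)]]   # process 1
Y = filtered_output(U, procs)

# With disjoint alphabets, each Y_n reveals which process produced it.
recovered = [0 if y in "ab" else 1 for y in Y]
print(recovered == U)  # True
```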
  • Probabilistic Methods in Combinatorics
    Lecture notes (MIT 18.226, Fall 2020): Probabilistic Methods in Combinatorics. Yufei Zhao, Massachusetts Institute of Technology. [email protected], http://yufeizhao.com/pm/

    Contents
    1 Introduction
      1.1 Lower bounds to Ramsey numbers
        1.1.1 Erdős' original proof
        1.1.2 Alteration method
        1.1.3 Lovász local lemma
      1.2 Set systems
        1.2.1 Sperner's theorem
        1.2.2 Bollobás two families theorem
        1.2.3 Erdős–Ko–Rado theorem on intersecting families
      1.3 2-colorable hypergraphs
      1.4 List chromatic number of K_{n,n}
    2 Linearity of expectations
      2.1 Hamiltonian paths in tournaments
      2.2 Sum-free set
      2.3 Turán's theorem and independent sets
      2.4 Crossing number inequality
        2.4.1 Application to incidence geometry
      2.5 Dense packing of spheres in high dimensions
      2.6 Unbalancing lights
    3 Alterations
      3.1 Ramsey numbers
      3.2 Dominating set in graphs
      3.3 Heilbronn triangle problem
      3.4 Markov's inequality
      3.5 High girth and high chromatic number
      3.6 Greedy random coloring
    4 Second moment method
      4.1 Threshold functions for small subgraphs in random graphs
      4.2 Existence of thresholds
      4.3 Clique number of a random graph
      4.4 Hardy–Ramanujan theorem on the number of prime divisors
      4.5 Distinct sums
      4.6 Weierstrass approximation theorem
    5 Chernoff bound
      5.1 Discrepancy
  • Russell David Lyons
    Russell David Lyons

    Education
    Case Western Reserve University, Cleveland, OH: B.A. summa cum laude with departmental honors, May 1979, Mathematics.
    University of Michigan, Ann Arbor, MI: Ph.D., August 1983, Mathematics. Sumner Myers Award for best thesis in mathematics. Specialization: Harmonic Analysis. Thesis: A Characterization of Measures Whose Fourier-Stieltjes Transforms Vanish at Infinity. Thesis Advisers: Hugh L. Montgomery, Allen L. Shields.

    Employment
    Indiana University, Bloomington, IN: James H. Rudy Professor of Mathematics, 2014–present.
    Indiana University, Bloomington, IN: Adjunct Professor of Statistics, 2006–present.
    Indiana University, Bloomington, IN: Professor of Mathematics, 1994–2014.
    Georgia Institute of Technology, Atlanta, GA: Professor of Mathematics, 2000–2003.
    Indiana University, Bloomington, IN: Associate Professor of Mathematics, 1990–94.
    Stanford University, Stanford, CA: Assistant Professor of Mathematics, 1985–90.
    Université de Paris-Sud, Orsay, France: Assistant Associé, half-time, 1984–85.
    Sperry Research Center, Sudbury, MA: Researcher, summers 1976, 1979.
    Hampshire College Summer Studies in Mathematics, Amherst, MA: Teaching staff, summers 1977, 1978.

    Visiting Research Positions
    University of Calif., Berkeley: Visiting Miller Research Professor, Spring 2001.
    Microsoft Research: Visiting Researcher, Jan.–Mar. 2000, May–June 2004, July 2006, Jan.–June 2007, July 2008–June 2009, Sep.–Dec. 2010, Aug.–Oct. 2011, July–Oct. 2012, May–July 2013, Jun.–Oct. 2014, Jun.–Aug. 2015, Jun.–Aug. 2016, Jun.–Aug. 2017, Jun.–Aug. 2018.
    Weizmann Institute of Science, Rehovot, Israel: Rosi and Max Varon Visiting Professor, Fall 1997.
    Institute for Advanced Studies, Hebrew University of Jerusalem, Israel: Winston Fellow, 1996–97.
    Université de Lyon, France: Visiting Professor, May 1996.
    University of Wisconsin, Madison, WI: Visiting Associate Professor, Winter 1994.
  • Affine Embeddings of Cantor Sets on the Line (arXiv:1607.02849v3)
    Affine Embeddings of Cantor Sets on the Line
    Amir Algom
    August 5, 2021

    Abstract. Let s ∈ (0, 1), and let F ⊂ R be a self-similar set such that 0 < dim_H F ≤ s. We prove that there exists δ = δ(s) > 0 such that if F admits an affine embedding into a homogeneous self-similar set E and 0 ≤ dim_H E − dim_H F < δ, then (under some mild conditions on E and F) the contraction ratios of E and F are logarithmically commensurable. This provides more evidence for Conjecture 1.2 of Feng et al. (2014), which states that these contraction ratios are logarithmically commensurable whenever F admits an affine embedding into E (under some mild conditions). Our method combines an argument based on the approach of Feng et al. (2014) with a new result by Hochman (2016) related to the increase of entropy of measures under convolutions.

    1 Introduction
    Let A, B ⊂ R. We say that A can be affinely embedded into B if there exists an affine map g: R → R, g(x) = γ·x + t with γ ≠ 0 and t ∈ R, such that g(A) ⊆ B. The motivating problem of this paper is to show that if A and B are two Cantor sets and A can be affinely embedded into B, then there should be some arithmetic dependence between the scales appearing in the multiscale structures of A and B. This phenomenon is known to occur, e.g., when A and B are two self-similar sets satisfying the strong separation condition that are Lipschitz equivalent; see Falconer and Marsh (1992).
  • Characterization of Cutoff for Reversible Markov Chains
    Characterization of cutoff for reversible Markov chains
    Riddhipratim Basu*, Jonathan Hermon†, Yuval Peres‡

    Abstract. A sequence of Markov chains is said to exhibit (total variation) cutoff if the convergence to stationarity in total variation distance is abrupt. We consider reversible lazy chains. We prove a necessary and sufficient condition for the occurrence of the cutoff phenomenon in terms of concentration of hitting times of "worst" (in some sense) sets of stationary measure at least α, for some α ∈ (0, 1). We also give general bounds on the total variation distance of a reversible chain at time t in terms of the probability that some "worst" set of stationary measure at least α was not hit by time t. As an application of our techniques we show that a sequence of lazy Markov chains on finite trees exhibits cutoff iff the product of their spectral gaps and their (lazy) mixing-times tends to ∞.

    Keywords: cutoff, mixing-time, finite reversible Markov chains, hitting times, trees, maximal inequality.

    arXiv:1409.3250v4 [math.PR] 18 Feb 2015

    * Department of Statistics, UC Berkeley, California, USA. E-mail: [email protected]. Supported by UC Berkeley Graduate Fellowship.
    † Department of Statistics, UC Berkeley, USA. E-mail: [email protected].
    ‡ Microsoft Research, Redmond, Washington, USA. E-mail: [email protected].

    1 Introduction
    In many randomized algorithms, the mixing-time of an underlying Markov chain is the main component of the running-time (see [21]). We obtain a tight bound on t_mix(ε) (up to an absolute constant independent of ε) for lazy reversible Markov chains in terms of hitting times of large sets (Proposition 1.7, (1.6)).
  • 14 The Probabilistic Method
    14 The Probabilistic Method

    Probabilistic Graph Theory

    Theorem 14.1 (Szele). For every positive integer n, there exists a tournament on n vertices with at least n! · 2^{-(n-1)} Hamiltonian paths.

    Proof: Construct a tournament by randomly orienting each edge of K_n in each direction independently with probability 1/2. For any permutation σ of the vertices, let X_σ be the indicator random variable which has value 1 if σ is the vertex sequence of a Hamiltonian path, and 0 otherwise. Let X = Σ_σ X_σ be the random variable which is the total number of Hamiltonian paths. Then

        E(X) = Σ_σ E(X_σ) = n! · 2^{-(n-1)}.

    Thus, there must exist at least one tournament on n vertices which has ≥ n! · 2^{-(n-1)} Hamiltonian paths, as claimed. □

    Theorem 14.2. Every graph G has a bipartite subgraph H for which |E(H)| ≥ |E(G)|/2.

    Proof: Choose a random subset S ⊆ V(G) by independently choosing each vertex to be in S with probability 1/2. Let H be the subgraph of G containing all of the vertices, and all edges with exactly one end in S. For every edge e, let X_e be the indicator random variable with value 1 if e ∈ E(H) and 0 otherwise. If e = uv, then e will be in H if u ∈ S and v ∉ S, or if u ∉ S and v ∈ S, so E(X_e) = P(e ∈ E(H)) = 1/2, and we find

        E(|E(H)|) = Σ_{e ∈ E(G)} E(X_e) = |E(G)|/2.

    Thus, there must exist at least one bipartite subgraph H with |E(H)| ≥ |E(G)|/2. □
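The averaging step in the second proof can be checked by brute force: over all subsets S of V(G), the average number of crossing edges is |E(G)|/2. A small sketch (names ours) verifying this for K_4:

```python
import itertools

def crossing_edges(edges, S):
    """Number of edges with exactly one endpoint in S (the edges of H)."""
    return sum(1 for u, v in edges if (u in S) != (v in S))

# K4 has 6 edges; averaging |E(H)| over all 2^4 subsets S gives 6/2 = 3,
# matching E(|E(H)|) = |E(G)|/2 in the proof.
vertices = range(4)
edges = list(itertools.combinations(vertices, 2))
total, subsets = 0, 0
for r in range(5):
    for S in itertools.combinations(vertices, r):
        total += crossing_edges(edges, set(S))
        subsets += 1
print(total / subsets)  # 3.0
```

Since the average is 3, some subset must achieve at least 3 crossing edges, which is the existence claim.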
  • Amber L. Puha, PhD — Curriculum Vitae (June 2021)
    Amber L. Puha, PhD — June 2021
    Department of Mathematics, Craven Hall 6130, California State University San Marcos, 333 S. Twin Oaks Valley Road, San Marcos, CA 92096-0001
    Office Phone: 1-760-750-4201; Department Phone: 1-760-750-8059
    https://faculty.csusm.edu/apuha/ — [email protected]

    Appointments
    2010–present, Professor, Department of Mathematics, CSU San Marcos
    2020–2021, Sabbatical Leave of Absence, Research Associate and Teaching Visitor, Department of Mathematics, UCSD
    2013–2014, Sabbatical Leave of Absence, Research Associate and Teaching Visitor, Department of Mathematics, UCSD
    2010–2011, Professional Leave of Absence
    2009–2011, Associate Director, Institute for Pure and Applied Mathematics (IPAM), UCLA
    2004–2010, Associate Professor, Department of Mathematics, CSU San Marcos
    2009–2010, Professional Leave of Absence
    2005–2006, Sabbatical Leave of Absence, Research Associate and Teaching Visitor, Department of Mathematics, UCSD
    1999–2004, Assistant Professor, Department of Mathematics, CSU San Marcos
    2000–2001 & Spring 2002, Professional Leave of Absence, National Science Foundation Mathematical Sciences Postdoctoral Fellow

    Research Interests
    Probability Theory and Stochastic Processes, with emphasis on Stochastic Networks

    Professional Awards
    2021–2024, PI, National Science Foundation Single-Investigator Award, DMS-2054505, $232,433
    2020–2021, CSUSM Research, Scholarship and Creative Activity (RSCA) Grant, $7,200
    2019–2020, CSUSM Research, Scholarship and Creative Activity (RSCA) Grant, $2,000
    2019, National Scholastic Surfing Association,
  • Front Matter
    Cambridge University Press, 978-1-107-13411-9 — Fractals in Probability and Analysis, Christopher J. Bishop, Yuval Peres. Frontmatter.

    CAMBRIDGE STUDIES IN ADVANCED MATHEMATICS 162
    Editorial Board: B. BOLLOBÁS, W. FULTON, F. KIRWAN, P. SARNAK, B. SIMON, B. TOTARO

    FRACTALS IN PROBABILITY AND ANALYSIS
    A mathematically rigorous introduction to fractals which emphasizes examples and fundamental ideas. Building up from basic techniques of geometric measure theory and probability, central topics such as Hausdorff dimension, self-similar sets and Brownian motion are introduced, as are more specialized topics, including Kakeya sets, capacity, percolation on trees and the Traveling Salesman Theorem. The broad range of techniques presented enables key ideas to be highlighted, without the distraction of excessive technicalities. The authors incorporate some novel proofs which are simpler than those available elsewhere. Where possible, chapters are designed to be read independently so the book can be used to teach a variety of courses, with the clear structure offering students an accessible route into the topic.

    Christopher J. Bishop is a professor in the Department of Mathematics at Stony Brook University. He has made contributions to the theory of function algebras, Kleinian groups, harmonic measure, conformal and quasiconformal mapping, holomorphic dynamics and computational geometry.

    Yuval Peres is a Principal Researcher at Microsoft Research in Redmond, WA. He is particularly known for his research in topics such as fractals and Hausdorff measure, random walks, Brownian motion, percolation and Markov chain mixing times. In 2016 he was elected to the National Academy of Science.
  • 18.218 Probabilistic Method in Combinatorics, Topic 3: Alterations
    Lecturer: Prof. Yufei Zhao. Notes by: Andrew Lin

    3 Alterations

    Recall the naive probabilistic method: we found some lower bounds for Ramsey numbers in Section 1.1, primarily for the diagonal numbers. We did this with a basic method: color randomly, so that we color each edge red with probability p and blue with probability 1 − p. Then the probability that we see some red s-clique or blue t-clique is (by a union bound) at most

        \binom{n}{s} p^{\binom{s}{2}} + \binom{n}{t} (1-p)^{\binom{t}{2}},

    and if this is less than 1 for some p, then there exists some graph on n vertices for which there is no red K_s and no blue K_t. So we union bounded the bad events there. Well, the alteration method does a little more than that - here's a proof that mirrors that of Proposition 1.6. We again color randomly, but the idea now is to delete a vertex in every bad clique (red K_s or blue K_t). How many vertices have we deleted? We can estimate by using linearity of expectation:

    Theorem 3.1. For all p ∈ (0, 1) and n ∈ N,

        R(s, t) > n - \binom{n}{s} p^{\binom{s}{2}} - \binom{n}{t} (1-p)^{\binom{t}{2}}.

    This right-hand side begins by taking the starting number of vertices and then deletes one vertex for each bad clique. We're going to explore this idea of "fixing the blemishes" a little more.

    3.1 Dominating sets

    Definition 3.2. Given a graph G, a dominating set U is a set of vertices such that every vertex not in U has a neighbor in U.
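The alteration bound becomes a concrete Ramsey lower bound once we maximize it over n and p. A minimal sketch (function names and grid-search parameters are ours):

```python
from math import comb

def alteration_bound(n, p, s, t):
    """n minus the expected number of red K_s and blue K_t cliques:
    deleting one vertex per bad clique leaves this many vertices in expectation."""
    return (n
            - comb(n, s) * p ** comb(s, 2)
            - comb(n, t) * (1 - p) ** comb(t, 2))

def best_bound(s, t, n_max=200, grid=200):
    """R(s, t) exceeds the maximum of the bound over n and a grid of p values."""
    best = 0.0
    for n in range(2, n_max):
        for k in range(1, grid):
            best = max(best, alteration_bound(n, k / grid, s, t))
    return best

# Example: a (modest) lower bound on R(5, 5) from alteration alone.
print(int(best_bound(5, 5)))  # 10
```

For s = t the optimum sits at p = 1/2 (the clique-count term is convex in p), so the search mainly optimizes over n.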
  • Christopher J. Bishop — Curriculum Vitae (Stony Brook University)
    CHRISTOPHER J. BISHOP
    Department of Mathematics, Stony Brook University, Stony Brook, NY 11794-3651

    RESEARCH INTERESTS
    Real and complex analysis, geometric function theory, conformal dynamics, probability theory, numerical analysis, analysis on fractals, quasiconformal geometry, computational geometry. Some of my more particular interests have included: potential theory, fractal properties of harmonic measure, geometric properties of Brownian motion and other random processes, algebras generated by harmonic and holomorphic functions, geometry of hyperbolic manifolds and their covering groups, numerical computation of conformal mappings, multipole methods, optimal meshing algorithms, nonobtuse triangulation, quadrilateral meshing, iteration of entire functions and dimension-distorting properties of quasiconformal maps. I program in C, Mathematica and Matlab as part of my investigations in these areas, and teach a class on experimental aspects of mathematics.

    PROFESSIONAL EXPERIENCE
    Sept. 1997 to present: Professor at Stony Brook University.
    Sept. 1992 to Aug. 1997: Assoc. professor at Stony Brook.
    Sept. 1991 to Aug. 1992: Asst. professor at Stony Brook.
    Sept. 1988 to Aug. 1991: Hedrick Asst. professor at UCLA.
    Sept. 1987 to Aug. 1988: NSF postdoc at MSRI, Berkeley.

    EDUCATION
    Cambridge University, Master of Advanced Study (MASt), 2011.
    University of Chicago, Mathematics, Ph.D., 1987. Advisor: Peter W. Jones.
    Visiting graduate student and programmer, Dept. of Mathematics, Yale University, 1985-1987.
    University of Chicago, Mathematics, Master of Science, 1984.
    Cambridge University, Certificate of Advanced Study (Part III of Math. Tripos), 1983.
    Michigan State University, Mathematics, Bachelor of Science, 1982.

    PH.D. STUDENTS
    Zsuzsanna Gonye, Ph.D. 2001, Geodesics in hyperbolic manifolds; Assoc. Professor, University of West Hungary.
    Karyn Lundberg, Ph.D. 2005, Boundary behavior of conformal mappings; researcher, Lincoln Labs.
    Hrant Hakobyan, Ph.D.
  • Diffusive Estimates for Random Walks on Stationary Random Graphs of Polynomial Growth
    Diffusive estimates for random walks on stationary random graphs of polynomial growth
    Shirshendu Ganguly*, James R. Lee†, Yuval Peres‡

    Abstract. Let (G, ρ) be a stationary random graph, and use B^G_ρ(r) to denote the ball of radius r about ρ in G. Suppose that (G, ρ) has annealed polynomial growth, in the sense that E[|B^G_ρ(r)|] ≤ O(r^k) for some k > 0 and every r ≥ 1. Then there is an infinite sequence of times {t_n} at which the random walk {X_t} on (G, ρ) is at most diffusive: almost surely (over the choice of (G, ρ)), there is a number C > 0 such that

        E[dist_G(X_0, X_{t_n})^2 | X_0 = ρ, (G, ρ)] ≤ C t_n   for all n ≥ 1.

    This result is new even in the case when G is a stationary random subgraph of Z^d. Combined with the work of Benjamini, Duminil-Copin, Kozma, and Yadin (2015), it implies that G almost surely does not admit a non-constant harmonic function of sublinear growth. To complement this, we argue that passing to a subsequence of times {t_n} is necessary, as there are stationary random graphs of (almost sure) polynomial growth where the random walk is almost surely superdiffusive at an infinite subset of times.

    Contents
    1 Introduction
      1.1 The absence of non-constant sublinear growth harmonic functions
      1.2 Speed via Euclidean embeddings
      1.3 A deterministic example: Planar graphs
    2 Martingales, embeddings, and growth rates
      2.1 Control by martingales
      2.2 Embeddings and growth rates
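The diffusive bound E[dist_G(X_0, X_t)^2] ≤ C·t in the abstract is attained with C = 1 by simple random walk on Z, where E[X_t^2] = t exactly (the variance of t independent ±1 steps). A small sketch verifying this by exact enumeration (function name ours):

```python
import itertools

def mean_square_displacement(t):
    """Exact E[X_t^2] for simple random walk on Z started at 0, computed by
    enumerating all 2^t equally likely step sequences."""
    total = sum(sum(steps) ** 2 for steps in itertools.product((-1, 1), repeat=t))
    return total / 2 ** t

# Diffusive behaviour: mean square displacement grows linearly in t.
for t in (1, 4, 10):
    print(t, mean_square_displacement(t))  # prints t alongside E[X_t^2] = t
```

On a general stationary random graph no such exact identity is available, which is why the theorem only guarantees the diffusive bound along a subsequence of times.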