LECTURE NOTES: RANDOM GRAPHS AND PERCOLATION THEORY
MATTHEW KAHLE
Introduction
These notes are based on a graduate course “Random graphs and percolation theory” at Ohio State University in Fall 2012. The notes were typed by several participants in the course. In particular, many thanks go to: Huseyin Acan, Charles Baker, Chris Kennedy, Jung Eun Kim, Yoora Kim, Greg Malen, Fatih Olmez, Kyle Parsons, Dan Poole, and Younghwan Son. Disclaimer: we have not had a chance to carefully edit all these notes yet, so please proceed with caution!
Date: January 1, 2013.
1. (Wednesday, August 22)
This course will be about various types of random graphs. In today’s class, we partly motivate this study with a discussion of “the probabilistic method”. This method is now essential in many areas of mathematics, and Ramsey theory is an important classical application. The function R(s, t) is defined to be the smallest number N such that for every coloring of the edges of the complete graph K_N with red and blue, there is either a red K_s or a blue K_t subgraph. For example, R(3, 3) = 6. For large s and t there is little hope for an exact formula for R(s, t), but we might at least hope to understand the asymptotics. An easy induction argument shows that R(s, t) ≤ R(s − 1, t) + R(s, t − 1). Together with the fact that R(s, 1) = R(1, t) = 1, this gives by induction that
R(s, t) ≤ \binom{s+t}{s}.
Exercise 1.1 (easy). Show that this implies that R(s, s) ≤ 4^s. Better yet, show that R(s, s) = O(4^s/√s).
Exercise 1.2 (open). Improve the base of the exponent 4, or else show that it cannot be improved. E.g. prove or disprove that R(s, s) = O(3.999^s).
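The recursion and the binomial bound above can be tabulated numerically. Here is a minimal Python sketch (the function name is my own, not from the notes):

```python
from functools import lru_cache
from math import comb

# Upper bounds from the recursion R(s, t) <= R(s-1, t) + R(s, t-1),
# with the base cases R(s, 1) = R(1, t) = 1.
@lru_cache(maxsize=None)
def ramsey_upper(s, t):
    if s == 1 or t == 1:
        return 1
    return ramsey_upper(s - 1, t) + ramsey_upper(s, t - 1)

# The induction in the text bounds this by the binomial coefficient
# C(s+t, s), which on the diagonal is at most 4^s.
for s in range(1, 10):
    assert ramsey_upper(s, s) <= comb(2 * s, s) <= 4 ** s
```

Note that `ramsey_upper(3, 3)` evaluates to 6, matching R(3, 3) = 6; for larger s the recursion only gives an exponentially weak upper bound.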
The best upper bound I know of is due to David Conlon [4].
Theorem 1.3 (Conlon, 2009). There exists a constant c such that
R(s + 1, s + 1) ≤ s^{−c log s/ log log s} \binom{2s}{s}.
For lower bounds we apply the probabilistic method. Taking a random coloring of the edges gives the following bound.
Theorem 1.4 (Erdős). If
\binom{n}{k} 2^{1−\binom{k}{2}} < 1
then R(k, k) > n.
The proof is a simple union bound: color the edges of K_n red and blue uniformly at random; the expected number of monochromatic copies of K_k is \binom{n}{k} 2^{1−\binom{k}{2}}, so when this is less than 1, some coloring has no monochromatic K_k.
Exercise 1.5. Show as a corollary that R(s, s) ≥ (√2)^s.
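To see what the union bound gives concretely, the following sketch finds, for a given k, the largest n that Theorem 1.4 certifies (the helper names are illustrative, not standard):

```python
from math import comb, isqrt

# Theorem 1.4 certifies R(k, k) > n whenever
# C(n, k) * 2^(1 - C(k, 2)) < 1  (a union bound over vertex k-sets).
def erdos_condition(n, k):
    return comb(n, k) * 2 ** (1 - comb(k, 2)) < 1

# Largest n certified by the theorem for a given k; the condition is
# monotone in n, so a simple linear scan suffices.
def best_lower_bound(k):
    n = k
    while erdos_condition(n + 1, k):
        n += 1
    return n

# The corollary R(s, s) >= 2^(s/2): floor(2^(s/2)) = isqrt(2^s)
# satisfies the condition, so the certified bound is at least that.
for k in range(3, 12):
    assert best_lower_bound(k) >= isqrt(2 ** k)
```

For example, `best_lower_bound(3)` returns 3, far below the true value R(3, 3) = 6; the union bound only becomes strong asymptotically.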
The proof is easy but the technique is still powerful. In fact the following is wide open.
Exercise 1.6 (open). Give an explicit description of any sequence of graphs which gives an exponential lower bound on R(s, s).
This can be rephrased as follows.
Exercise 1.7 (open). Give an explicit infinite sequence of graphs {G_n}_{n=1}^∞, so that G_n has n vertices and such that the largest clique and the largest independent set in G_n are both of order O(log n).
It seems strange that although asymptotically almost every graph according to the measure G(n, 1/2) has this property,¹ it is hard to exhibit any explicit examples. This is sometimes referred to as the problem of finding hay in a haystack.
Exercise 1.8 (open). Improve the base of the exponent √2, or else show that it cannot be improved. E.g. prove or disprove that R(s, s) = Ω(1.415^s).
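For small n, the property in Exercise 1.7 can be observed experimentally: a sample from G(n, 1/2) typically has clique number and independence number of order log n. A brute-force sketch (helper names are mine, and this is exponential-time, so only usable for small n):

```python
import random
from itertools import combinations

# Brute-force clique number: the largest k admitting a k-set of
# vertices all of whose pairs appear in the edge set.
def clique_number(n, edges):
    best = 1 if n > 0 else 0
    for k in range(2, n + 1):
        has_k_clique = any(
            all(frozenset((u, v)) in edges for u, v in combinations(S, 2))
            for S in combinations(range(n), k))
        if not has_k_clique:
            break
        best = k
    return best

# Sample G(n, 1/2); independent sets are cliques in the complement.
rng = random.Random(0)
n = 16
all_pairs = {frozenset((i, j)) for i, j in combinations(range(n), 2)}
edges = {e for e in all_pairs if rng.random() < 0.5}
omega = clique_number(n, edges)               # clique number
alpha = clique_number(n, all_pairs - edges)   # independence number
print(omega, alpha)  # both of order 2*log2(n) for typical samples
```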
The main source for this lecture was Chapter 1 of Alon and Spencer’s book [1].
¹In the next lecture we will define the binomial random graph G(n, p).
2. (Friday, August 24)
Definition 2.1. For a natural number n and 0 ≤ p ≤ 1, an Erdős–Rényi graph G(n, p) is defined to be a graph on n vertices where each pair of vertices is joined by an edge with probability p, with all edges chosen mutually independently.
Note that if G is any particular graph on n vertices, then the probability of obtaining G is p^{e_G}(1 − p)^{\binom{n}{2} − e_G}, where e_G denotes the number of edges of G.
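A minimal sampler makes the definition and this probability computation concrete (function names are mine, not from any standard library):

```python
import random
from math import comb

# Sample G(n, p): each of the C(n, 2) possible edges is included
# independently with probability p.
def sample_gnp(n, p, rng=None):
    rng = rng or random.Random()
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if rng.random() < p}

# Probability of obtaining one particular labeled graph with e_G edges:
# p^(e_G) * (1 - p)^(C(n, 2) - e_G).
def graph_probability(n, p, e_G):
    return p ** e_G * (1 - p) ** (comb(n, 2) - e_G)

# With p = 1/2 every labeled graph on n vertices is equally likely,
# each with probability 2^(-C(n, 2)).
assert graph_probability(4, 0.5, 3) == 2 ** -comb(4, 2)
```

As a sanity check, summing graph_probability over all edge counts, weighted by the number of graphs with that many edges, gives total probability 1.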