
Probability and Algorithms, Caltech CS150, Fall 2018
Leonard J. Schulman

Syllabus

"The most important questions of life are, for the most part, really only problems of probability. Strictly speaking one may even say that nearly all our knowledge is problematical; and in the small number of things which we are able to know with certainty, even in the mathematical sciences themselves, induction and analogy, the principal means for discovering truth, are based on probabilities, so that the entire system of human knowledge is connected with this theory."
Pierre-Simon Laplace, Introduction to Théorie Analytique des Probabilités. Oeuvres, t. 7. Paris, 1886, p. 5.

Class time: MWF 10:00-11:00 in Annenberg 314.

Office hours: in my office, Annenberg 317. At the beginning of the term my office hours are by appointment; after that I'll fix a regular time (but appointments will still be fine). Starting Oct 12, my office hours are on Fridays at 11:00.

TA: Jenish Mehta, [email protected]. TA office hours: on weeks that an assignment is due, M 7:00pm in Annenberg 121; on off weeks, W 7:00pm in Annenberg 107. (There is a calendar on the class web page that includes all this information.)

Problem sets are due to Jenish's mailbox by Wednesday 6:00pm. There will be problem sets, due on Wednesdays; there will not be an exam. You may collaborate with other students on the sets; just make a note of the extent of the collaboration and with whom ("truly joint work with Alice", "Bob helped me on this problem", etc.). This assumes it is a genuine collaboration and doesn't regularly become one-way; if you feel that happening, (a) focus on doing a decent fraction of the problems on your own or in consultation with me or the TA, and (b) don't collaborate until after you've already spent some time thinking about the problem yourself.

Lecture notes will be posted after the fact. The topics covered during the quarter are listed in the table of contents.
Some other topics not reached, which I'll try to cover when I add a second quarter to the course: Randomized vs. distributional complexity. Game tree evaluation: upper and lower bounds. Karger's min-cut algorithm. Hashing, AKS dictionary hashing, cuckoo hashing. Power of two choices. Talagrand concentration inequality. Linial-Saks graph partitioning. A. Kalai's sampling of random factored numbers. Feige leader election. Approximation of the permanent and self-reducibility. Equivalence of approximate counting and approximate sampling. ε-biased k-wise independent spaces. #DNF approximation. Shamir secret sharing. An interactive proof for a problem only known to be in coNP: graph non-isomorphism. Searching for the first spot where two sequences disagree. Weighted sampling (e.g., Karger network reliability). Markov chain Monte Carlo.

Notes.
(1) This course can be only an exposure to probability and its role in the theory of algorithms. We will stay focused on key ideas and examples; we will not be overly concerned with best bounds.
(2) I assume this is not your first exposure to probability. Likewise, I'll assume you have some familiarity with algorithms. However, the first lecture will start out with some basic examples and definitions.

Books. There will be no assigned book, but I recommend the following references.

On reserve at SFL:
• Mitzenmacher & Upfal, Probability and Computing, Cambridge, 2005
• Motwani & Raghavan, Randomized Algorithms, Cambridge, 1995
• Williams, Probability with Martingales, Cambridge, 1991
• Alon & Spencer, The Probabilistic Method, 4th ed., Wiley, 2016

Not on reserve:
• Adams & Guillemin, Measure Theory and Probability, Birkhäuser, 1996
• Billingsley, Probability and Measure, 3rd ed., Wiley, 1995

Contents

1 Some basic probability theory
  1.1 Lecture 1 (3/Oct): Appetizers
  1.2 Lecture 2 (5/Oct): Some basics
    1.2.1 Measure
    1.2.2 Measurable functions, random variables and events
  1.3 Lecture 3 (8/Oct): Linearity of expectation, union bound, existence theorems
    1.3.1 Countable additivity
    1.3.2 Coupon collector
    1.3.3 Application: the probabilistic method
    1.3.4 Union bound
    1.3.5 Using the union bound in the probabilistic method: Ramsey theory
  1.4 Lecture 4 (10/Oct): Upper and lower bounds
    1.4.1 Bonferroni inequalities
    1.4.2 Tail events: Borel-Cantelli
    1.4.3 B-C II: a partial converse to B-C I
  1.5 Lecture 5 (12/Oct): More on tail events: Kolmogorov 0-1, random walk
  1.6 Lecture 6 (15/Oct): More probabilistic method
    1.6.1 Markov inequality (the simplest tail bound)
    1.6.2 Variance and the Chebyshev inequality: a second tail bound
    1.6.3 Power mean inequality
    1.6.4 Large girth and large chromatic number; the deletion method
  1.7 Lecture 7 (17/Oct): FKG inequality
  1.8 Lecture 8 (19/Oct) Part I: Achieving expectation in MAX-3SAT
    1.8.1 Another appetizer
    1.8.2 MAX-3SAT
    1.8.3 Derandomization by the method of conditional expectations

2 Algebraic Fingerprinting
  2.1 Lecture 8 (19/Oct) Part II: Fingerprinting with Linear Algebra
    2.1.1 Polytime Complexity Classes Allowing Randomization
    2.1.2 Verifying Matrix Multiplication
  2.2 Lecture 9 (22/Oct): Fingerprinting with Linear Algebra
    2.2.1 Verifying Associativity
  2.3 Lecture 10 (24/Oct): Perfect matchings, polynomial identity testing
    2.3.1 Matchings
    2.3.2 Bipartite perfect matching: deciding existence
    2.3.3 Polynomial identity testing
  2.4 Lecture 11 (26/Oct): Perfect matchings in general graphs. Parallel computation. Isolating lemma
    2.4.1 Deciding existence of a perfect matching in a graph
    2.4.2 Parallel computation
    2.4.3 Sequential and parallel linear algebra
    2.4.4 Finding perfect matchings in general graphs. The isolating lemma
  2.5 Lecture 12 (29/Oct): Isolating lemma, finding a perfect matching in parallel
    2.5.1 Proof of the isolating lemma
    2.5.2 Finding a perfect matching, in RNC

3 Concentration of Measure
  3.1 Lecture 13 (31/Oct): Independent rvs, Chernoff bound, applications
    3.1.1 Independent rvs
    3.1.2 Chernoff bound for uniform Bernoulli rvs (symmetric random walk)
    3.1.3 Application: set discrepancy
    3.1.4 Entropy and Kullback-Leibler divergence
  3.2 Lecture 14 (2/Nov): Stronger Chernoff bound, applications
    3.2.1 Chernoff bound using divergence; robustness of BPP
    3.2.2 Balls and bins
    3.2.3 Preview of Shannon's coding theorem
  3.3 Lecture 15 (5/Nov): Application of large deviation bounds: Shannon's coding theorem. Central limit theorem
    3.3.1 Shannon's block coding theorem. A probabilistic existence argument
    3.3.2 Central limit theorem
  3.4 Lecture 16 (7/Nov): Application of CLT to Gale-Berlekamp. Khintchine-Kahane. Moment generating functions
    3.4.1 Gale-Berlekamp game
    3.4.2 Moment generating functions, Chernoff bound for general distributions
  3.5 Lecture 17 (9/Nov): Johnson-Lindenstrauss embedding ℓ2 → ℓ2
    3.5.1 Normed spaces
    3.5.2 JL: the original method
    3.5.3 JL: a similar, and easier to analyze, method
  3.6 Lecture 18 (12/Nov): cont. JL embedding; Bourgain embedding
    3.6.1 cont. JL
    3.6.2 Bourgain embedding X → Lp, p ≥ 1
    3.6.3 Embedding into L1
  3.7 Lecture 19 (14/Nov): cont. Bourgain embedding
    3.7.1 cont. Bourgain embedding: L1
    3.7.2 Embedding into any Lp, p ≥ 1
    3.7.3 Aside: Hölder's inequality

4 Limited independence
  4.1 Lecture 20 (16/Nov): Pairwise independence, Shannon coding theorem again, second moment inequality
    4.1.1 Improved proof of Shannon's coding theorem using linear codes
    4.1.2 Pairwise independence and the second-moment inequality
  4.2 Lecture 21 (19/Nov): G(n, p) thresholds
    4.2.1 Threshold for H as a subgraph in G(n, p)
    4.2.2 Most pairs independent: threshold for K4 in G(n, p)
  4.3 Lecture 22 (21/Nov): Concentration of the number of prime factors; begin Khintchine-Kahane for 4-wise independence
    4.3.1 4-wise independent random walk
  4.4 Lecture 23 (26/Nov): Cont. Khintchine-Kahane for 4-wise independence; begin MIS in NC
    4.4.1 Paley-Zygmund: solution through an in-probability bound
    4.4.2 Berger: a direct expectation bound
    4.4.3 cont. proof of Theorem 73
    4.4.4 Maximal Independent Set in NC
  4.5 Lecture 24 (28/Nov): Cont. MIS, begin derandomization from small sample spaces
    4.5.1 Cont. MIS
    4.5.2 Descent Processes
    4.5.3 Cont. MIS
    4.5.4 Begin derandomization from small sample spaces
  4.6 Lecture 25 (30/Nov): Limited linear independence, limited statistical independence, error correcting codes
    4.6.1 Generator matrix and parity check matrix
    4.6.2 Constructing C from M
    4.6.3 Proof of Thm (87) Part (1): Upper bound on the size of k-wise independent sample spaces
    4.6.4 Back to Gale-Berlekamp
    4.6.5 Back to MIS

5 Lovász local lemma
  5.1 Lecture 26 (3/Dec): The Lovász local lemma