"Real Analysis in Computer Science" (Fall 2013) Final Program Report Elchanan Mossel and Luca Trevisan (Organizers) Ba
Total Page:16
File Type:pdf, Size:1020Kb
"Real Analysis in Computer Science" (Fall 2013) Final Program Report Elchanan Mossel and Luca Trevisan (Organizers) Background Sophisticated tools in Real Analysis have played an increasingly central role in a number of areas in theoretical computer science over the past thirty years. As a first example, the discrete Fourier expansion can be used to analyze combinatorial problems of a discrete nature using the language and techniques of real analysis. Such Fourier analysis techniques have played a key role since the 1980s in the development of computational learning theory (PAC learning) and property testing, and, in complexity theory, they have led to circuit lower bounds and to probabilistically checkable proof (PCP) constructions and hardness of approximation results. More modern mathematical tools developed to study Banach space properties of Gaussian processes in the 1960s and 70s (in particular, the idea of hypercontractivity from functional analysis) were introduced in computer science by Kahn, Kalai and Linial in 1988. This circle of ideas played a key role in subsequent developments in coding theory, metric embeddings of graphs, and hardness of approximation of optimization problems. Finally, concepts from Gaussian geometry and Gaussian isoperimetric theory have played an implicit role in the analysis of semi-definite programs, starting with the work of Goemans and Williamson in 1995. A rigorous theory connecting optimization problems, semi-definite programs and hardness of approximation to Gaussian geometry and isoperimetry was established in the “Unique Games” framework introduced by Khot in 2002, and was further developed via the non-linear invariance principle of Mossel, O’Donnell and Oleszkiewicz (2005) and the geometry of Gaussian space. Goals The principal goal of the program was to bring together mathematicians and computer scientists to study influences, measures of complexity of discrete functions, functional inequalities, invariance principles, non-classical norms, representation theory and other modern topics in mathematical analysis, and their applications to theoretical computer science. Among the participants we were delighted to have Gil Kalai and Nati Linial, both of the Hebrew University in Jerusalem, two of the three original pioneers in introducing sophisticated analytical methods into theoretical computer science. Historically, connections between mathematicians and computer scientists working in the area were fruitful but slow to develop: often the computer scientists were unaware of the existing mathematical tools that were available, while symmetrically the mathematicians 17 often had no knowledge of the many interesting challenges coming from theoretical computer science to which their methods could potentially apply. The program thus sought to deepen and accelerate the connections between these different communities. Similarly, it was hoped that the opportunity for extended interaction and interplay between different communities within computer science (learning theory, hardness of approximation, coding theory, etc.) would lead to new and exciting developments. We shall now give some examples of how the program helped build bridges across both intellectual and geographical divides among participants. Bridging Disciplines The program enabled some key players to interact for the first time. 
Bridging Disciplines

The program enabled some key players to interact for the first time. One of the main goals of the program was to enable collaborations between mathematicians working in hypercontractivity and Gaussian geometry and computer scientists working on applications of these topics to computational problems. The first group included the distinguished mathematicians Sergey Bobkov (University of Minnesota), Michel Ledoux (University of Toulouse) and Krzysztof Oleszkiewicz (University of Warsaw), as well as junior researchers such as Steve Heilman (UCLA) and Joe Neeman (UT Austin). The second group included Johan Håstad (KTH Stockholm), Gil Kalai (Hebrew University), Subhash Khot (NYU), Guy Kindler (Hebrew University), Nati Linial (Hebrew University), Prasad Raghavendra (UC Berkeley), Alex Samorodnitsky (Hebrew University) and Luca Trevisan (Stanford) as senior participants, and Naman Agarwal (UIUC), Anindya De (UC Berkeley), Ilias Diakonikolas (University of Edinburgh), Andrew Wan (Tsinghua University) and Karl Wimmer (Duquesne University) as junior participants.

An example of an exciting result that emerged from this interaction is Joe Neeman's paper on testing surface area [62]. In this natural and fundamental problem, the goal is to determine, using a small number of queries, whether a body has "large" or "small" surface area. Using smoothness techniques from Gaussian geometry, Neeman obtained tight bounds for this problem, thereby resolving an open question posed by Ryan O'Donnell and his collaborators, who had worked on it for more than a year and obtained only partial, much weaker results. This is exactly the kind of interaction we were hoping for: the problem of testing surface area was introduced in computer science several years ago and, while the area developed quickly, computer scientists lacked the mathematical tools needed to study the problem in any dimension higher than one. O'Donnell and his collaborators were the first to realize that classical results in analysis allow one to study the problem in higher dimensions, but their results were incomplete, with a gap between the upper and lower bounds. Neeman, an expert on smoothness techniques for semigroups, realized that those techniques could be used to close this gap. Neeman's paper was accepted for presentation in May 2014 at the ACM Symposium on Theory of Computing (STOC), one of the two premier international conferences in theoretical computer science.

In the other direction, the functional J introduced by Mossel and Neeman to study robust versions of the "Majority is Stablest" theorem in theoretical computer science was explored by Michel Ledoux [56], who is considered the world's leading expert on functional inequalities. Ledoux showed how the notion of "rho-convexity" used in the Mossel-Neeman proof can be implicitly extended to derive, in an elegant and general manner, many of the central modern functional inequalities, including the hypercontractive and Brascamp-Lieb inequalities.

During the semester, research fellow Anindya De and his collaborators accelerated the development of a mathematical theory of the distributions of linear and low-degree polynomials on the discrete cube [23-25]. This work was motivated by classical applications in derandomization, such as finding deterministic algorithms for approximating the fraction of satisfying assignments of various families of formulas; a toy instance of this question is sketched below.
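The following is a minimal sketch of the kind of derandomization question just mentioned, in the simplest possible setting: a single linear threshold "formula" over uniform \(\pm 1\) inputs, where the central limit theorem already yields a deterministic estimate of the fraction of satisfying assignments, with the Berry-Esseen theorem controlling the error when no weight dominates. The weights and threshold below are illustrative choices of ours; this is not the algorithm of [23-25], which handles far more delicate settings such as low-degree polynomials and achieves optimal error rates.

import itertools
import math

def exact_fraction(w, theta):
    """Exact fraction of x in {-1,1}^n with sum_i w[i]*x[i] >= theta (brute force)."""
    n = len(w)
    count = sum(
        1
        for x in itertools.product([-1, 1], repeat=n)
        if sum(wi * xi for wi, xi in zip(w, x)) >= theta
    )
    return count / 2 ** n

def gaussian_estimate(w, theta):
    """Deterministic CLT estimate: sum_i w[i]*x[i] has mean 0 and variance
    sum_i w[i]^2, so the fraction is roughly Pr[N(0, sigma^2) >= theta]."""
    sigma = math.sqrt(sum(wi * wi for wi in w))
    t = theta / sigma
    return 0.5 * (1.0 - math.erf(t / math.sqrt(2.0)))  # Pr[N(0,1) >= t]

if __name__ == "__main__":
    # A "regular" instance: equal weights, so no coordinate dominates and
    # the Gaussian approximation is trustworthy up to the Berry-Esseen error.
    w = [1.0] * 16
    theta = 2.0
    print("exact:   ", exact_fraction(w, theta))
    print("gaussian:", gaussian_estimate(w, theta))

On such a toy instance the two numbers agree up to the Berry-Esseen error; the research program described above is, roughly, to obtain deterministic estimates of this kind for much richer classes of functions and with optimal dependence on the error parameter.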
De and his collaborators later developed these techniques further, which led in turn to the resolution of a long-standing open problem on testing monotonicity in the two-sided error model. In his research report at the end of the program, De noted the important role of discussions with the mathematicians attending the program (especially with Oleszkiewicz), which allowed him to apply the machinery of Malliavin calculus (stochastic calculus in infinite dimensions), apparently for the first time in theoretical computer science.

In a different direction, the interaction among Karl Wimmer, Yuval Filmus (Institute for Advanced Study), Guy Kindler and Elchanan Mossel (see [30-32, 73]) accelerated the development of discrete analysis on non-commutative structures. This group of researchers used sophisticated mathematics, involving among other things tools from the representation theory of the symmetric group and the theory of association schemes, to extend real analysis techniques to slices of the cube and other non-commutative settings. These developments led, inter alia, to several novel results in extremal combinatorics.

Applications of discrete analysis to several problems in hardness of approximation and property testing were studied by Johan Håstad, Prahladh Harsha (Tata Institute, India), Dominik Scheder (Shanghai Jiao Tong University), Guy Kindler, Muli Safra (Tel Aviv University) and others; see, e.g., [3, 37]. Finally, Andrew Wan, Varun Kanade (UC Berkeley) and a number of workshop participants pursued applications in learning and privacy [50, 52, 71].

Bridging Geography

Naturally, the semester also fostered opportunities for extended collaboration between theoretical computer scientists working in the same area. Many of the world's leading experts on hardness of approximation and property testing participated in the program, and multiple results were obtained by groups of experts who are geographically dispersed. The group in hardness of approximation included Prahladh Harsha (India), Johan Håstad (Sweden), Guy Kindler (Israel), Prasad Raghavendra (Berkeley) and Muli Safra (Israel), along with a similarly geographically diverse group of young participants. Similarly, Anindya De (Berkeley), Ilias Diakonikolas (UK) and Rocco Servedio (Columbia University) continued to develop aspects of their computational theory of polynomials of random variables [23-26].

Mentoring

The semester provided excellent mentoring opportunities for the ten postdoctoral Fellows and other junior participants (mainly graduate students) in the program. According to survey responses, the average degree in the collaboration graph was quite high (around 4.9; i.e., each participant collaborated with almost five others on average), with the younger participants having some of the highest degrees. In some cases, the mentoring took