56th Annual Report
Total pages: 16. File type: PDF, size: 1020 KB.
Recommended publications
Facilitating Bayesian Continual Learning by Natural Gradients and Stein Gradients
Facilitating Bayesian Continual Learning by Natural Gradients and Stein Gradients
Yu Chen (University of Bristol, [email protected]), Tom Diethe (Amazon, [email protected]), Neil Lawrence (Amazon, [email protected])

Abstract: Continual learning aims to enable machine learning models to learn a general solution space for past and future tasks in a sequential manner. Conventional models tend to forget the knowledge of previous tasks while learning a new task, a phenomenon known as catastrophic forgetting. When using Bayesian models in continual learning, knowledge from previous tasks can be retained in two ways: (i) posterior distributions over the parameters, containing the knowledge gained from inference in previous tasks, which then serve as the priors for the following task; (ii) coresets, containing knowledge of the data distributions of previous tasks. Here, we show that Bayesian continual learning can be facilitated in terms of these two means through the use of natural gradients and Stein gradients respectively.

1 Background: There are several existing approaches for preventing catastrophic forgetting of regular (non-Bayesian) Neural Networks (NNs) by constructing a regularization term from parameters of previous tasks, such as Elastic Weight Consolidation (EWC) [1] and Synaptic Intelligence (SI) [2]. In the Bayesian setting, Variational Continual Learning (VCL) [3] proposes a framework that makes use of Variational Inference (VI):

$\mathcal{L}_{\mathrm{VCL}}(\theta) = \mathbb{E}_{q_t(\theta)}\left[\log p(\mathcal{D}_t \mid \theta)\right] - \mathrm{KL}\left(q_t(\theta)\,\|\,q_{t-1}(\theta)\right) \qquad (1)$

The objective function is as in Equation (1), where t is the index of tasks, q_t(theta) represents the approximated posterior of the parameters theta of task t, and D_t is the data of task t.
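Equation (1) is an evidence lower bound in which the prior has been replaced by the previous task's approximate posterior. As a minimal sketch of how that objective can be evaluated, the code below assumes a fully factorized Gaussian q_t and a user-supplied log-likelihood, and estimates the expectation with reparameterized Monte Carlo samples; the function names, the toy model, and the single-script structure are illustrative assumptions, not the authors' implementation (which additionally uses natural and Stein gradients).

```python
import numpy as np

def kl_diag_gaussians(mu_q, log_sig_q, mu_p, log_sig_p):
    # KL(q || p) between two diagonal Gaussians, summed over dimensions.
    var_q, var_p = np.exp(2 * log_sig_q), np.exp(2 * log_sig_p)
    return np.sum(log_sig_p - log_sig_q
                  + (var_q + (mu_q - mu_p) ** 2) / (2 * var_p) - 0.5)

def vcl_objective(mu_t, log_sig_t, mu_prev, log_sig_prev,
                  log_lik, data_t, n_samples=10, rng=None):
    # Monte Carlo estimate of L_VCL = E_q[log p(D_t | theta)] - KL(q_t || q_{t-1}),
    # using reparameterized samples theta = mu + sigma * eps.
    rng = np.random.default_rng() if rng is None else rng
    expected_ll = 0.0
    for _ in range(n_samples):
        theta = mu_t + np.exp(log_sig_t) * rng.standard_normal(mu_t.shape)
        expected_ll += log_lik(theta, data_t) / n_samples
    return expected_ll - kl_diag_gaussians(mu_t, log_sig_t, mu_prev, log_sig_prev)

# Toy usage: a one-dimensional Gaussian-mean model, log p(D_t | theta) up to a constant.
data_t = np.random.default_rng(0).normal(2.0, 1.0, size=50)
log_lik = lambda theta, d: -0.5 * np.sum((d - theta[0]) ** 2)
value = vcl_objective(np.array([1.5]), np.array([-1.0]),
                      np.array([0.0]), np.array([0.0]), log_lik, data_t)
print(value)
```

In continual learning the pair (mu_prev, log_sig_prev) would be the optimum found on task t-1, so the KL term anchors the new posterior to what was already learned.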
Rules for Candidates Wishing to Apply for a Two Year Marshall Scholarship Only
RULES FOR CANDIDATES WISHING TO APPLY FOR A TWO YEAR MARSHALL SCHOLARSHIP ONLY

Marshall Scholarships finance young Americans of high ability to study for a degree in the United Kingdom in a system of higher education recognised for its excellence. Founded by a 1953 Act of Parliament, Marshall Scholarships are mainly funded by the Foreign, Commonwealth and Development Office and commemorate the humane ideals of the Marshall Plan conceived by General George C Marshall. They express the continuing gratitude of the British people to their American counterparts.

The objectives of the Marshall Scholarships are:
• To enable intellectually distinguished young Americans, their country's future leaders, to study in the UK.

GENERAL 2022
1. Up to fifty Marshall Scholarships will be awarded in 2022. They are tenable at any British university and for study in any discipline at graduate level, leading to the award of a British university degree. Conditions governing One Year Scholarships are set out in a separate set of Rules.
2. Candidates are invited to indicate two preferred universities, although the Marshall Commission reserves the right to decide on final placement. Expressions of interest in studying at universities other than Oxford, Cambridge and London are particularly welcomed. Candidates are especially encouraged to consider the Marshall Partnership Universities. A course search facility is available here: https://www.marshallscholarship.org/study-in-the-uk/course-search
NB: The selection of Scholars is based on our published criteria: https://www.marshallscholarship.org/apply/criteria-and-who-is-eligible. This includes, under the academic criteria, a range of factors, including a candidate's choice of course, choice of university, and academic and personal aptitude.
BXAO Cat 1971.Pdf
SOUTHWESTERN AT OXFORD: Britain in the Renaissance. A Course of Studies in the Arts, Literature, History, and Philosophy of Great Britain. July 4 through August 15, 1971, University College, Oxford University.

OFFICERS AND TUTORS
President: John Henry Davis, A.B., University of Kentucky; B.A. and M.A., Oxford University; Ph.D., University of Chicago.
Dean: Yerger Hunt Clifton, B.A., Duke University; M.A., University of Virginia; Ph.D., Trinity College, Dublin.
Tutors: George Marshall Apperson, Jr., B.S., Davidson College; B.D., Th.M., Th.D., Union Theological Seminary, Virginia. Mary Ross Burkhart, B.A., University of Virginia; M.A., University of Tennessee. James William Jobes, B.A., St. John's College, Annapolis; Ph.D., University of Virginia. James Edgar Roper, B.A., Southwestern At Memphis; B.A. and M.A., Oxford University; M.A., Yale University.

UNIVERSITY COLLEGE, OXFORD UNIVERSITY
Master: Redcliffe-Maud of Bristol, The Right Honourable John Primatt Redcliffe, Baron, M.A.
Dean: John Leslie Mackie, M.A.
Librarian: Peter Charles Bayley, M.A.
Chaplain: David John Burgess, M.A.
Domestic Bursar: Vice Admiral Sir Peter William Gretton, M.A.

University College is officially a Royal Foundation, and the Sovereign is its Visitor. Its right to this dignity, based on medieval claims that it was founded by King Alfred the Great, has twice been asserted, by King Richard II in 1380 and by the Court of King's Bench in 1726. In fact, the college owes its origin to William of Durham, who died in 1249 and bequeathed 310 marks, the income from which was to be employed to maintain 10 or more needy Masters of Arts studying divinity.
Covariances, Robustness, and Variational Bayes
Journal of Machine Learning Research 19 (2018) 1-49. Submitted 11/17; Revised 7/18; Published 8/18.

Covariances, Robustness, and Variational Bayes
Ryan Giordano ([email protected]), Department of Statistics, UC Berkeley, 367 Evans Hall, Berkeley, CA 94720
Tamara Broderick ([email protected]), Department of EECS, MIT, 77 Massachusetts Ave., 38-401, Cambridge, MA 02139
Michael I. Jordan ([email protected]), Department of Statistics and EECS, UC Berkeley, 367 Evans Hall, Berkeley, CA 94720
Editor: Mohammad Emtiyaz Khan

Abstract: Mean-field Variational Bayes (MFVB) is an approximate Bayesian posterior inference technique that is increasingly popular due to its fast runtimes on large-scale data sets. However, even when MFVB provides accurate posterior means for certain parameters, it often mis-estimates variances and covariances. Furthermore, prior robustness measures have remained undeveloped for MFVB. By deriving a simple formula for the effect of infinitesimal model perturbations on MFVB posterior means, we provide both improved covariance estimates and local robustness measures for MFVB, thus greatly expanding the practical usefulness of MFVB posterior approximations. The estimates for MFVB posterior covariances rely on a result from the classical Bayesian robustness literature that relates derivatives of posterior expectations to posterior covariances and includes the Laplace approximation as a special case. Our key condition is that the MFVB approximation provides good estimates of a select subset of posterior means—an assumption that has been shown to hold in many practical settings. In our experiments, we demonstrate that our methods are simple, general, and fast, providing accurate posterior uncertainty estimates and robustness measures with runtimes that can be an order of magnitude faster than MCMC.
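The mechanism behind the "simple formula" in the abstract is local sensitivity analysis at a variational optimum: by the implicit function theorem, the derivative of the optimal variational parameters with respect to a perturbation parameter is a Hessian solve against a cross-derivative. The sketch below illustrates that generic recipe on a toy objective with finite differences and checks it against brute-force re-optimization; it is a schematic stand-in under assumed toy quantities, not the paper's MFVB covariance or robustness formulas.

```python
import numpy as np
from scipy.optimize import minimize

def objective(eta, eps):
    # Toy stand-in for a variational (KL-like) objective in parameters eta,
    # with a scalar model perturbation eps that moves the optimum.
    return (eta[0] - 1.0 - eps) ** 2 + 0.5 * (eta[1] + 2.0 * eps * eta[0]) ** 2

def argmin(eps):
    # Brute force: re-optimize from scratch at a given perturbation.
    return minimize(lambda e: objective(e, eps), np.zeros(2),
                    method="BFGS", options={"gtol": 1e-10}).x

def fd_grad(f, x, h=1e-5):
    # Central finite-difference gradient of a scalar function.
    g = np.zeros_like(x)
    for i in range(len(x)):
        xp, xm = x.copy(), x.copy()
        xp[i] += h
        xm[i] -= h
        g[i] = (f(xp) - f(xm)) / (2 * h)
    return g

eta_star = argmin(0.0)
grad_eta = lambda e, eps: fd_grad(lambda x: objective(x, eps), e)

# Hessian in eta (finite differences of the gradient) and the eps cross-derivative.
H = np.column_stack([(grad_eta(eta_star + dx, 0.0) - grad_eta(eta_star - dx, 0.0)) / 2e-5
                     for dx in (np.array([1e-5, 0.0]), np.array([0.0, 1e-5]))])
cross = (grad_eta(eta_star, 1e-5) - grad_eta(eta_star, -1e-5)) / 2e-5

# Implicit function theorem: d(eta*)/d(eps) = -H^{-1} d(grad)/d(eps).
sens_ift = -np.linalg.solve(H, cross)

# Compare with re-optimizing at perturbed eps (what the formula lets us avoid).
h = 1e-3
sens_direct = (argmin(h) - argmin(-h)) / (2 * h)
print(sens_ift, sens_direct)  # both should be close to [1, -2]
```

The same pattern, with autodiff in place of finite differences, is what makes these sensitivity and covariance corrections cheap relative to re-running inference.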
VIRTUAL ASPIRE 2021: Building Success Through the Liberal Arts
COLLEGE OF ARTS, HUMANITIES, AND SOCIAL SCIENCES, WILLIAM PATERSON UNIVERSITY, PRESENTS VIRTUAL ASPIRE 2021: Building Success Through the Liberal Arts

Vision Statement: The goal of the Aspire program is to empower students to appreciate, articulate, and leverage the intellectual skills, knowledge, and dispositions unique to a liberal arts education in the service of their personal and professional development. Participants will learn to convey the core values and strengths of their degree program, identify career paths that may connect to that program, and prepare themselves to further pursue passions and opportunities upon completing their degrees. Thank you to Boston College, Endeavor: The Liberal Arts Advantage for Sophomores, for inspiration and activity ideas.

Contents: Schedule Overview; CoAHSS; Dean's Advisory Board; Guest Speakers; Campus Resources; Thank You. Connect with Us! @WPCOAHSS. "What we think, we become." (Buddha)

Schedule Overview
In-Person Evening Program: Monday, August 2nd, Student Center, Rm. 211
5:30pm-6:30pm: Welcome: Program Overview/Introduction. Speakers: Dr. Wartyna Davis, Dean, College of Arts, Humanities, and Social Science; Dr. Joshua Powers, Provost and Senior Vice President, William Paterson University; Valerie Gross, Dean's Advisory Board Chair; selected student from Aspire 2020, Zhakier Seville. Reception: light refreshments.
Virtual Day One: Tuesday, August 3rd, 9:00am to 2:35pm
9:00-9:05am: Welcome: Dr. Ian Marshall and Lauren Agnew
9:05am-10:00am: Virtual Workshops: Career Foundations Group A: The Liberal Arts Advantage: Understanding Yourself through the Strong Interest Inventory Assessment with Ms.
Fellowships Flowchart FA17
A unit of Undergraduate Studies · WWW.FELLOWSHIPS.KU.EDU · A service for all KU undergraduates

If you are / with a GPA / interested in funding for / consider applying for (details on back):

4th/5th Year
• 3.7+ GPA, Graduate Study: Rhodes, Marshall, Mitchell, Gates-Cambridge, Soros, Schwarzman
• 3.7+ GPA, Science/Engineering: also Churchill, NSF Graduate Research Fellowship
• 3.7+ GPA, Public Service: Knight-Hennessy Scholars
• 3.2+ GPA, International/Language: Boren, CLS, DAAD, Fulbright, Gilman
• 3.2+ GPA, Public Service: Carnegie, Pickering
• 3.2+ GPA, Science/Engineering/SocSci: NSF Graduate Research Fellowship

3rd Year
• 3.7+ GPA, Graduate Study: Rhodes, Marshall, Mitchell, Gates-Cambridge, Schwarzman
• 3.7+ GPA, Science/Engineering: also Astronaut Scholarship, Goldwater Scholarship, Churchill
• 3.7+ GPA, Public Service: Truman Scholarship (3.5+)
• 3.2+ GPA, Public Service: Pickering, Udall Scholarship
• 3.2+ GPA, International/Language: Boren, CLS, DAAD, Gilman

2nd Year
• 3.7+ GPA, International/Language: Fulbright Scholarship
• 3.7+ GPA, Science/Engineering: Astronaut Scholarship, Goldwater Scholarship
• 3.2+ GPA, International/Language: Boren, CLS, DAAD, Fulbright UK Summer (3.5+), Gilman

1st Year
• 3.2+ GPA, International/Language: Boren, CLS, Fulbright UK Summer (3.5+), Gilman

1506 Engel Road • Lawrence, KS 66045-3845 • 785-864-4225

KU National Fellowship Advisors: ✪ Michele Arellano, Office of Study Abroad, Boren and Gilman, [email protected]; ❖ Rachel Johnson, Office of International Programs, Fulbright Programs, [email protected]; ★ Anne Wallen, Office of Fellowships, campus coordinator for ★ awards.

WE GUIDE STUDENTS through the process of applying for nationally and internationally competitive fellowships and scholarships. Starting with information sessions, workshops and early drafts of essays, through application submission and interview preparation, we are here to help you succeed. WWW.FELLOWSHIPS.KU.EDU

Knight-Hennessy Scholars: Full funding for graduate study at Stanford University. Applicants should demonstrate leadership, civic commitment, and want to join a
2018 Annual Report, Alfred P. Sloan Foundation
2018 Annual Report, Alfred P. Sloan Foundation

Contents: Preface; Mission Statement; From the President; The Year in Discovery; About the Grants Listing; 2018 Grants by Program; 2018 Financial Review; Audited Financial Statements and Schedules; Board of Trustees; Officers and Staff; Index of 2018 Grant Recipients.

Cover: The Sloan Foundation Telescope at Apache Point Observatory, New Mexico as it appeared in May 1998, when it achieved first light as the primary instrument of the Sloan Digital Sky Survey. An early set of images is shown superimposed on the sky behind it. (Credit: Dan Long, Apache Point Observatory)

Preface: The Alfred P. Sloan Foundation administers a private fund for the benefit of the public. It accordingly recognizes the responsibility of making periodic reports to the public on the management of this fund. The Foundation therefore submits this public report for the year 2018.

Mission Statement: The Alfred P. Sloan Foundation makes grants primarily to support original research and education related to science, technology, engineering, mathematics, and economics. The Foundation believes that these fields—and the scholars and practitioners who work in them—are chief drivers of the nation's health and prosperity. The Foundation also believes that a reasoned, systematic understanding of the forces of nature and society, when applied inventively and wisely, can lead to a better world for all.

From the President: Adam F.
Martin Hairer: Fields Medalist
Volume 43 • Issue 6, IMS Bulletin, September 2014

Martin Hairer: Fields Medalist

The Fields Medals are the most prestigious awards in the field of mathematics, awarded every four years by the International Mathematical Union. Among this year's four recipients is IMS member Martin Hairer (University of Warwick, UK), who delivered a Medallion lecture at the IMS annual meeting in Sydney in July. [Photos: Martin Hairer (credit: Klaus Tschira Stiftung/Peter Badge) and the Fields Medal.]

Ofer Zeitouni explains some of the background to Martin's work: Martin Hairer of the University of Warwick is one of the four recipients of the 2014 Fields medal, awarded on August 13 during the International Congress of Mathematicians in Seoul. The citation reads: Martin Hairer is awarded a Fields Medal for his outstanding contributions to the theory of stochastic partial differential equations, and in particular for the creation of a theory of regularity structures for such equations. For probabilists and statisticians, the solution of a stochastic ordinary differential equation involves Itô's theory; while extremely powerful and useful, Itô's method is crucially based on the martingale property of Brownian motion and does not generalize well to situations where the noise depends both on time and space.

CONTENTS: Fields Medal: Martin Hairer; Members' News: Michael Jordan, Susie Bayarri, Terry Speed, Karen Kafadar, John Stufken, ISBA Fellows, ASA Fellows; Annual Meeting photos; IMS student members win Data Mining Cup; Hadley Wickham: How are Data Science and Statistics different?; Student Puzzle Corner 5; Robert Adler: TOPOS part 2
Dartmouth College • Fellowship Advising
DARTMOUTH COLLEGE • FELLOWSHIP ADVISING
NATIONAL SCHOLARSHIPS AND FELLOWSHIPS BY YEAR OF APPLICATION

Sophomores
Critical Language Scholarship
Fulbright Summer Institute (programs at UK universities)
Gilman Scholarship (study abroad programs)
Goldwater Scholarship (research careers in STEM fields)
Boren/NSEP Scholarship for Study Abroad
Udall Scholarship (careers in the environment, Native health care, or tribal public policy)

Juniors
Critical Language Scholarship
Gilman Scholarship (study abroad programs)
Goldwater Scholarship (research careers in STEM fields)
Udall Scholarship (careers in the environment, Native health care, or tribal public policy)
Truman Scholarship (careers in public service)
Beinecke Scholarship (graduate study in the arts, social sciences or humanities)
Boren/NSEP Scholarship for Study Abroad
Pickering Fellowship (careers in the Foreign Service)

Seniors and Alumni (Study Abroad)
Rhodes Scholarship (study at Oxford)
Marshall Scholarship (study in the UK)
Mitchell Scholarship (study in Ireland)
Fulbright Research and Teaching Assistantship (study or teaching in approx. 140 countries)
Churchill Scholarship (study at Cambridge, STEM fields)
DAAD Scholarship (study in Germany)
Gates Cambridge Scholarship (study at Cambridge)
Gilman Scholarship (study abroad programs)
Keasbey Scholarship (study at Oxford, Cambridge, Edinburgh, or Wales)
Luce Scholarship (internships in East Asia, any field except Asian specialties)
Schwarzman Scholars Program (at Tsinghua University in Beijing)
St. Andrew's Society Graduate Scholarship
Parallel Streaming Wasserstein Barycenters
Parallel Streaming Wasserstein Barycenters
Matthew Staib, Sebastian Claici, Justin Solomon, and Stefanie Jegelka
Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology
{mstaib, sclaici, jsolomon, stefje}@mit.edu

Abstract: Efficiently aggregating data from different sources is a challenging problem, particularly when samples from each source are distributed differently. These differences can be inherent to the inference task or present for other reasons: sensors in a sensor network may be placed far apart, affecting their individual measurements. Conversely, it is computationally advantageous to split Bayesian inference tasks across subsets of data, but data need not be identically distributed across subsets. One principled way to fuse probability distributions is via the lens of optimal transport: the Wasserstein barycenter is a single distribution that summarizes a collection of input measures while respecting their geometry. However, computing the barycenter scales poorly and requires discretization of all input distributions and the barycenter itself. Improving on this situation, we present a scalable, communication-efficient, parallel algorithm for computing the Wasserstein barycenter of arbitrary distributions. Our algorithm can operate directly on continuous input distributions and is optimized for streaming data. Our method is even robust to nonstationary input distributions and produces a barycenter estimate that tracks the input measures over time. The algorithm is semi-discrete, needing to discretize only the barycenter estimate. To the best of our knowledge, we also provide the first bounds on the quality of the approximate barycenter as the discretization becomes finer. Finally, we demonstrate the practical effectiveness of our method, both in tracking moving distributions on a sphere, as well as in a large-scale Bayesian inference task.
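The paper's contribution is a parallel, streaming, semi-discrete algorithm; as a much smaller illustration of what a Wasserstein barycenter is, the sketch below computes an entropy-regularized barycenter of histograms on a shared fixed grid using iterative Bregman (Sinkhorn-style) projections. The grid, cost matrix, regularization strength, and input histograms are illustrative assumptions, and this fixed-support method is not the authors' algorithm.

```python
import numpy as np

def sinkhorn_barycenter(hists, cost, weights, reg=1e-2, n_iter=200):
    # Entropy-regularized Wasserstein barycenter of histograms on a shared grid,
    # via iterative Bregman projections (Benamou et al., 2015).
    K = np.exp(-cost / reg)                 # Gibbs kernel
    m, n = len(hists), cost.shape[0]
    u = np.ones((m, n))
    bary = np.full(n, 1.0 / n)
    for _ in range(n_iter):
        # Match each coupling's column marginal to its input histogram.
        v = np.stack([hists[k] / (K.T @ u[k]) for k in range(m)])
        # Current row marginals of the couplings.
        rows = np.stack([u[k] * (K @ v[k]) for k in range(m)])
        # Barycenter = weighted geometric mean of the row marginals.
        bary = np.exp(np.sum(weights[:, None] * np.log(rows + 1e-300), axis=0))
        # Rescale rows so every coupling shares the barycenter marginal.
        u = np.stack([bary / (K @ v[k]) for k in range(m)])
    return bary

# Illustrative usage: two 1-D Gaussian histograms on a grid of 200 points.
x = np.linspace(0.0, 1.0, 200)
cost = (x[:, None] - x[None, :]) ** 2       # squared-distance ground cost
g = lambda mu, s: np.exp(-0.5 * ((x - mu) / s) ** 2)
hists = np.stack([h / h.sum() for h in (g(0.25, 0.05), g(0.75, 0.05))])
bary = sinkhorn_barycenter(hists, cost, np.array([0.5, 0.5]))
# With equal weights the barycenter mass concentrates near x = 0.5.
```

The cost of this fixed-grid approach grows with the discretization of every input measure, which is exactly the scaling problem the paper's semi-discrete, streaming formulation is designed to avoid.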
Quantifying Uncertainty and Robustness at Scale (Tamara Broderick, ITT Career Development Assistant Professor, [email protected])
Bayesian Machine Learning: Quantifying uncertainty and robustness at scale
Tamara Broderick, ITT Career Development Assistant Professor, [email protected]
Raj Agrawal, Trevor Campbell, Lorenzo Masoero, Will Stephenson

Microcredit Experiment (simplified from Meager (2016)) [map: amcharts.com 2016]
• 7 sites with microcredit trials (in Mexico, Mongolia, Bosnia, India, Morocco, Philippines, Ethiopia)
• ~900 to ~17K businesses at each site
• Q: how much does microcredit increase business profit? (effect τ)
• Desiderata:
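To make the estimation problem concrete, here is a deliberately simplified sketch: a normal-normal hierarchical model (in the style of the classic eight-schools example) fit with a short Gibbs sampler to made-up site-level effect estimates. The numbers, priors, and model structure are illustrative assumptions only, not Meager's analysis or the model discussed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up site-level estimates of the microcredit effect on profit (7 sites)
# and their standard errors; a real analysis works from the business-level data.
y = np.array([-1.0, 2.5, 0.3, 1.8, -0.5, 0.9, 4.0])
se = np.array([2.0, 1.5, 0.8, 1.2, 2.5, 0.7, 3.0])
K = len(y)

# Model: y_k ~ N(tau_k, se_k^2), tau_k ~ N(mu, sigma^2),
# flat prior on mu, Inverse-Gamma(a0, b0) prior on sigma^2.
a0, b0 = 2.0, 2.0
n_draws = 5000
mu, sigma2 = 0.0, 1.0
tau = y.copy()
mu_draws = np.empty(n_draws)

for s in range(n_draws):
    # tau_k | rest: precision-weighted combination of y_k and mu.
    prec = 1.0 / se**2 + 1.0 / sigma2
    tau = rng.normal((y / se**2 + mu / sigma2) / prec, np.sqrt(1.0 / prec))
    # mu | rest: normal around the mean of the site effects.
    mu = rng.normal(tau.mean(), np.sqrt(sigma2 / K))
    # sigma^2 | rest: conjugate Inverse-Gamma update.
    b_n = b0 + 0.5 * np.sum((tau - mu) ** 2)
    sigma2 = 1.0 / rng.gamma(a0 + K / 2.0, 1.0 / b_n)
    mu_draws[s] = mu

burn = 1000
print("posterior mean of the pooled effect:", mu_draws[burn:].mean())
print("95% credible interval:", np.percentile(mu_draws[burn:], [2.5, 97.5]))
```

The posterior interval for the pooled effect is the kind of uncertainty summary the talk asks how to compute, and to trust, at scale.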
Ryan Giordano, Runjing Liu, Nelle Varoquaux, Michael I. Jordan
Measuring Cluster Stability for Bayesian Nonparametrics Using the Linear Bootstrap
Ryan Giordano*¹, Runjing Liu*¹, Nelle Varoquaux*¹, Michael I. Jordan¹, Tamara Broderick². *These authors contributed equally. ¹Department of Statistics, UC Berkeley. ²Department of EECS, MIT.

Overview
• We employ a Bayesian nonparametric model to cluster time-course gene expression data, and do inference using mean-field variational Bayes.
• To assess the clustering stability of our results, one approach is to do bootstrap sampling. However, this is computationally expensive and requires fitting new VB parameters to each simulated data set.
• Therefore, we propose a fast, automatic approximation to a full bootstrap analysis based on the infinitesimal jackknife [1]. We call this alternative bootstrap analysis the linear bootstrap.

Data
We study data from [4] wherein mice were infected with influenza virus, and gene expressions were measured at 14 time points after infection.

Results
A cold start refers to doing 10 random restarts for each bootstrap sample. A warm start refers to starting from a (high-quality) optimum found for the full data set.

Speed comparisons (times in seconds; totals are over 200 bootstrap samples):
• Initial fit (200 random restarts): 16100 per fit
• Full bootstrap (cold start): 184000 total; 931 per fit
• Full bootstrap (warm start): 10800 total; 53.4 per fit
• Hessian inverse (for the linear bootstrap): 12.7 (one-time)
• Linear bootstrap (given the Hessian inverse): 0.0284 total; 0.000145 per fit

• The linear bootstrap is orders of magnitude faster.
• The full bootstrap requires re-optimizing, while the linear approximation requires a one-time computation and factorization of the KL Hessian [2].
• The KL Hessian can be easily computed with modern auto-differentiation tools [3].
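The linear bootstrap idea can be stated for any M-estimator with data weights w: writing g_n for the gradient of datum n's loss term at the full-data optimum theta_hat and H for the Hessian of the full loss there, a first-order expansion gives theta(w) ≈ theta_hat - H^{-1} * sum_n (w_n - 1) g_n, so each bootstrap refit is replaced by a linear map of the resampling weights. The sketch below checks this against a full (closed-form) re-fit for weighted maximum likelihood of an exponential rate; the model and data are illustrative assumptions, far simpler than the poster's variational clustering model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Data: draws from an Exponential with rate 2; we estimate the rate by maximum likelihood.
x = rng.exponential(scale=0.5, size=500)
n = len(x)

# Full-data MLE and per-datum pieces of the negative log-likelihood.
rate_hat = 1.0 / x.mean()
g = x - 1.0 / rate_hat     # d/d(rate) of (-log rate + rate * x_n) at rate_hat
H = n / rate_hat**2        # Hessian of the full negative log-likelihood at rate_hat

n_boot = 2000
w = rng.multinomial(n, np.ones(n) / n, size=n_boot).astype(float)  # bootstrap weights

# Linear bootstrap: one Hessian "solve", then a matrix-vector product per sample.
rate_linear = rate_hat - ((w - 1.0) @ g) / H

# Full bootstrap: re-solve the weighted MLE for every sample (closed form here;
# in general this is a full re-optimization, which is what the poster avoids).
rate_full = w.sum(axis=1) / (w @ x)

print("full bootstrap sd:  ", rate_full.std())
print("linear bootstrap sd:", rate_linear.std())  # should be close
```

In the poster's setting the parameter is high-dimensional, so the one-time cost is computing and factorizing the KL Hessian; after that, every bootstrap sample is just a cheap linear operation, which is where the reported speedups come from.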