Ryan Tibshirani

Depts. of Statistics and Machine Learning, Carnegie Mellon University
229B Baker Hall, Pittsburgh, PA 15213
http://www.stat.cmu.edu/~ryantibs/ · [email protected] · 412.268.1884

Academic Positions

Associate Professor (Tenured), Depts. of Statistics and Machine Learning, Carnegie Mellon University. July 2016 – present

Joint Appointment, Dept. of Machine Learning, Carnegie Mellon University Oct 2013 – present

Assistant Professor, Dept. of Statistics, Carnegie Mellon University Aug 2011 – June 2016

Education

Ph.D. in Statistics, Stanford University. Thesis: “The Solution Path of the Generalized Lasso”. Advisor: Jonathan Taylor. Sept 2007 – Aug 2011

B.S. in Mathematics, Stanford University. Minor in Computer Science. Sept 2003 – June 2007

Grants, Awards, Honors

“Theoretical Foundations of Deep Learning”, Department of Defense (DoD) Multidisciplinary University Research Initiative (MURI) grant (co-PI, Rich Baraniuk is PI). May 2020 – Apr 2025

“Delphi Influenza Forecasting Center of Excellence”, Centers for Disease Control and Prevention (CDC) grant no. U01IP001121, total award amount $3,000,000 (co-PI, Roni Rosenfeld is PI). Sept 2019 – Aug 2024

“Improved Nowcasting via Adaptive Boosting of Highly Variable Biosurveillance Data Sources”, Defense Threat Reduction Agency (DTRA) grant no. HDTRA1-18-C-0008, total award amount $1,016,057 (co-PI, Roni Rosenfeld is PI). Nov 2017 – May 2020

Teaching Innovation Award from Carnegie Mellon University Apr 2017

“Locally Adaptive Nonparametric Estimation for the Modern Age — New Insights, Extensions, and Inference Tools”, National Science Foundation (NSF) Division of Mathematical Sciences (DMS) CAREER grant no. 1554123, total award amount $400,000 (PI). July 2016 – June 2021

“Graph Trend Filtering for Recommender Systems”, Adobe Digital Marketing Research Awards, total award amount $50,000 (co-PI, Alex Smola is PI). Sept 2014 – Sept 2015

“Advancing Theory and Computation in Statistical Learning Problems”, National Science Foundation (NSF) Division of Mathematical Sciences (DMS) grant no. 1309174, total award amount $150,000 (PI). July 2013 – June 2016

Yahoo! Key Scientific Challenges Winner in Statistics and Machine Learning Sept 2010

Statistics Dept. Teaching Assistant Award June 2010

National Science Foundation (NSF) VIGRE Fellowship June 2007 – Aug 2010

Phi Beta Kappa June 2007

Departmental Honors in Mathematics June 2007

Distinction from Stanford University June 2007

Professional Service

Committee Service

Steering Committee for Association for Computing Machinery-Institute of Mathematical Statistics (ACM-IMS) Foundations of Data Science Conference. 2020 – present

Institute for Pure and Applied Mathematics (IPAM) Scientific Advisory Board. 2019 – present

Steering Committee for Association for Computing Machinery-Institute of Mathematical Statistics (ACM-IMS) Interdisciplinary Summit on Foundations of Data Science. 2019

Associate Chair for Joint Statistical Meetings (JSM) 2018

Editorial Service

Associate Editor for Journal of the American Statistical Association (JASA) 2019 – present

Editor for Springer Series in the Data Sciences 2018 – present

Associate Editor for Journal of Machine Learning Research (JMLR) 2018 – present

Associate Editor for Annals of Statistics 2016 – present

Area Chair for the ML Conferences: Artificial Intelligence and Statistics (AISTATS), International Conference on Machine Learning (ICML), and Neural Information Processing Systems (NIPS); usually just one conference per year. 2014 – present

Associate Editor for Biometrika 2013 – 2016

Associate Editor for Statistical Analysis and Data Mining. 2013 – 2016

Referee Service

Referee for Annals of Statistics, Bernoulli, Journal of the Royal Statistical Society: Series B (JRSS-B), Journal of the American Statistical Association (JASA), Journal of Computational and Graphical Statistics (JCGS), Statistica Sinica, Biometrics, Statistics in Medicine, Journal of Machine Learning Research (JMLR), IEEE Transactions on Information Theory, IEEE Transactions on Pattern Analysis and Machine Intelligence, Operations Research, Proceedings of the National Academy of Sciences (PNAS). 2010 – present

Referee for Neural Information Processing Systems (NIPS) and International Conference on Machine Learning (ICML). 2013 – 2017

Panelist for National Science Foundation (NSF) Division of Mathematical Sciences (DMS) Grant Program. 2015

Referee for National Security Agency and American Mathematical Society (NSA-AMS) Grant Program. 2013

Published Articles

David Farrow, Maria Jahja, Roni Rosenfeld, and Ryan Tibshirani. “Kalman Filter, Sensor Fusion, and Constrained Regression: Equivalences and Insights”. Neural Information Processing Systems, 2019.

Rina Foygel Barber, Emmanuel Candes, Aaditya Ramdas, and Ryan Tibshirani. “Conformal Prediction Under Covariate Shift”. Neural Information Processing Systems, 2019.

Trevor Hastie, Robert Tibshirani, and Ryan Tibshirani. “Best Subset, Forward Stepwise, or Lasso? Analysis and Recommendations Based on Extensive Comparisons”. To appear, Statistical Science, 2020.

Saharon Rosset and Ryan Tibshirani. “From Fixed-X to Random-X Regression: Bias-Variance Decompositions, Covariance Penalties, and Prediction Error Estimation”. To appear, Journal of the American Statistical Association, 2020.

Veeranjaneyulu Sadhanala and Ryan Tibshirani. “Additive Models with Trend Filtering”. Annals of Statistics, Vol. 47, No. 6, 3032–3068, 2019.

Veeranjaneyulu Sadhanala, Yu-Xiang Wang, Aaditya Ramdas, and Ryan Tibshirani. “A Higher-Order Kolmogorov-Smirnov Test”. International Conference on Artificial Intelligence and Statistics, 2019.

Alnur Ali, Zico Kolter, and Ryan Tibshirani. “A Continuous-Time View of Early Stopping for Regression”. International Conference on Artificial Intelligence and Statistics, 2019.

Alnur Ali and Ryan Tibshirani. “The Generalized Lasso Problem and Uniqueness”. Electronic Journal of Statistics, Vol. 13, No. 2, 2307–2347, 2019.

Logan Brooks, David Farrow, Sangwon Hyun, Ryan Tibshirani, and Roni Rosenfeld. “Nonmechanistic Forecasts of Seasonal Influenza with Iterative One-Week-Ahead Distributions”. PLOS Computational Biology, Vol. 14, No. 6, 1–29, 2018.

Ryan Tibshirani and Saharon Rosset. “Excess Optimism: How Biased is the Apparent Error of an Estimator Tuned by SURE?”. Journal of the American Statistical Association, Vol. 114, No. 526, 697–712, 2019.

Oscar Hernan Madrid Padilla, James Sharpnack, James Scott, and Ryan Tibshirani. “The DFS Fused Lasso: Linear-Time Denoising over General Graphs”. Journal of Machine Learning Research, Vol. 18, No. 176, 1–36, 2018.

Jing Lei, Max G’Sell, Alessandro Rinaldo, Ryan Tibshirani, and Larry Wasserman. “Distribution-Free Predictive Inference for Regression”. Journal of the American Statistical Association, Vol. 113, No. 523, 1094–1111, 2018.

Sangwon Hyun, Max G’Sell, and Ryan Tibshirani. “Exact Post-Selection Inference for the Generalized Lasso Path”. Electronic Journal of Statistics, Vol. 12, 1053–1097, 2018.

Ryan Tibshirani, Alessandro Rinaldo, Robert Tibshirani, and Larry Wasserman. “Uniform Asymptotic Inference and the Bootstrap After Model Selection”. Annals of Statistics, Vol. 46, No. 3, 1255–1287, 2018.

Veeranjaneyulu Sadhanala, Yu-Xiang Wang, James Sharpnack, and Ryan Tibshirani. “Higher-Order Total Variation Classes on Grids: Minimax Theory and Trend Filtering Methods”. Neural Information Processing Systems, 2017.

Kevin Lin, James Sharpnack, Alessandro Rinaldo, and Ryan Tibshirani. “A Sharp Error Analysis for the Fused Lasso, with Application to Approximate Changepoint Screening”. Neural Information Processing Systems, 2017.

Ryan Tibshirani. “Dykstra’s Algorithm, ADMM, and Coordinate Descent: Connections, Insights, and Extensions”. Neural Information Processing Systems, 2017.

David Farrow, Logan Brooks, Sangwon Hyun, Ryan Tibshirani, Donald Burke, and Roni Rosenfeld. “A Human Judgment Approach to Epidemiological Forecasting”. PLOS Computational Biology, Vol. 13, No. 3, 1–19, 2017.

Alnur Ali, Zico Kolter, and Ryan Tibshirani. “The Multiple Quantile Graphical Model”. Advances in Neural Information Processing Systems, 2016.

Veeranjaneyulu Sadhanala, Yu-Xiang Wang, and Ryan Tibshirani. “Total Variation Classes Beyond 1d: Minimax Rates, and the Limitations of Linear Smoothers”. Advances in Neural Information Processing Systems, 2016.

Veeranjaneyulu Sadhanala, Yu-Xiang Wang, Alex Smola, and Ryan Tibshirani. “Graph Sparsification Approaches for Laplacian Smoothing”. International Conference on Artificial Intelligence and Statistics, 2016.

Yu-Xiang Wang, James Sharpnack, Alex Smola, and Ryan Tibshirani. “Trend Filtering on Graphs”. Journal of Machine Learning Research, Vol. 17, 1–41, 2016.

Ryan Tibshirani, Jonathan Taylor, Richard Lockhart, and Robert Tibshirani. “Exact Post-selection Inference for Sequential Regression Procedures”. Journal of the American Statistical Association, Vol. 111, No. 514, 600–620, 2016.

Jonathan Taylor, Joshua Loftus, and Ryan Tibshirani. “Inference in Adaptive Regression via the Kac-Rice Formula”. Annals of Statistics, Vol. 44, No. 2, 743–770, 2016.

Yen-Chi Chen, Christopher Genovese, Ryan Tibshirani, and Larry Wasserman. “Nonparametric Modal Regression”. Annals of Statistics, Vol. 44, No. 2, 489–514, 2016.

Aaditya Ramdas and Ryan Tibshirani. “Fast and Flexible ADMM Algorithms for Trend Filtering”. Journal of Computational and Graphical Statistics, Vol. 25, No. 3, 839–858, 2016.

Taylor Arnold and Ryan Tibshirani. “Efficient Implementations of the Generalized Lasso Dual Path Algorithm”. Journal of Computational and Graphical Statistics, Vol. 25, No. 1, 1–27, 2016.

Ryan Tibshirani. “A General Framework for Fast Stagewise Algorithms”. Journal of Machine Learning Research, Vol. 16, 2543–2588, 2015.

Logan Brooks, David Farrow, Sangwon Hyun, Ryan Tibshirani, and Roni Rosenfeld. “Flexible Modeling of Epidemics with an Empirical Bayes Framework”. PLOS Computational Biology, Vol. 11, No. 8, 1–18, 2015.

Yu-Xiang Wang, James Sharpnack, Alex Smola, and Ryan Tibshirani. “Trend Filtering on Graphs”. International Conference on Artificial Intelligence and Statistics, 2015.

Ryan Tibshirani. “Degrees of Freedom and Model Search”. Statistica Sinica, Vol. 25, No. 3, 1265–1296, 2015.

Yu-Xiang Wang, Alex Smola, and Ryan Tibshirani. “The Falling Factorial Basis and Its Statistical Applications”. International Conference on Machine Learning, 2014.

Richard Lockhart, Jonathan Taylor, Ryan Tibshirani, and Robert Tibshirani. “A Significance Test for the Lasso”. Annals of Statistics, Vol. 42, No. 2, 413–468, 2014.

Ryan Tibshirani. “Adaptive Piecewise Polynomial Estimation via Trend Filtering”. Annals of Statistics, Vol. 42, No. 1, 285–323, 2014.

Ryan Tibshirani. “The Lasso Problem and Uniqueness”. Electronic Journal of Statistics, Vol. 7, 1456–1490, 2013.

Ryan Tibshirani and Jonathan Taylor. “Degrees of Freedom in Lasso Problems”. Annals of Statistics, Vol. 40, No. 2, 1198–1232, 2012.

Robert Tibshirani, Jacob Bien, Jerome Friedman, Trevor Hastie, Noah Simon, Jonathan Taylor, and Ryan Tibshirani. “Strong Rules for Discarding Predictors in Lasso-Type Problems”. Journal of the Royal Statistical Society: Series B, Vol. 74, No. 2, 1–22, 2012.

Ryan Tibshirani and Jonathan Taylor. “The Solution Path of the Generalized Lasso”. Annals of Statistics, Vol. 39, No. 3, 1335–1371, 2011.

Ryan Tibshirani, Holger Hoefling, and Robert Tibshirani. “Nearly-Isotonic Regression”. Technometrics, Vol. 53, No. 1, 54–61, 2011.

Ryan Tibshirani. “Don’t Try for the Triple 20: Where to Aim if You Are Bad at Darts”. Significance magazine, Vol. 8, No. 1, 46–48, 2011.

Ryan Tibshirani, Andrew Price, and Jonathan Taylor. “A Statistician Plays Darts”. Journal of the Royal Statistical Society: Series A, Vol. 174, No. 1, 213–226, 2011.

Ryan Tibshirani and Robert Tibshirani. “A Bias Correction for the Minimum Error Rate in Cross-Validation”. Annals of Applied Statistics, Vol. 3, No. 1, 822–829, 2009.

Unpublished Articles

Ryan Tibshirani. “Divided Differences, Falling Factorials, and Discrete Splines: Another Look at Trend Filtering and Related Problems”. Technical report, 2020.

Guo Yu, Jacob Bien, and Ryan Tibshirani. “Reluctant Interaction Modeling”. Submitted, 2019.

Alden Green, Sivaraman Balakrishnan, and Ryan Tibshirani. “Local Spectral Clustering Recovers Density Clusters”. Submitted, 2019.

Trevor Hastie, Andrea Montanari, Saharon Rosset, and Ryan Tibshirani. “Surprises in High-Dimensional Ridgeless Least Squares Interpolation”. Submitted, 2019.

Rina Foygel Barber, Emmanuel Candes, Aaditya Ramdas, and Ryan Tibshirani. “Predictive Inference with the Jackknife+”. Submitted, 2019.

Rina Foygel Barber, Emmanuel Candes, Aaditya Ramdas, and Ryan Tibshirani. “The Limits of Distribution-Free Conditional Predictive Inference”. Submitted, 2019.

Will Fithian, Jonathan Taylor, Robert Tibshirani, and Ryan Tibshirani. “Selective Sequential Model Selection”. Technical report, 2015.

Ryan Tibshirani. “Fast Computation of the Median by Successive Binning”. Technical report, 2008.

Software

R package bestsubset: Best subset selection and related methods. Available at https://github.com/ryantibs/best-subset.

R package conformalInference: Tools for conformal inference. Available at https://github.com/ryantibs/conformalInference.

R package selectiveInference: Tools for inference after model selection. Available at https://cran.r-project.org/web/packages/selectiveInference.

R package (and standalone C package) glmgen: Fast algorithms for generalized lasso problems. Available at https://github.com/statsmaths/glmgen.

R package genlasso: Path algorithms for generalized lasso problems. Available at http://www.cran.r-project.org/packages/genlasso.

R package darts: Statistical tools to analyze your darts game. Available at http://www.cran.r-project.org/packages/darts.

Talks

“Trend Filtering, From Univariate to Graphs, and Old to New”. Invited talk, Dept. of Statistics, University of California at Berkeley. Feb 2020

“Divided Differences, Falling Factorials, and Discrete Splines: Another Look at Trend Filtering and Related Problems”. Invited talk, Workshop on “Statistics Meets Machine Learning”, Oberwolfach. Jan 2020

“Surprises in High-Dimensional Ridgeless Least Squares Interpolation”. Invited talk, Math and Data Seminar hosted by the Center for Data Science and the Courant Institute, New York University. Oct 2019

“Surprises in High-Dimensional Ridgeless Least Squares Interpolation”. Invited talk, Statistics Seminar within the Dept. of Mathematics, EPFL. Sept 2019

“Surprises in High-Dimensional Ridgeless Least Squares Interpolation”. Invited talk, Workshop on Higher-Order Asymptotics and Post-Selection Inference, Washington University. Aug 2019

“Trend Filtering on Grids”. Invited talk, Joint Statistical Meetings (JSM), Denver. Aug 2019

“A Continuous-Time View of Early Stopping for Least Squares”. Invited talk, Symposium on Data Science and Statistics (SDSS), Seattle. May 2019

“Advances and Challenges in Conformal Inference”. Invited talk, Statistics Seminar within the School of Industrial and Systems Engineering, Georgia Tech. Apr 2019

“Discrete Derivatives, Total Variation, and Trend Filtering”. Invited talk, Statistics Laboratory within the Dept. of Mathematics, ETH Zurich. Oct 2018

“LOCO: The Good, the Bad, and the Ugly (or: How I Learned to Stop Worrying and Love Prediction)”. Invited talk, Workshop on Higher-Order Asymptotics and Post-Selection Inference, Washington University. Sept 2018

“Penalization Versus Segmentation Methods in Changepoint Problems”. Invited talk, Joint Statistical Meetings (JSM), Vancouver. Aug 2018

“Advances and Challenges in Conformal Inference”. Invited talk, Workshop on Model Selection, Regularization, and Inference, University of Vienna. July 2018

“Dykstra’s Algorithm, ADMM, and Coordinate Descent: Connections, Insights, and Extensions”. Invited talk, Conference on Data Science, Statistics and Visualisation (DSSV), TU Wien. July 2018

“Dykstra’s Algorithm, ADMM, and Coordinate Descent: Connections, Insights, and Extensions”. Invited talk, Newton Institute Workshop on Future Challenges in Statistical Scalability, Cambridge University. June 2018

“Dykstra’s Algorithm, ADMM, and Coordinate Descent: Connections, Insights, and Extensions”. Invited talk, Dept. of Electrical Engineering, Stanford University. Oct 2017

“Excess Optimism: How Biased is the Apparent Error of an Estimator Tuned by SURE?”. Invited talk, Dept. of Statistics, Stanford University. Oct 2017

“Excess Optimism: How Biased is the Apparent Error of an Estimator Tuned by SURE?”. Invited talk, Dept. of Statistics within the Wharton School of Business, University of Pennsylvania. Oct 2017

“Excess Optimism: How Biased is the Apparent Error of an Estimator Tuned by SURE?”. Invited talk, Dept. of Statistics, University of Chicago. Sept 2017

“Total Variation Classes Beyond 1d: Minimax Rates, and the Limitations of Linear Smoothers”. Invited talk, Joint Statistical Meetings (JSM), Baltimore. Aug 2017

“Total Variation Denoising on Trees, Grids, and Graphs”. Invited talk, Statistical Foundations Workshop on Uncertainty Quantification for Inverse Problems, Cambridge University. June 2017

“Recent Advances in Trend Filtering”. Invited talk, Pre-World Congress Meeting of New Researchers in Statistics and Probability, Toronto. July 2016

“Recent Advances in Selective Inference”. Invited talk, Statistical Society of Canada (SSC) Annual Meeting, Niagara Falls. May 2016

“Beyond 1d: Recent Advances and Challenges in Trend Filtering”. Invited talk, Dept. of Statistics, Yale University. Apr 2016

“Beyond 1d: Recent Advances and Challenges in Trend Filtering”. Invited talk, Institute for Data, Systems, and Society, Massachusetts Institute of Technology. Apr 2016

“Recent Advances in Selective Inference”. Invited talk, Dept. of Statistics, Harvard University. Apr 2016

“Beyond 1d: Recent Advances and Challenges in Trend Filtering”. Invited talk, Econometrics Group within the Dept. of Economics, Columbia University. Apr 2016

“Beyond 1d: Recent Advances and Challenges in Trend Filtering”. Invited talk, Dept. of Statistics within the Wharton School of Business, University of Pennsylvania. Apr 2016

“Recent Advances in Selective Inference”. Invited talk, Statistics Group within the Dept. of Operations Research and Financial Engineering, Princeton University. Apr 2016

“Trend Filtering: Recent Advances and Challenges”. Invited talk, Statistics Laboratory within the Institute for Applied Mathematics, University of Heidelberg. Mar 2016

“Trend Filtering: Recent Advances and Challenges”. Invited talk, Statistics Laboratory within the Dept. of Mathematics, ETH Zurich. Mar 2016

“Trend Filtering: Recent Advances and Challenges”. Invited talk, Workshop on High-dimensional Methods in Econometrics and Statistics, Warren Center at the University of Pennsylvania. Oct 2015

“A General Framework for Fast Stagewise Algorithms”. Invited talk, International Symposium on Mathematical Programming, Pittsburgh. July 2015

“Efficient Implementations of the Generalized Lasso Dual Path Algorithm”. Invited talk, Interface Meeting, Morgantown. June 2015

“Exact Post-Selection Inference Using the Polyhedral Lemma”. Invited talk, American Institute of Mathematics (AIM) Workshop on High-dimensional Inference, San Jose. Jan 2015

“Adaptive Piecewise Polynomial Estimation via Trend Filtering”. Invited talk, Neural Information Processing Systems (NIPS) Workshop on Modern Nonparametrics, Montreal. Dec 2014

“A General Framework for Fast Stagewise Algorithms”. Invited talk, Computational Statistics Seminar Series, University of Chicago. Dec 2014

“Adaptive Piecewise Polynomial Estimation via Trend Filtering”. Invited talk, Econometrics and Statistics Group within the Booth School of Business, University of Chicago. Dec 2014

“A General Framework for Fast Stagewise Algorithms”. Invited talk, Google Machine Learning Seminar Series, New York. Nov 2014

“Adaptive Piecewise Polynomial Estimation via Trend Filtering”. Invited talk, RAND Statistics Seminar Series, Pittsburgh. Aug 2014

“Trend Filtering on Graphs”. Invited talk, Institute of Mathematical Statistics (IMS) Annual Meeting, Sydney. July 2014

“Adaptive Piecewise Polynomial Estimation via Trend Filtering”. Invited talk, Research School of Finance, Actuarial Studies & Applied Statistics, Australian National University. July 2014

“Adaptive Piecewise Polynomial Estimation via Trend Filtering”. Invited talk, Statistical Laboratory, Cambridge University. May 2014

“Adaptive Piecewise Polynomial Estimation via Trend Filtering”. Invited talk, Dept. of Statistics, London School of Economics. May 2014

“A General Framework for Fast Stagewise Algorithms”. Invited talk, Yahoo! Machine Learning Seminar, University of Washington. May 2014

“Adaptive Piecewise Polynomial Estimation via Trend Filtering”. Invited talk, Dept. of Statistics, University of Washington. May 2014

“Adaptive Piecewise Polynomial Estimation via Trend Filtering”. Invited talk, CRM-ISM-GERAD Statistics Colloquium, HEC Montreal. Apr 2014

“Adaptive Piecewise Polynomial Estimation via Trend Filtering”. Invited talk, Dept. of Statistical Sciences, Cornell University. Nov 2013

“Trevor Hastie is Almost Always Right”. Invited talk, workshop in honor of Trevor Hastie for his 60th birthday, Manhattan Beach. Nov 2013

“Fast Stagewise Algorithms for Approximate Regularization Paths”. Contributed talk, Joint Statistical Meetings (JSM), Montreal. Aug 2013

“Fast Stagewise Algorithms for Approximate Regularization Paths”. Invited talk, International Chinese Statistical Association (ICSA) and International Society for Biopharmaceutical Statistics (ISBS) Joint Conference, Bethesda. June 2013

“Fast Stagewise Algorithms for Approximate Regularization Paths”. Invited talk, Statistical Society of Canada (SSC) Annual Meeting, Edmonton. May 2013

“Trend Filtering”. Invited talk, Dept. of Information, Operations & Management Sciences, New York University. Nov 2012

“Trend Filtering”. Invited talk, Dept. of Biostatistics, University of Pittsburgh. Sept 2012

“Degrees of Freedom in Lasso Problems”. Invited talk, International Chinese Statistical Association (ICSA) Applied Statistics Symposium, Boston. June 2012

“Degrees of Freedom in Statistical Machine Learning”. Invited talk, Machine Learning Lunch Seminar, Carnegie Mellon University. May 2012

“Convex Geometry and Degrees of Freedom”. Invited talk, Workshop on Large Scale Statistical Inference, University of Minnesota. Apr 2012

“The Solution Path of the Generalized Lasso”. Invited talk, Dept. of Operations Research and Financial Engineering, Princeton University. Feb 2011

“The Solution Path of the Generalized Lasso”. Invited talk, Dept. of Statistics, Carnegie Mellon University. Feb 2011

“The Solution Path of the Generalized Lasso”. Invited talk, Dept. of Statistics and Biostatistics, Rutgers University. Feb 2011

“The Solution Path of the Generalized Lasso”. Invited talk, Dept. of Statistics, Cornell University. Feb 2011

“The Solution Path of the Generalized Lasso”. Invited talk, Dept. of Statistics, Columbia University. Feb 2011

“The Solution Path of the Generalized Lasso”. Invited talk, Dept. of Statistics, University of Chicago. Feb 2011

“The Solution Path of the Generalized Lasso”. Invited talk, Dept. of Statistics and Actuarial Science. Feb 2011

“The Solution Path of the Generalized Lasso”. Invited talk, Dept. of Statistics, University of California at Irvine. Feb 2011

“The Solution Path of the Generalized Lasso”. Invited talk, Dept. of Information and Operations Management, University of Southern California. Jan 2011

“A Statistician Plays Darts”. Contributed talk, Royal Statistical Society (RSS) Annual Conference, Brighton. Sept 2010

“Regularization Paths for Least Squares Problems with Generalized ℓ1 Penalties”. Contributed talk, Western North American Region (WNAR) Annual Conference, Seattle. June 2010

“Regularization Paths for Least Squares Problems with Generalized ℓ1 Penalties”. Invited talk, Yahoo! Research, Santa Clara. June 2010

“A Statistician Plays Darts”. Invited talk, Dept. of Statistics, Stanford University. Mar 2010

“A Statistician Plays Darts”. Invited talk, Molecular Profiling Colloquium, Stanford University. July 2009

“Automatic Gating Tools for the Analysis of Flow Cytometry Data”. Contributed talk, Biomedical Computation at Stanford (BCATS) Conference, Stanford University. Oct 2006

Teaching

Instructor, Carnegie Mellon University

36-350: Statistical Computing Fall 2019

10-725: Convex Optimization Fall 2019

36-350: Statistical Computing Fall 2018

10-725: Convex Optimization Fall 2018

36-350: Statistical Computing Spring 2018

36-702/10-702: Statistical Machine Learning (with Larry Wasserman) Spring 2017

36-350: Statistical Computing Fall 2016

36-725/10-725: Convex Optimization (with Javier Pena) Fall 2016

36-350: Statistical Computing Fall 2015

36-725/10-725: Convex Optimization Fall 2015

36-702/10-702: Statistical Machine Learning (with Larry Wasserman) Spring 2015

36-725/10-725: Convex Optimization Spring 2015

36-825: Statistics Journal Club (with Rob Tibshirani) Fall 2014

36-402: Advanced Methods for Data Analysis Spring 2014

36-702/10-702: Statistical Machine Learning (with Larry Wasserman) Spring 2014

36-725/10-725: Convex Optimization (with Barnabas Poczos) Fall 2013

36-462: Data Mining Spring 2013

36-725/10-725: Optimization (with Geoff Gordon) Fall 2012

36-462: Data Mining Spring 2012

36-401: Modern Regression (with Rebecca Nugent) Fall 2011

36-825: Statistics Journal Club (with Rob Kass) Fall 2011

Teaching Assistant, Stanford University

Statistics 300C: Theory of Statistics Spring 2011

Statistics 110: Statistical Methods in Engineering and the Physical Sciences Fall 2010

Statistics 300D: Advanced Topics in Statistics Summer 2010

Statistics 60: Introduction to Statistical Methods Fall 2009

Mentor for the VIGRE Undergraduate Research Program Summer 2009

Statistics 47N: Breaking the Code? Fall 2008

Statistics 252: Data Mining and Electronic Business Spring 2008
