Artificial Intelligence for Robust Engineering & Science
January 22–24, 2020

Organizing Committee

General Chair: David Womble, Program Director, Artificial Intelligence, Oak Ridge National Laboratory
Logistics Chair: Christy Hembree, Project Support, Artificial Intelligence, Oak Ridge National Laboratory

Committee Members:
Jacob Hinkle, Research Scientist, Computational Sciences and Engineering, Oak Ridge National Laboratory
Justin Newcomer, Manager, Machine Intelligence, Sandia National Laboratories
Frank Liu, Distinguished R&D Staff, Computational Sciences and Mathematics, Oak Ridge National Laboratory
Clayton Webster, Distinguished Professor, Department of Mathematics, University of Tennessee–Knoxville

Robust engineering is the process of designing, building, and controlling systems to avoid or mitigate failures, and everything fails eventually. This workshop will examine the use of artificial intelligence and machine learning to predict failures and to apply this capability in the maintenance and operation of robust systems. The workshop comprises four sessions examining the technical foundations of artificial intelligence and machine learning to:
1) Examine operational data for failure indicators
2) Understand the causes of potential failures
3) Deploy these systems “at the edge” with real-time inference and continuous learning
4) Incorporate these capabilities into robust system design and operation
The workshop will also include an additional session open to attendees for “flash” presentations that address the conference theme.

AGENDA

Wednesday, January 22, 2020

7:30–8:00 a.m. │ Badging, Registration, and Breakfast
8:00–8:45 a.m. │ Welcome and Introduction
Jeff Nichols, Oak Ridge National Laboratory
David Womble, Oak Ridge National Laboratory
8:45–9:30 a.m. │ Keynote Presentation: Dave Brooks, General Motors Company
AI for Automotive Engineering
9:30–10:00 a.m.
│ Group Photo and Break

Session 1: Finding the right needles in noisy haystacks
Session Chair: Frank Liu, Oak Ridge National Laboratory

Identifying early failure indicators from noisy sensor data is a crucial step in ensuring the robustness and resilience of complex engineering systems. The same methodology can also be applied to identify critical transition points in natural phenomena. Traditionally, time-series data analysis methods have been the workhorse. The objective of this session is not only to identify appropriate engineering and data-science solutions, but also to discuss fundamental science questions, such as the observability of highly nonlinear systems from noisy and sparse data.
• Can machine learning methods be used effectively to indicate or predict failure?
• Can machine learning fundamentally transform the research and practice of early failure detection?

10:00–10:10 a.m. │ Session Introduction
10:10–10:50 a.m. │ Speaker 1–1: Siva Rajamanickam, Sandia National Laboratories
Machine Learning in the Presence of Noise: Early Experiments
10:50–11:20 a.m. │ Speaker 1–2: Peng Li, University of California–Santa Barbara
Data-Efficient Robust Anomaly Detection: A Machine Learning Approach
11:20–11:50 a.m. │ Speaker 1–3: Kody Law, University of Manchester
Data-Centric (AI for) Science and Engineering in the UK
11:50–12:20 p.m. │ Speaker 1–4: Helen Li, Duke University
Machine Learning in Modern Wafer Inspection and Chip Design
12:20–12:30 p.m. │ Session Wrap Up
12:30–1:30 p.m. │ Working Lunch with Breakout Discussions

Session 2: Skip the search – from finding needles to understanding needles
Session Chair: Justin Newcomer, Sandia National Laboratories

Today, robust engineering of complex systems requires significant investments in design, production, and lifecycle monitoring.
Extensive testing is conducted over the life of each system to uncover performance issues, degradations, and precursors to failure: a search for a finite number of needles in an exponentially growing stack of hay. Identification of anomalies often leads to follow-on investigations to determine the performance impact, root cause, and extent of condition. These reactive investigations are time- and labor-intensive but are essential for credible decision making. Can AI provide a deep causal understanding of complex systems? Can AI guide the collection of the right data at the right time? If so, it will lead to revolutionary advancements such as rapid autonomous failure analyses, learned physical models and design processes that mitigate or eliminate failure mechanisms, anticipation of emergent behaviors not traceable to design decisions or requirements, and real-time trusted decision support. This session will explore the advancements needed in causal inference, planning under uncertainty, model credibility, reinforcement learning, and human-computer interaction to disrupt an engineering process currently reliant on an endless search for needles.
• Can artificial intelligence provide a deep causal understanding of complex systems?
• Can artificial intelligence guide the collection of the right data at the right time?

1:30–1:40 p.m. │ Session Introduction
1:40–2:10 p.m. │ Speaker 2–1: Laura McNamara, Sandia National Laboratories
Adoption Challenges in Artificial Intelligence and Machine Learning: Why Technology Acceptance Is So Hard (and What We Can Do about It)
2:10–2:40 p.m. │ Speaker 2–2: Chuck Farrar, Los Alamos National Laboratory
Machine Learning Approaches to Structural Health Monitoring Data Normalization
2:40–3:10 p.m. │ Speaker 2–3: Eli Sherman, Johns Hopkins University
Formal Methods for Addressing Data Complications
3:10–3:50 p.m. │ Networking Break
3:50–4:20 p.m.
│ Speaker 2–4: Aurora Schmidt, Johns Hopkins University Applied Physics Laboratory
A Case Study in Safety Constraints to Machine Learning-Based Controllers
4:20–4:30 p.m. │ Session Wrap Up
4:30–5:00 p.m. │ Day 1 Wrap Up
5:00–7:00 p.m. │ Welcome Reception

Thursday, January 23, 2020

7:30–8:15 a.m. │ Breakfast
8:15–8:20 a.m. │ Day 2 Introduction

Session 3: Running in the wild – forget the past and do it fast with online machine learning
Session Chair: Clayton Webster, University of Tennessee–Knoxville

Almost all physical systems are instrumented, and data is being generated in huge quantities. Predicting failure in large-scale engineered systems requires exploiting such data with tools designed to handle its high volume, velocity, and variety. Data is not static but typically arrives as a stream of sequentially ordered samples. Because of the high volume, data can be used only once, in situ, and cannot be saved for later learning. The objective of this session is to explore machine learning techniques focused on continuous learning with a limited amount of memory in a limited amount of time, while retaining the ability to make predictions at any point in time.
• How can machine learning enable continuous, dynamic, and short-term learning and prediction as an effective strategy for operating in very fast and dynamic environments?

8:20–8:30 a.m. │ Session Introduction
8:30–9:00 a.m. │ Speaker 3–1: Wilkins Aquino, Duke University
Model-Based Learning of Advection-Diffusion Transport using Mobile Robots
9:00–9:30 a.m. │ Speaker 3–2: Abhinav Saxena, GE Research – AI & Learning Systems
AI Spectrum for Predictive Maintenance
9:30–10:00 a.m. │ Networking Break
10:00–10:30 a.m. │ Speaker 3–3: Nagi Rao, Oak Ridge National Laboratory
Practice of Machine Learning Theory: Case Studies from Nuclear Reactors and Computing Infrastructures
10:30–11:00 a.m.
│ Speaker 3–4: Mingzhou Jin, University of Tennessee–Knoxville
Geometrical Defect Detection for Additive Manufacturing with Machine Learning Models
11:00–11:10 a.m. │ Session Wrap Up

Session 4: Flash Speaker Presentations
Session Chairs: Danny Dunlavy & David Stracuzzi, Sandia National Laboratories

11:10–11:15 a.m. │ Session Introduction
11:15–11:30 a.m. │ Speaker 4–1: Michelle Quirk, DOE/NNSA
AI-Complete Problems
11:30–11:45 a.m. │ Speaker 4–2: Warren Davis, Sandia National Laboratories
In-Situ Anomaly Detection for Intelligent Data Capture in HPC Simulations
11:45–12:00 p.m. │ Speaker 4–3: Iris Bahar, Brown University
A Simulation Framework for Capturing Thermal Noise-Induced Failures in Low-Voltage CMOS SRAM
12:00–1:00 p.m. │ Working Lunch with Presentation by Dave Keim, Oak Ridge National Laboratory
The History of ORNL
1:00–1:15 p.m. │ Speaker 4–4: Shawn Sheng, National Renewable Energy Laboratory
SCADA Data Modeling for Wind Turbine Gearbox Failure Detection using ML and Big Data Technologies
1:15–1:30 p.m. │ Speaker 4–5: Robert Patton, Oak Ridge National Laboratory
Artificial Intelligence for Autonomous Vehicles
1:30–1:45 p.m. │ Speaker 4–6: Ahmedullah Aziz, University of Tennessee–Knoxville
Reliability Concerns in Emerging Neuromorphic Hardware
1:45–2:00 p.m. │ Speaker 4–7: Emily Donahue, Sandia National Laboratories
Identifying Defects in CT Scans without Labelled Data
2:00–2:15 p.m. │ Speaker 4–8: David Mascarenas, National Security Engineering Center
Video-Based, High Resolution, High Sensitivity Structural Health Monitoring
2:15–2:30 p.m. │ Speaker 4–9: Steve Sun, Columbia University
Non-cooperative Game for Learning from Non-Euclidean Microstructural Data for Computational Solid Mechanics
2:30–3:00 p.m. │ Networking Break
3:00–3:15 p.m. │ Speaker 4–10: John Lindberg, Electric Power Research Institute
Data Science in the Nuclear Industry
3:15–3:30 p.m.
│ Speaker 4–11: Minsik Cho, IBM
SNOW: Subscribing to Knowledge via Channel Pooling for Transfer & Lifelong/Continual Learning
3:30–3:45 p.m. │ Speaker 4–12: Draguna Vrabie, Pacific Northwest National Laboratory
Learning and Deception – Robust Control
3:45–4:00 p.m. │ Speaker 4–13: Vivek Sarkar, Georgia Institute of Technology
Using AI to Improve Robustness and Productivity of Engineering & Science Software
4:00–4:15 p.m. │ Speaker 4–14: Rick Archibald, Oak Ridge National Laboratory
Machine Learning for Scientific Data
4:15–4:30 p.m. │ Speaker 4–15: Geoffrey Fox, Indiana University
Deep Learning Enhanced Simulation
4:30–4:45 p.m. │ Speaker 4–16: Mariam Kiran, Lawrence Berkeley National Laboratory
Using AI for ESnet, the High-Performance Science Network
4:45–5:00 p.m.