FEATURE

The edge of cloud
A discussion with Mahadev Satyanarayanan, the Carnegie Group University professor of computer science

Diana Kearns-Manolatos, Myke Miller, David Linthicum
THE DELOITTE CENTER FOR INTEGRATED RESEARCH

Five cloud-edge trends and how organizations can stay ahead of them

In November 2020, the Deloitte Cloud Institute invited Mahadev Satyanarayanan, the Carnegie Group University Professor of Computer Science at Carnegie Mellon University, as the first Deloitte Cloud Institute Fellow.1 Dr. Satyanarayanan (Satya, as he's popularly known) has made pioneering contributions to advance edge computing, mobile computing, distributed systems, and Internet of Things (IoT) research focused on performance, scalability, availability, and trust challenges in computing systems.

We examine the present and future of distributed edge-cloud architectures in an exclusive discussion with Satya; David Linthicum, chief cloud strategy officer, Deloitte Consulting LLP; and Myke Miller, dean, Deloitte Cloud Institute. Diana Kearns-Manolatos from the Deloitte Center for Integrated Research moderated this discussion. But first, a quick foundation on edge computing and five key insights from the discussion.

On the edge

In 2009, Satya's research introduced the seminal concept of "cloudlets"2 (a "cloud" close by the device). His 2017 article, "The emergence of edge computing,"3 expounded on the idea and argued that mobile and proximity computing devices (e.g., IoT) demand dispersed computing and a new "edge computing" paradigm—cloudlets could be placed at the internet's edge for high-speed computing, scalable edge analytics, privacy mediation, and failover in the case of a cloud outage. Over the past decade, his research, industry product developments, and business use case implementations4 have adopted and adapted the cloudlet into what we have come to understand as modern-day "edge computing."

More recently, Satya authored "The computing landscape of the 21st century,"5 in which he introduced a model to organize the distributed computing universe into four segments: (1) cloud computing, (2) mobility and sensing devices, (3) small, dispersed data centers (i.e., cloudlets), and (4) continuously reporting batteryless sensing platforms.

Five cloud-edge knowledge bytes

Our discussion with Satya delivers insights on five cloud-edge trends and how to stay ahead of them.

• Evolution: Boundaries are blurring across the four technical computing architecture tiers (mobility and sensing devices, edge, cloud, and continuously reporting sensing platforms) as cloud and edge technologies evolve. Organizations may need configuration management and orchestration systems to manage and move across tiers automatically and dynamically.

• The intelligent edge: Companies are increasingly integrating artificial intelligence (AI) and machine learning (ML) into edge use cases, creating new capital and data orchestration requirements. A well-architected four-tier computing architecture model (e.g., mobile, edge, cloud, platforms) can unlock the potential for enhanced data security/privacy/trust, improved latency, and expanded bandwidth.

• Implementation challenges and solutions: Smart factories, utilities management, and connected vehicles demonstrate the potential and complexity of mobile, edge, cloud, and continuously reporting sensing platforms. Organizations can partition data and workloads across architecture tiers (see the illustrative sketch after this list). This should be done based on current functional data/workload needs, with the flexibility to adjust to future functional requirements and dynamic resource allocation.

• Cross-tier optimization: A strong edge business case considers ecosystem computing needs to determine what belongs on premise, on mobile, on the edge, in the cloud, and across sensing platforms. As such, organizations will pay a premium for proximity, latency, bandwidth, performance, scalability, and other technical benefits of edge computing. However, one must factor in human costs to manage the solution while calculating the break-even point.

• Innovation: Telecommunications and technology organizations continue to introduce new cloud, edge, and networking solutions while businesses explore edge-native applications. This innovation further blurs the boundaries across the four computing architecture tiers. Harness the innovation potential with business cases that require proximity, bandwidth, and latency around a mobile/platform sensing ecosystem.
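The partitioning idea in the Evolution and Implementation bullets can be made concrete with a small placement policy. The following is an illustrative sketch only, not something discussed in the interview: the tier names, latency budgets, data-rate thresholds, and sample workloads are assumptions chosen for the example, and a real configuration management or orchestration system would revisit these decisions dynamically.

```python
# Illustrative only: a toy policy for partitioning workloads across
# hypothetical device, cloudlet (edge), and cloud tiers.
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    max_latency_ms: float     # end-to-end latency the workload can tolerate
    data_rate_mbps: float     # data volume it continuously produces
    needs_global_state: bool  # e.g., fleet-wide analytics or model training


def place(w: Workload) -> str:
    """Pick a tier for a workload; rerun as requirements or hardware change."""
    if w.needs_global_state:
        return "cloud"      # aggregation across many sites belongs in the cloud
    if w.max_latency_ms < 10:
        return "device"     # only on-device compute meets the tightest budgets
    if w.max_latency_ms < 50 or w.data_rate_mbps > 100:
        return "cloudlet"   # nearby edge site: low latency, less backhaul
    return "cloud"          # latency-tolerant, modest data volume


if __name__ == "__main__":
    examples = [
        Workload("safety-interlock", 5, 0.1, needs_global_state=False),
        Workload("video-analytics", 40, 400, needs_global_state=False),
        Workload("fleet-reporting", 5000, 1, needs_global_state=True),
    ]
    for w in examples:
        print(f"{w.name:>16} -> {place(w)}")
```

The point is the one the Evolution bullet makes: the partition is a policy to revisit as requirements and hardware change, not an architecture to freeze.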
The following is an edited and condensed transcript of the discussion with Satya, David, and Myke on January 19, 2021, that was aligned to the five cloud-edge themes.

Evolution: To the edge and back

Diana Kearns-Manolatos: As a technology pioneer and innovator who has been at the forefront of distributed computing for the last three decades, what are you seeing next for the edge of cloud?

Satya: We're roughly a decade from when cloud computing started. Then, the cloud was about driving economies of scale to their limit, and we largely achieved that vision with gigantic data centers concentrated in remote places. However, fundamental challenges involving the speed of transmission, round-trip times, and bandwidth remain.

The time is ripe for the next inflection point—could we get the kind of compute that cloud computing is able to give but deliver it without those limitations of latency and bandwidth? That's what edge computing is about. Once you think about it this way, it's clear why it's the next step in the evolution.

The next chapter in computing is going to be about the creation of this edge computing infrastructure worldwide to enable brand-new, edge-native applications you simply couldn't create if you only had to rely on the cloud. What is going to be very exciting is how new edge-native applications are going to bring productivity enhancements to industries that have not yet been transformed. For example, there's a whole class of industrial troubleshooting applications, which we can talk more about later.
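An editorial aside to put the latency point in rough numbers (not part of the discussion): even before queuing, routing, and processing, signal propagation alone puts a distance-dependent floor under round-trip time, which is why a nearby cloudlet can meet interactive budgets that a distant data center cannot. The distances and the 20 ms budget below are illustrative assumptions.

```python
# Back-of-the-envelope propagation delay; numbers are illustrative only.
# Light in optical fiber travels at roughly two-thirds of its vacuum speed.
SPEED_IN_FIBER_KM_PER_MS = 200.0  # ~200,000 km/s expressed per millisecond


def propagation_rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay over fiber, ignoring every other overhead."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS


for label, km in [("cloudlet ~10 km away", 10),
                  ("regional cloud ~500 km away", 500),
                  ("remote cloud ~2,500 km away", 2_500)]:
    rtt = propagation_rtt_ms(km)
    verdict = "fits" if rtt < 20 else "exceeds"
    print(f"{label:>28}: >= {rtt:5.2f} ms round trip ({verdict} a 20 ms interactive budget)")
```

Real round-trip times are several times higher once routing and queuing are included, which only strengthens the proximity argument.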
Diana: In some of your prior research, you've talked about blurring the boundaries between tiers. How do you see the mobile edge, intelligent edge, and cloud tiers functioning to create innovative computing architectures?

Satya: Mobility is a big driver of modern computing and, by its very nature, imposes demands on computing. First, the phone has to fit in your pocket, the virtual reality (VR) glasses on your face, or the wristwatch on your wrist. The device has to be small and light; it cannot get too hot; and it has to have decent battery life. Let's call these fundamental challenges to mobility the "mobility penalty." It is the price you pay for making things mobile.

My next challenge is: How do I get cloud computing's power and capabilities at the extreme mobile edge to overcome latency (distance) and bandwidth limitations? Edge computing emerges as a solution. It uses what we've learned in cloud computing—managed infrastructure, multitenancy, virtualization, virtual machines, and containers—but instead of putting the data into an enormous, exascale data center, it puts it closer to the mobile devices in a much smaller data center (a data center in a box!), a "cloudlet" that's really a small cloud close by.

The intelligent edge

Diana: What advice do you have for organizations looking to bring AI to the edge of cloud?

Satya (laughs): I'm laughing because I just finished writing a paper6 on exactly this topic. So, a few tips:

First, the sheer speed with which things are changing in AI is dramatic. You have to pay attention to the algorithms, but by the time you use them, newer algorithms appear. You may have to slightly change how those models work in your system.

Second, because of this rapidly changing frontier, companies should think about what should run on the mobile device, what should run on a cloudlet, and maybe what should run on the cloud. Nonetheless, these decisions should not be viewed as rigid decisions that are architected permanently into the system, but as decisions that you're making for here and now. The next generation of IT hardware and IT technology might actually require you to rethink that partition.

Dave Linthicum: The big thing that I see evolving is the intelligent edge and the ability to put AI engines in edge-based architectures. A couple of years ago, we worked on a connected bike concept where a hardware device served as a knowledge engine communicating to the cloud.8 These tiered systems can support big data analysis on the back-end cloud systems and more transactional knowledge, such as the ability to shut the bike off if it was on fire, at the edge. We are partitioning the "brain," but each section has access to other sections. Things are done faster, but each section is still able to operate in an independent mode.

This was a more advanced organization. Most companies are still figuring out what an AI engine is, how to leverage the training data,
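Dave's connected-bike example describes a common intelligent-edge pattern: the latency-critical safety decision runs locally, while bulk telemetry is batched to the cloud for heavier analysis. The sketch below is a simplified illustration of that split, not a detail of the engagement he describes; the sensor, threshold, and upload function are hypothetical stand-ins.

```python
# Illustrative split between an edge "knowledge engine" and cloud analytics.
# All names, thresholds, and the fake sensor are assumptions for the sketch.
import random
from typing import List

FIRE_TEMP_C = 80.0  # hypothetical threshold for the local safety action


def read_temperature_c() -> float:
    """Stand-in for a real sensor read."""
    return random.uniform(20.0, 100.0)


def shut_off_bike() -> None:
    """Latency-critical action taken at the edge, without a cloud round trip."""
    print("EDGE: temperature critical -> motor shut off locally")


def upload_batch_to_cloud(batch: List[float]) -> None:
    """Stand-in for shipping telemetry to back-end cloud analytics."""
    print(f"CLOUD: received {len(batch)} readings for batch analysis")


def edge_loop(iterations: int = 20, batch_size: int = 5) -> None:
    buffer: List[float] = []
    for _ in range(iterations):
        temp = read_temperature_c()
        if temp >= FIRE_TEMP_C:
            shut_off_bike()              # decided at the edge, works even offline
        buffer.append(temp)              # everything is still recorded
        if len(buffer) >= batch_size:
            upload_batch_to_cloud(buffer)  # heavier analysis happens in the cloud
            buffer = []


if __name__ == "__main__":
    edge_loop()
```

Because the safety check never waits on the network, each tier can keep operating if the other is unreachable, which is the "independent mode" Dave mentions.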