How Software-Defined Servers Will Drive the Future of Infrastructure and Operations
March 2020

In this issue
Introduction
How Software-Defined Servers Will Drive the Future of Infrastructure and Operations
Research from Gartner: Top 10 Technologies That Will Drive the Future of Infrastructure and Operations
About TidalScale

Introduction

SIXTEEN YEARS AGO, InformationWeek published a prescient call to arms for building an intelligent IT infrastructure. “The mounting complexity of today’s IT infrastructure,” cautioned the author, is having a “draining effect…on IT resources.”[1]

If the need for flexible, on-demand IT infrastructure was obvious back in 2004, imagine where we find ourselves today. Businesses now run on data. They analyze it to uncover opportunities, identify efficiencies, and define their competitive advantage. But that dependency comes with real-world challenges, particularly with data volumes doubling every year[2] and IoT data growth outpacing enterprise data growth by 50X.[3] Talk about “mounting complexity.”

The “draining effect” on IT resources observed 16 years ago is hitting IT operations where they live—both in their ability to meet SLAs and in their efforts to do more within limited budgets. Legacy platforms fail to keep up with growing and unpredictable workloads. Traditional approaches to scaling force IT departments into the same old system sizing, purchasing, and deployment cycles that can last months, even years.

Today’s CIOs are right to ask: If my largest servers can’t handle my SAP HANA, Oracle Database, data analytics or other in-memory workloads, can I really afford to commit a year or more to simply bringing a new system online? And does that do anything to solve my need for a more flexible, intelligent infrastructure? More and more CIOs are realizing that the answer to both questions is a resounding, “No.”

This paper explores how new technologies are redefining the IT landscape—not because they’re bright and shiny new tech objects, but because they’re a crucial part of the agile, scalable and intelligent infrastructures that modern enterprises need. It describes how one acclaimed breakthrough—software-defined servers—aligns with many of the most exciting and influential technology trends that will drive the future of Infrastructure and Operations.

Read this report to learn how software-defined servers turn traditionally fixed data center and cloud resources into fluid, on-demand assets that combine to create virtual servers of any size—scaling beyond physical hardware boundaries, entirely on demand and with zero OS or application modifications. Discover how enterprises rely on this disruptive, award-winning technology to reduce IT infrastructure complexity, maximizing the business value of their data while increasing application performance and driving down costs.

Time is fleeting, data is exploding, and complex new problems demand better solutions.

Gary Smerdon
CEO, TidalScale

[1] https://www.informationweek.com/applications/building-an-intelligent-it-infrastructure/d/d-id/1028655
[2] https://techjury.net/stats-about/big-data-statistics/
[3] https://insidebigdata.com/2017/02/16/the-exponential-growth-of-data/

How Software-Defined Servers Will Drive the Future of Infrastructure and Operations

Agile. Scalable. Intelligent. These are the defining characteristics of IT infrastructures that all enterprises must adopt if they hope to compete in a world where change is accelerating.
Too many organizations are weighed down by applications, processes and hardware designed for a more predictable age. And as more applications and services move to cloud and open-source solutions, where does that migration leave IT environments with significant investments in legacy hardware and technology?

Large-scale hardware solution vendors believe they have an answer. They’re introducing composable (or converged, or hyperconverged) architectures built around their own proprietary platforms. Some are further along than others, but they all share a similar goal: to solidify the grip on IT environments that they’ve enjoyed for years, and sometimes decades.

The Future Is in Breakthrough Technologies

New technologies, however, promise to disrupt the world of vendor lock-in and proprietary solutions. Analysts view them as the key to modernizing IT. In Top 10 Technologies That Will Drive the Future of Infrastructure and Operations (Gartner: Arun Chandrasekaran and Andrew Lerner, October 29, 2019), a Gartner research report featured later in this paper, the authors short-list a range of emerging technologies and offerings that they expect will help create the I&O environments CIOs are seeking.

Figure 1. Top 10 Technologies That Will Have the Greatest Impact on Infrastructure and Operations[1]
Source: Gartner (October 2019)

[1] Top 10 Technologies That Will Drive the Future of Infrastructure and Operations, 29 October 2019, G00430091, Arun Chandrasekaran, Andrew Lerner

These are all worthy choices for a top 10 list, even if many are still in the proof-of-concept stage. Far from being complete solutions, they each aim to solve a specific problem or set of problems. This makes sense, because the task of modernizing IT requires more than a single solution. It requires an array of technologies that work together to create a fluid, on-demand environment. And the more they use industry-standard technologies, the better.

Software-Defined Servers and the Modern Datacenter

One breakthrough in particular adds value to the aspects of I&O that nearly all of these technologies address. That breakthrough is TidalScale software-defined server technology. TidalScale software combines multiple commodity servers into virtual systems of any size. TidalScale software virtualizes all resources (cores, memory and IO) resident in those servers and combines them to create a software-defined server. This virtual server appears as a single system to the OS and application software.

By creating software-defined servers on demand, TidalScale does for rigid, inflexible servers what software-defined solutions have done for storage and networking. The result is modern, agile and cost-effective IT infrastructure. (See How TidalScale Modernizes IT on page 8.)

TidalScale and Gartner’s Top Tech Trends

How software-defined server technology impacts Gartner’s chosen technologies reveals its broad application throughout a modern datacenter.

TREND: Artificial Intelligence for IT Operations (AIOps) Platforms

AI is becoming a major force throughout all of computing, but the focus of most AIOps implementations is on applications embedded with AI. The goal is to achieve greater utilization at the application level. This is valuable, to be sure, but it misses an opportunity to infuse AI capabilities in a way that’s arguably more foundational.

WHERE TIDALSCALE FITS: TidalScale has taken that foundational approach by integrating machine learning (ML)—a key enabling component of AI—at the server (rather than application) level. To optimize application performance, TidalScale uses ML to automatically locate cores, memory and I/O where they will deliver the optimal performance for the application in use. For instance, if the application is memory-intensive, then the ML algorithms in a TidalScale software-defined server migrate the system’s memory in real time to where the application needs it most. The system also monitors and learns from its performance, enabling the TidalScale platform to optimize itself over time.

Because TidalScale software works with any application without requiring a single software modification, TidalScale’s ML capabilities are complementary to—and work seamlessly with—AI that exists at the application level.
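To make the idea of locality-driven placement more concrete, here is a minimal sketch in Python of a greedy policy that migrates memory pages toward the node generating most of their recent accesses. It is only an illustration of the general concept described above, not TidalScale's actual algorithm or API; the ToyPlacer class, its methods, and the threshold value are hypothetical names invented for this example.

```python
from collections import Counter, defaultdict

class ToyPlacer:
    """Toy model of locality-driven memory placement across nodes.

    Hypothetical illustration only; a real software-defined server makes
    these decisions below the OS using learned access patterns.
    """

    def __init__(self):
        self.page_home = {}                     # page id -> node currently holding the page
        self.access_log = defaultdict(Counter)  # page id -> count of accesses per node

    def record_access(self, page, node):
        """Record that a vCPU running on `node` touched `page`."""
        self.page_home.setdefault(page, node)   # first toucher becomes the initial home
        self.access_log[page][node] += 1

    def rebalance(self, threshold=0.6):
        """Migrate each page to the node issuing most of its accesses."""
        moves = []
        for page, counts in self.access_log.items():
            node, hits = counts.most_common(1)[0]
            if hits / sum(counts.values()) >= threshold and self.page_home[page] != node:
                moves.append((page, self.page_home[page], node))
                self.page_home[page] = node     # "migrate" the page to the hot node
        return moves

# Example: page 42 starts on node0, but node1 generates most of the accesses,
# so the next rebalancing pass moves it there.
placer = ToyPlacer()
placer.record_access(42, "node0")
for _ in range(8):
    placer.record_access(42, "node1")
print(placer.rebalance())   # [(42, 'node0', 'node1')]
```

In practice, the interesting work is in what the passage calls learning: weighting recent accesses, anticipating future access patterns, and judging when a migration's cost outweighs its benefit, all without any change to the OS or the application.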
TREND: Container Management (Orchestration)

The use of containers is growing so popular that Gartner predicts that by 2022, 75% of global organizations will run containerized applications in production. This makes orchestration software an attractive offering. But even as containers grow in popularity, challenges exist in three primary areas:

■ Mobility (live migration of executing processes is difficult or impossible)
■ Orchestration (adding and removing containers while servers are running can increase latency)
■ Security (most environments lack hardware-enforced separation between multiple containers running on the same server)

WHERE TIDALSCALE FITS: TidalScale works seamlessly as part of containerized and DevOps environments. In addition, it addresses the three main challenges of containers:

■ TidalScale solves container mobility challenges by utilizing the hardware below the OS kernel to transparently move resources across nodes to wherever they’re needed, including cores, memory, and networking resources. All of these resources can be configured and mobilized as needed.
■ Container orchestration is simpler because TidalScale’s control panel allows for point-and-click provisioning of physical resources at the rack level. For instance, the ability to pool all server resources in a rack while maintaining locality makes it easier to orchestrate storage, CPU,

NCS Analytics, a Colorado-based provider of data analytics solutions for governments and financial institutions serving high-risk industries, relies on TidalScale for its own DevOps environment. Analysts using R, Python and other tools create software-defined servers on demand—servers large enough for them to accelerate the processing of development models by as much as 240X compared to local or traditional cloud infrastructure servers. They are also able to iterate models 25X more often than