Real-time Insights and Decision Making using Hybrid Streaming, In-Memory Computing Analytics and Transaction Processing

Applying Gartner’s Hybrid Transaction/Analytical Processing Architecture to Cybersecurity

Welcome

Contents

Hybrid Transaction/Analytical Processing Will Foster Opportunities for Dramatic Business Innovation
Proven Performance and Flexibility – Kx HTAP Solution Cybersecurity Use Case
Analyst for Kx

Businesses are processing exponentially more data from clicks, swipes, micropayments, cyber packets, social feeds and meter readings. The volume of streaming data is overwhelming IT systems; as a result, many businesses are “blind and deaf” for minutes to hours at a time.

Traditional data solutions are struggling to update, organize and index their data so it can be queried in real time to see the true state of their business.

Today’s businesses need fast interactive access to tens or hundreds of terabytes of data extracted from huge data lakes and legacy systems. They also need to combine multiple data sources, both in-flight and historical, to dynamically aggregate their data along different dimensions and visualize it in both tabular and graphical form. In particular, analysts at data-driven businesses must be able to “live in their data,” simultaneously interacting with both historical and real-time streaming data.

The financial services industry has been successfully dealing with similar challenges for over two decades. Banks and trading firms have coped with steadily increasing data volumes over that period using a simple scalable data architecture composed of a real-time database (RDB) and an historical database (HDB). Their approach is based on a hybrid of streaming, in-memory computing (IMC) RDBs and persistent HDBs to provide real-time services to traders, regulators and risk managers.

Analysts in other industries like energy, telecom operations systems support (OSS) and cybersecurity also need to maintain a total view of the current and past state of their grids and networks.

One of the major benefits of this architecture is that it is easy to scale up or out, simply by adding additional RDBs and HDBs. Multiple RDBs and HDBs can be provisioned for specific time periods, types of transactions, etc. In financial services it is common to find very sophisticated data architectures that combine multiple instances of streaming with complex event processing (CEP), RDBs and HDBs with publish/subscribe capabilities, and business processing using micro-services.

The in-memory RDB addresses real-time business needs while the HDB is an immutable record of all past transactions. By querying both the RDB and the HDB one can always get a consistent view of the world and the state of the business. Increasingly, RDBs are being placed in huge non-volatile memory and HDBs are being stored on fast SSDs. Working smartly with today’s memory solutions enables businesses to query all their data in near real time, providing a consistent picture directly assembled from the raw transactional data as required.

This widely used RDB/HDB architecture has lately been re-discovered by the community. Variations on the model have been described as Lambda Architecture, Gartner’s Hybrid Transaction/Analytical Processing (HTAP)¹, Forrester’s Translytical DB² and Event Sourcing. We feel the Gartner report below provides valuable guidance on how to realize business value using HTAP.
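The RDB/HDB split described above can be sketched in a few lines. Kx implements this pattern in q/kdb+; the Python below is only an illustrative sketch, and the table contents and field names are invented:

```python
from datetime import date

# Illustrative RDB/HDB split: today's events live in memory (the "RDB"),
# prior days are appended to an immutable historical store (the "HDB").
class HybridStore:
    def __init__(self):
        self.rdb = []   # in-memory, mutable: today's transactions
        self.hdb = []   # append-only record of all past transactions

    def insert(self, event):
        self.rdb.append(event)

    def end_of_day(self):
        # At day end the RDB contents are persisted and the RDB is cleared.
        self.hdb.extend(self.rdb)
        self.rdb = []

    def query(self, predicate):
        # A consistent view is assembled by querying both stores:
        # the HDB for history plus the RDB for everything since.
        return [e for e in self.hdb if predicate(e)] + \
               [e for e in self.rdb if predicate(e)]

store = HybridStore()
store.insert({"day": date(2016, 1, 4), "sym": "ABC", "qty": 100})
store.end_of_day()
store.insert({"day": date(2016, 1, 5), "sym": "ABC", "qty": 250})

# One query spans historical and live data:
total = sum(e["qty"] for e in store.query(lambda e: e["sym"] == "ABC"))
```

Scaling out, as the text notes, amounts to provisioning more such RDB/HDB pairs, each owning a time period or transaction type.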

1 Gartner, Predicts 2016: In-Memory Computing-Enabled Hybrid Transaction/Analytical Processing Supports Dramatic Digital Business Innovation, Fabrizio Biscotti, Massimo Pezzini, Nigel Rayner, Joseph Unsworth, Roxane Edjlali, Susan Tan, Errol Rasit, Andrew Norwood, Andrew Butler, W. Roy Schulte; 17 December 2015
2 Forrester, Emerging Technology: Translytical Databases Deliver Analytics At the Speed of Transactions, Noel Yuhanna and Mike Gualtieri; December 10, 2015

Real-time Insights and Decision Making using Hybrid Streaming, In-Memory Computing Analytics and Transaction Processing is published by KX. Editorial supplied by KX is independent of Gartner analysis. All Gartner research is © 2016 by Gartner, Inc. All rights reserved. All Gartner materials are used with Gartner’s permission. The use or publication of Gartner research does not indicate Gartner’s endorsement of KX’s products and/or strategies. Reproduction or distribution of this publication in any form without prior written permission is forbidden. The information contained herein has been obtained from sources believed to be reliable. Gartner disclaims all warranties as to the accuracy, completeness or adequacy of such information. Gartner shall have no liability for errors, omissions or inadequacies in the information contained herein or for interpretations thereof. The opinions expressed herein are subject to change without notice. Although Gartner research may include a discussion of related legal issues, Gartner does not provide legal advice or services and its research should not be construed or used as such. Gartner is a public company, and its shareholders may include firms and funds that have financial interests in entities covered in Gartner research. Gartner’s Board of Directors may include senior managers of these firms or funds. Gartner research is produced independently by its research organization without input or influence from these firms, funds or their managers. For further information on the independence and integrity of Gartner research, see “Guiding Principles on Independence and Objectivity” on its website, http://www.gartner.com/technology/about/ombudsman/omb_guide2.jsp.

Research from Gartner: Hybrid Transaction/Analytical Processing Will Foster Opportunities for Dramatic Business Innovation

Hybrid transaction/analytical processing will empower application leaders to innovate via greater situation awareness and improved business agility. This will entail an upheaval in the established architectures, technologies and skills driven by use of in-memory computing technologies as enablers.

Impacts

• The emergence of HTAP means IT leaders must identify the value of advanced real-time analytics, and where and how these enable process innovation.

• By eliminating analytic latency and data synchronization issues, HTAP will enable IT leaders to simplify their information management infrastructure, if they can overcome the challenges of adopting this new approach.

• Technology immaturity and established application environment value and complexity will force IT leaders to plan for long-term coexistence between HTAP and traditional approaches.

Recommendations

• Educate business leaders about HTAP and IMC concepts, and why they are important. Brainstorm with them to identify concrete opportunities to rethink business processes.

• Discuss with your strategic information management and business application providers their vision, road map and technology for HTAP implementation in their products.

• Pilot the use of HTAP architectures in individual “system of innovation” projects.

• Balance the costs of HTAP adoption (hardware and software infrastructure, migration, operational processes, and skills) with the anticipated business and IT benefits.

• Plan for coexistence and interoperability of IMC and traditional technology, mixing products from different vendors, if needed.

• Revisit your information management strategy, including governance, SLAs and life cycle management, to ensure proper organizational alignment and ownership of the information in support of using HTAP architectures along with traditional approaches.

Analysis

The notion of running transactions and analytics on the same database of record has been around since the early days of computing, but has not fully materialized so far because of a variety of issues, including technology limitations.

Transaction processing systems (see Note 1) are usually supported by operational DBMSs (formerly referred to as online transaction processing [OLTP] DBMSs), storing data in structures optimized for fast access to single data items (“rows”).

Analytical processing systems (see Note 2) were created separately to optimize data structures for massive queries, as well as to avoid access concurrency and the associated performance impact of analytical queries over business-critical transactional systems. Such a separation also allowed the combining of data from multiple sources into analytical databases to measure enterprise performance across applications.

Therefore, transaction processing and analytical systems are usually based on distinct architectures, thus adding complexity to the information architecture and relevant infrastructure and introducing delay in data analysis (see Figure 1).

FIGURE 1
Traditional Transaction and Analytical Processing

Source: Gartner (January 2014)

Because of technology advances such as in-memory computing (IMC), hybrid transaction/analytical processing (HTAP) architectures, which enable applications to analyze “live” data as it is created and updated by transaction-processing functions, are now realistic and possible.

Although not every business process equally benefits by such architectures (for example, in certain cases latency is characteristic of the process itself and not just an induced negative effect stemming from technology limitations), HTAP will, in many cases, enable radical improvements.

Although it is possible to build simple forms of HTAP applications using traditional DBMSs, Gartner believes that most HTAP implementations should and will be IMC-enabled. IMC technologies, such as in-memory DBMSs (IMDBMSs) and in-memory data grids (IMDGs), support a single, low-latency-access, in-memory data store that can process high volumes of transactions. These technologies can also support zero-latency analytics on that same data, including advanced analytics, such as forecasting and simulations, as well as more traditional styles of descriptive analysis (see Figure 2).

Gartner believes less than 1% of user organizations have implemented HTAP-based applications, and expects this architecture will take five to 10 years to reach mainstream adoption. Nevertheless:

• Some user organizations have already leveraged IMC-enabled HTAP to support transformative business process innovation.

• Some packaged application vendors and SaaS providers are already exploiting HTAP architectures in their offerings. Although these are in their early stages and adoption is often limited, vendor marketing efforts will soon bring this approach to business users’ attention.

To support their organizations’ business strategy, understanding the fundamental tenets of HTAP and defining a relevant strategy will be an imperative for IT leaders involved in the definition of application architecture, information management and business analytic strategies.

Impacts and Recommendations

The emergence of HTAP means IT leaders must identify the value of advanced real-time analytics, and where and how these enable process innovation

HTAP could potentially redefine the way some business processes are executed, as real-time advanced analytics (for example, planning, forecasting and what-if analysis) becomes an integral part of the process itself, rather than a separate activity performed after the fact. This would enable new forms of real-time business-driven decision-making processes (see Note 3 for some examples). Ultimately, HTAP will become a key enabling architecture for intelligent business operations.

HTAP will enable business leaders to perform, in the context of operational processes, much more advanced and sophisticated real-time analysis of their business data than with traditional architectures. Large volumes of complex business data can be analyzed in real time using intuitive data exploration and analysis without the latency of offloading the data to a data mart or data warehouse. This will allow business users to make more informed operational and tactical decisions.

FIGURE 2
IMC-Enabled Hybrid Transaction/Analytical Processing

Running more complex diagnostic and predictive analytics in real time (and leveraging very granular transactional data) on the database of record will allow decision makers to be alerted to business trends and situations that may require their immediate attention. HTAP architectures accomplish this by improving business leaders’ situation awareness in operations (for example, for applications such as risk management and fraud detection), and by providing constantly updated forecasts and simulations of future business outcomes.

Source: Gartner (January 2014)
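The alerting pattern described above, where analytics are re-evaluated on the database of record as each transaction arrives, can be illustrated with a minimal sketch. The account fields and exposure threshold are invented for illustration:

```python
# Minimal sketch of HTAP-style alerting: each incoming transaction is both
# recorded and immediately re-analyzed against the same store, so decision
# makers are alerted without waiting for an overnight ETL cycle.
transactions = []      # the database of record
alerts = []

EXPOSURE_LIMIT = 1000  # illustrative risk threshold

def on_transaction(txn):
    transactions.append(txn)   # transactional write
    # Analytic read over the same data, at the moment of the write:
    exposure = sum(t["amount"] for t in transactions
                   if t["account"] == txn["account"])
    if exposure > EXPOSURE_LIMIT:
        alerts.append((txn["account"], exposure))

for txn in [{"account": "A1", "amount": 600},
            {"account": "A1", "amount": 600},
            {"account": "B2", "amount": 300}]:
    on_transaction(txn)
```

In a traditional architecture the exposure check would run against last night's analytic copy; here it sees the transaction that triggered it.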


FIGURE 3
Impacts and Top Recommendations for HTAP for Business Innovation

Source: Gartner (January 2014)

However, figuring out how HTAP can help process innovation is challenging. Business leaders need to think outside of the box to determine how this architecture could transform processes rather than just provide existing styles of analytics faster and without latency. But application architects and developers (and also system integrators and packaged/SaaS application vendors) have little or no experience with how to design, implement, deploy and operate HTAP applications. Hence, it may be difficult for them to assess whether the innovation pushed by business leaders is really possible to implement.

However, IT leaders don’t necessarily need to adopt a big-bang approach to HTAP. Use of IMC technologies provides opportunities for user organizations to adopt HTAP architectures incrementally, through a trial-and-error approach. For example, IM analytics have enabled data discovery capabilities, allowing business users to freely (that is, without much support from the IT department) interact with data. With these tools, business users can explore data and discover new insights without any help from IT. The data used is often the result of a data mashup created by business users outside any governance process. As a result, such a mashup adds yet one more copy of the transactional applications’ data. With HTAP, transactional data is directly available for discovery and creating copies is not needed.

This incremental approach to HTAP favors adoption of a rapid and agile method of new application function delivery. It is not necessary to develop detailed requirements or to implement all the analytical features before putting the new system in production. In many cases, it is possible to initially implement the core set of capabilities, incrementally add new analytics, and then let end users develop or customize (by themselves) the features they deem necessary.

Ultimately, HTAP architectures will do for business analytics what data visualization tools have done for query and analysis: put the right tools in the hands of the business users to perform their own analytics, without needing IT support and without creating separate, stand-alone “spreadmarts” to perform those analytics that IT is not providing.

Recommendations:

• Educate business users about the concepts of HTAP and IMC, and why these are important new approaches, using simple metaphors (see Note 4).

• Brainstorm with business leaders to identify opportunities to rethink business processes, from an HTAP perspective, to support system-of-innovation initiatives via greater situation awareness, advanced real-time analytics and business-goal-driven decision making.

• Evaluate native HTAP-packaged/SaaS applications targeting process innovation, even those from small vendors of uncertain viability. Don’t wait for the megavendors to deliver if you aim for a competitive advantage.

By eliminating analytic latency and data synchronization issues, HTAP will enable IT leaders to simplify their information management infrastructure, if they can overcome the challenges of adopting this new approach

HTAP will help simplify organizations’ information management infrastructure by eliminating, or at least reducing, the need to duplicate data and to keep data consistent. HTAP architectures also imply a redefinition of the role of traditional data warehouse architectures.

However, the degree of simplification enabled by HTAP will vary. By not requiring a data mart or data warehouse, HTAP could make it simpler to meet reporting and analytical needs based on operational data from a single instance of an application

stack (such as an ERP suite). A small number of organizations are already realizing these benefits. However, HTAP will not have such a beneficial impact on large and complex organizations that need to aggregate data from multiple sources (for example, SaaS applications, cloud information services, social networks and on-premises applications). However, even in these organizations, data marts created just to support analytical and reporting needs from a single application could be eliminated, which will simplify some of the information management landscape.

By definition, at least in the context of a single application stack, HTAP architectures address the four major drawbacks of traditional approaches:

• Architectural and technical complexity. In traditional approaches, data must be extracted from the operational database, transformed and loaded into the analytical database, which requires the adoption of database replication; extraction, transformation and loading (ETL) tools; enterprise service buses (ESBs); message-oriented middleware (MOM); and other integration tools.

In HTAP, data doesn’t need to move from operational databases to separated data warehouses/data marts to support analytics.

• Analytic latency. In a classic setting, it can take hours, days or even weeks from the moment data is generated by the transaction processing application to when it can be used for analytics. Although this is adequate for certain types of analytics, and even processes, it may be suboptimal for others. For example, being able to perform financial consolidation at any point in the month can enable a CFO to better evaluate the business impact of economic trends and take early corrective actions.

The transactional data of HTAP applications is readily available for analytics when created.

• Synchronization. If analytical and transactional data storage is separated, when business users want to “drill down” from a point-in-time aggregate into the details of the source data in the operational database, in many cases they find the source of data “out of synch” because of the analytic latency.

In HTAP, drill-down from analytic aggregates always points to the “fresh” HTAP application data.

• Data duplication. In a traditional architecture, multiple copies of the same data must be administered, monitored, managed and kept consistent, which may lead to inaccuracies, timing differences and inconsistency.

In HTAP, the need to create multiple copies of the same data is eliminated (or at least reduced).

However, adopting HTAP has its challenges: The concept is immature, industry experience is still limited to the most-leading-edge organizations in a few industry sectors (primarily financial services), best practices are not yet crystallized, the vendors’ landscape is still quite turbulent, and relevant skills are almost impossible to find.

IT leaders also will face the issues associated with the use of IMC technology, which is a key enabler for HTAP initiatives:

• Often, the code of traditional applications must be re-engineered to take maximum advantage of IMC technologies and to enable integration with advanced analytics tools.

• Migrating to the IMC-enabled version of a traditional packaged application doesn’t necessarily lead to an HTAP architecture if the vendor didn’t go through the process of optimizing its product code for IMC and adding real-time analytics.

• In-memory data management technologies (IMDBMS and IMDG) are not fully proven in support of HTAP use cases.

• Data governance policies and processes will have to be revised to cover in-memory data. For example, data quality checks need to happen in the application itself, rather than when data is being moved into the data warehouse.

• A combination of data sources in the HTAP in-memory data store may be required to perform advanced analytics. This is a challenge in its own right, but it is further exacerbated by the fact that data integration tools that feed in-memory data stores from other data sources are still few and are not fully proven in real-life production deployments.

Therefore, HTAP adoption may require significant costs and risks, which would make it hard to justify only on the basis of information and application infrastructure cost reductions, although these considerations should be factored into the business case for HTAP-based application projects.

Recommendations:

• Discuss with your strategic data management and business application providers their vision, road map and technology for HTAP implementation in their products.

• Experiment with IMC technologies to assess the suitability of these products for your HTAP requirements.

• Pilot the use of HTAP architectures in individual, high-risk/high-reward system-of-innovation projects (as opposed to enforcing HTAP as an architectural principle across the board). This will help identify challenges that will need to be addressed


once the high-impact potential use cases have been identified as candidates for a more extensive deployment.

• Evaluate the costs of HTAP adoption (skills acquisition, application and hardware infrastructure refresh, application migration, and operations and management process updates), and balance those with the anticipated business and IT benefits.

Technology immaturity and established application environment value and complexity will force IT leaders to plan for long-term coexistence between HTAP and traditional approaches

Gartner believes HTAP adoption will grow significantly over the next five years because of its significant business impact. Nonetheless, IT leaders should plan for a coexistence of traditional and HTAP-based systems. As discussed previously in this research, data warehouse architectures will remain necessary to support extended analysis that involves large amounts of historical data/big data that comes from multiple internal and external, structured and unstructured data sources. Individual HTAP systems will be contributors to those (logical or physical) data warehouses, but will not fully replace them.

Moreover:

• In many instances, a traditional approach meets business requirements, and migrating to an HTAP architecture would not be justified and, often, not even desirable or possible.

• Even if theoretically possible, it will be practically impossible to migrate many established applications toward HTAP due to the massive organizational and technical efforts, and because of the inertia of packaged business application vendors.

Recommendations:

• Plan for coexistence and interoperability of traditional and HTAP-enabling technologies, possibly procuring products from different vendors, if needed. A single-supplier strategy is premature, as today, none can provide a comprehensive and mature portfolio of HTAP-enabling products.

• Identify the role of HTAP in the evolution of your data warehouse toward the logical data warehouse.

• Revisit your information management strategy, including governance, SLA and life cycle management, to ensure proper organizational alignment and ownership of the information in support of HTAP architectures along with traditional approaches.

Note 1
Transaction Processing Application Systems

These systems:

• Enable fast access to data to support business processes, such as order entry, banking operations, travel reservation systems, e-commerce and myriad other day-to-day activities in every industry sector.

• Collect, process and manage data in support of critical business processes.

• Must support the SLAs that are defined by the business and ensure that all transactions are processed reliably.

Note 2
Analytical Processing Systems

These systems:

• Support efficient analysis of data for reporting, business intelligence and other initiatives that require the fast scanning of large databases to create data summaries and aggregations.

• Address organizations’ needs to monitor the business, measure performance and identify trends by combining multiple sources of data from multiple applications.

Note 3
Examples of How HTAP Can Improve Business Processes

HTAP can improve business processes by:

• Deciding on how to fulfill a purchase order on the basis of its impact on profitability or customer satisfaction.

• Using predictive analytics to inform decision makers that business and financial targets are unlikely to be met by financial period end, so they can take corrective action (instead of providing diagnostic analysis after the event).

• Notifying logistics companies that shipments may miss their delivery timelines. Thus, business analysts can adjust the itineraries or renegotiate shipping alternatives with clients in real time via what-if analysis.

Note 4
Example of How to Sell HTAP to the Business

One CIO was able to “sell” HTAP to his business executives by suggesting that, while their traditional BI approach allowed them to instantly see what happened yesterday, HTAP would give them real-time visibility into what was happening now.

Source: Gartner Research Note, G00259033, Massimo Pezzini, Donald Feinberg, Nigel Rayner, Roxane Edjlali, 28 April 2015
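Gartner's analytic-latency and synchronization drawbacks discussed above can be made concrete with a small sketch: an analytic copy refreshed by periodic ETL can disagree with the operational store, while a query over the live data of record cannot. The table contents are invented:

```python
# Operational store (data of record) and an analytic copy fed by ETL.
operational = [{"region": "EU", "amount": 100}]
analytic_copy = list(operational)            # state as of the last ETL run

def etl_refresh():
    """Stand-in for the periodic batch ETL cycle."""
    global analytic_copy
    analytic_copy = list(operational)

def total(rows, region):
    return sum(r["amount"] for r in rows if r["region"] == region)

# A new transaction arrives after the last ETL run:
operational.append({"region": "EU", "amount": 50})

stale = total(analytic_copy, "EU")    # drill-down here is "out of synch"
fresh = total(operational, "EU")      # HTAP-style query sees it immediately
etl_refresh()
caught_up = total(analytic_copy, "EU")  # copy agrees only after the cycle
```

The gap between `stale` and `fresh` is exactly the analytic latency the note describes; in an HTAP architecture the copy, and hence the gap, does not exist.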

Proven Performance and Flexibility – Kx HTAP Solution Cybersecurity Use Case

Kx’s implementation of the HTAP architecture provides businesses with the means to make informed business decisions based on analytics run over real-time, streaming and historical data. As Gartner says, “HTAP will enable business leaders to perform, in the context of operational processes, much more advanced and sophisticated real-time analysis of their business data than with traditional architectures.” 1

One of the challenges in deploying IMC solutions is the integration of streaming, in-memory and storage-based processing, which often requires the use of different vendors and technologies. Another challenge is the need to deal with bursts of traffic, such as those associated with Black Friday shopping or a cyber attack, which may exceed the capacity of the in-memory DB or grid. Kx addresses the first by providing a single technology which handles streaming, in-memory and disk-based databases, greatly simplifying HTAP implementation. Unlike many in-memory-only solutions, Kx technology has been in use for decades to handle both in-memory and disk-based workloads, and hence can accommodate burst requirements.

In order to meet the demanding needs of capital markets, Kx is designed for performance on modern processors, with the compact database engine staying in the instruction cache, and column processing leveraging memory prefetch and vector instructions.

FIGURE 1
Sample Kx - HTAP Development Environment

This means that Kx can handle more data at once and is more efficient at aggregating and analyzing data. Having proven itself in one of the most demanding application areas, financial services, Kx‘s HTAP solution is now being adopted by industries struggling to address increasingly complex regulatory and business decision requirements.

Source: Kx
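The column-oriented processing described above can be illustrated outside kdb+: storing a table as one array per column means an aggregation touches only the column it needs, sequential memory access that hardware prefetchers and vector units exploit. A toy sketch with an invented table:

```python
# Row-oriented layout: each record is a dict, so an aggregation must walk
# every field of every record.
rows = [{"sym": "ABC", "price": 10.0, "size": 100},
        {"sym": "XYZ", "price": 20.0, "size": 200},
        {"sym": "ABC", "price": 11.0, "size": 300}]

# Column-oriented layout (the kdb+ approach): one array per column.
cols = {
    "sym":   ["ABC", "XYZ", "ABC"],
    "price": [10.0, 20.0, 11.0],
    "size":  [100, 200, 300],
}

# Summing a column scans only that column's array; on real hardware this
# is the access pattern that memory prefetch and SIMD instructions reward.
total_size = sum(cols["size"])

# A filtered aggregate reads just the two columns involved:
abc_volume = sum(s for sym, s in zip(cols["sym"], cols["size"])
                 if sym == "ABC")
```

Python lists only mimic the layout; kdb+ stores each column contiguously on disk and in memory, which is where the performance claims in the text come from.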


1 Gartner, Hybrid Transaction/Analytical Processing Will Foster Opportunities for Dramatic Business Innovation, Massimo Pezzini, Donald Feinberg, Nigel Rayner, Roxane Edjlali; 28 April 2015

Analyst for Kx

The Kx for Surveillance solution provides HTAP surveillance capabilities to the financial markets, where determination of rogue activity is based as much on historical data (e.g. transactions, news events, order patterns) as on the immediate transaction under investigation.

Analyst for Kx is a product designed to help human cyber-analysts anticipate and detect new forms of attack. Part of a suite of surveillance products, Analyst for Kx provides the tools necessary to deal with ingesting, transforming and visualizing internet data. It is one of the only tools that enable experts to interactively explore huge volumes of both current and past traffic. Only by thorough forensic analysis of historical data is one able to understand attack patterns and characterize normal behavior, both being essential to building and validating usage and traffic models. Kx also has significant experience with stock market surveillance, which is very similar to cyber-crime detection and uses similar technical approaches.

There are few areas where the competition is as intense, and the stakes are as high, as cybersecurity. Cybersecurity is a high-stakes game that is always changing, where humans augmented by computers continually seek new techniques to both obfuscate their attacks and mask their presence in target systems. “Both defenders and attackers collaborate with their respective peers to gain competitive advantage. Both review the past to anticipate the future. Both constantly scan software to identify potential defects, maintaining an inventory for future repairs/exploits.”

Classic cyber-defense strategies for individuals and corporations count on detecting known attacks and using well-known digital fingerprints to block potential threats. Experts will confidently assert that no one is immune from attacks, and daily news stories bear them out. Assume that you will be hacked, and plan now to focus on identifying and countering attacks.

When cyber-attackers gain access they often hide inside target systems for many months, gathering information about the system’s software and hardware in order to eventually mount a more effective attack. In fact, most sophisticated attacks are built up over long periods of time, hiding innocently in an otherwise normal stream of internet traffic, remaining undetected by classic cyber-approaches that are based on simple signatures or looking at traffic in a small sliding window. These old-style defenses are insufficient in today’s world. In order to detect modern attacks one needs to look simultaneously at both current and all past traffic. Forensic analysis on past data will often expose full or partial patterns that often appear again in future traffic. In addition, Analyst for Kx is a robust solution that implements Gartner’s HTAP architecture for cyber-analytics and telecommunications, and has been providing timely information for enforcement activities for several years.

In order to cope with future threats, cyber-analysts place themselves in the mindset of an attacker, creating models of potential attacks and devising associated detectors for such attacks. These point-of-decision models run in the live stream of traffic, setting off an alarm when a detector identifies a potential attack. Analysts then quickly verify if there is cause for concern and, if so, immediately take steps to counter the attack.

As a practical matter, designing and implementing model-based behavioral cyber-security requires a full programming language and associated development tools, together with comprehensive analysis and visualization tools. In addition, machine learning and statistical libraries that support a full range of statistical and deterministic modeling are essential. Efficient probabilistic reasoning techniques to deal with the realities of fuzzy/uncertain reasoning about noisy data are also needed. Analyst for Kx offers all of these capabilities, seamlessly integrated with the proven Kx HTAP solution.

Source: Kx
