IT Pro Impact: In-Memory Analytics and Databases

Reports.InformationWeek.com  July 2012  $99

It's taken six years or so of effort, but vendors have finally aligned the technology and the economics to bring capabilities once limited to telcos and trading floors to companies looking to maximize the value of their big data. The falling prices of DRAM make the business case that much stronger and mean that DBMS architects can, finally, dare to think beyond disk. We'll discuss the state of the market and connect platform choices with real-world deployment scenarios.

By Sreedhar Kajeepeta

Report ID: S5330712


Table of Contents

3 Author's Bio
4 Executive Summary
5 In Memory, In Vogue
5 Figure 1: Big Data Concerns
6 The Business Case
6 Figure 2: Physical Storage of Big Data
7 Figure 3: The In-Memory Advantage
8 Technical Advances
9 So Where Does It Fit?
9 Figure 4: Where Does In-Memory Fit?
10 Implementation Considerations
11 Figure 5: Vendors and Their Offerings
12 The Players, the Hurdles
13 Figure 6: Barriers to Successful Big Data Management
14 What the Future Holds
15 Related Reports

ABOUT US
InformationWeek Reports' analysts arm business technology decision-makers with real-world perspective based on qualitative and quantitative research, business and technology assessment and planning tools, and adoption best practices gleaned from experience. To contact us, write to managing director Art Wittmann at [email protected], content director Lorna Garey at [email protected], editor-at-large Andrew Conry-Murray at [email protected], and research managing editor Heather Vallis at [email protected]. Find all of our reports at reports.informationweek.com.


Sreedhar Kajeepeta is global VP and CTO of technology consulting for GBS at CSC. CSC's consulting groups across North and South America, Europe, Asia and Australia specialize in social networking for the enterprise, SOA, enterprise transformation, big data, data warehousing and business intelligence/analytics, enterprise mobility, cybersecurity and application consulting (open source, JEE and .NET). Sreedhar is based in Farmington Hills, Mich., and can be reached at [email protected].



© 2012 InformationWeek, Reproduction Prohibited


Executive Summary

If your IT group has lost its innovation mojo, focusing on the intersection of big data and in-memory analytics and databases is a good bet to get it back. Serious changes are on the horizon: Databases and data warehouses as we know them, with their spinning disks and related I/O overhead, are not the future. Big data analytics and the management systems supporting next-generation, low-latency transactional systems demand a new in-memory approach. But questions remain about cost, use cases and synergies with existing technologies such as Hadoop, NoSQL, caching, stream computing and grids. Here's our take.


In Memory, In Vogue

Even with the pitched battle SAP and Oracle have been waging over Hana and Exalytics, the numbers are fairly impressive: SAP's Hana ended 2011 60% above the company's sales goals, and last month IDC reported that SAP had become the fastest-growing DBMS vendor thanks to its acquisition of Sybase ASE (PDF) and continued development of Hana. Oracle Exalytics, meanwhile, became generally available in late February and is seen as contributing to the rising hardware margins (to the tune of 51%) Oracle reported in its June 18 earnings call.

The broader high-performance computing market is expected to reach $220 billion by 2020, according to a new study by Market Research Media, which pegs in-memory computing as one of the fastest-growing components of high-performance computing.

Why? And why now? Because what has so far been a niche and expensive form of database technology is poised to go mainstream, just in time to help businesses put to use the big data that's piling up. When it comes to business agility, every millisecond matters: Our InformationWeek 2012 Big Data Survey of 231 business technology professionals, all from organizations managing a minimum of 10 TB of data, shows the No. 1 area of concern is speed of accessibility (Figure 1). Yet for physical storage, 85% are using disk (Figure 2).

Figure 1: Big Data Concerns
How concerned are you with the following issues as they relate to big data? Please use a scale of 1 to 5, where 1 is "not at all concerned" and 5 is "extremely concerned."
Speed of accessibility: 3.9
Long-term storage: 3.7
Analytics: 3.6
Note: Mean average ratings
Data: InformationWeek 2012 Big Data Survey of 231 business technology professionals, December 2011

Let's be clear: Disk-based databases, with their high-latency-ridden I/O bottlenecks, place a severe constraint on how fast business processes can move. Typical average response times for conventional relational database management systems have been measured in seconds for online transactions and in hours for batch processing. Disk I/O is the weakest link in IT's efforts to reduce latency in high-speed analytics and transactional applications where durability can be relaxed for the sake of achieving the low response times so critical to the business.


In some similarly demanding areas, we've been able to reduce delays with distributed caching systems, using Memcached or Oracle Coherence, for example, to create relatively lower-latency transactional systems. Even with distributed caching, however, the persistence layer is a disk-based database from which a slice of anticipated records is cached into real memory to speed up queries. Updates must still be written to disk; thus, most distributed caching systems offer three- or four-second response times. Not bad, but not what we need to mine big data stores in near real time.

Figure 2: Physical Storage of Big Data
What is your organization's current approach for physical storage of big data?
Disk: 85%
Tape: 46%
Optical: 23%
Solid-state drives: 20%
Other: 2%
Note: Multiple responses allowed
Data: InformationWeek 2012 Big Data Survey of 231 business technology professionals, December 2011

Research: The Big Data Management Challenge
The challenge of big data is real, but most organizations don't differentiate "big data" from traditional data, and nearly 90% of respondents to our survey use conventional databases as the primary means of handling data. We'll help you understand what constitutes big data (it's not just size) and the numerous management challenges it poses for enterprises.

In contrast, with in-memory databases, disk I/O bottlenecks and related CPU-intensive activities—indexing; hashing, used for efficient indexing; list management for resolving logical and physical locations of data on disk; cache management; and all related interprocess communication activities for the many hand-offs involved—are either eliminated or moved to DRAM (Figure 3). As a result, an application's address space can communicate directly with the database, which is in RAM. Subsecond response times and throughput reaching hundreds of thousands of transactions per second are the hallmarks of in-memory transactional systems.

The Business Case

In-memory databases have been around for 30-plus years—remember IBM's IMS/VS Fast Path, circa 1978? But only now is the alignment of business need, technology, economics and scalability driving mainstream adoption. Hana, for example, was used by Charité Universitätsmedizin Berlin, a large university hospital, to develop an iPad application able to analyze 3 million records for 140,000 patients and provide, in less than one second, answers that once took days or weeks. United Airlines used Oracle's TimesTen to move data from legacy systems to an operational data store to do fast "what if" scenarios for predicting and heading off potential scheduling, weather and staffing problems.
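The cache-aside pattern described above, and the write path that limits it, can be sketched in a few lines. This is a toy illustration, not any vendor's product: a plain dict stands in for Memcached and SQLite on disk stands in for the backing RDBMS. Reads are served from RAM after the first fetch, but every update must still be written through to the disk-based store, which is the latency floor that caching alone cannot remove.

```python
import os
import sqlite3
import tempfile

class CacheAsideStore:
    """Toy cache-aside layer: a dict stands in for Memcached and
    SQLite on disk stands in for the backing relational database."""

    def __init__(self, path):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS kv (k TEXT PRIMARY KEY, v TEXT)")
        self.cache = {}  # the in-RAM slice of anticipated records

    def get(self, key):
        if key in self.cache:              # cache hit: served from RAM
            return self.cache[key]
        row = self.db.execute(
            "SELECT v FROM kv WHERE k = ?", (key,)).fetchone()
        value = row[0] if row else None
        self.cache[key] = value            # warm the cache for next time
        return value

    def put(self, key, value):
        # Updates are written through to the disk-based store before the
        # cache is refreshed; this synchronous write is what caching
        # cannot eliminate.
        self.db.execute(
            "INSERT OR REPLACE INTO kv (k, v) VALUES (?, ?)", (key, value))
        self.db.commit()
        self.cache[key] = value

fd, path = tempfile.mkstemp(suffix=".db")
os.close(fd)
store = CacheAsideStore(path)
store.put("fare:ORD-SFO", "429.00")  # the key name is purely illustrative
print(store.get("fare:ORD-SFO"))     # served from RAM after the write
```

A true in-memory database removes the synchronous disk write from the update path as well, which is the contrast the next section draws.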


BAE Systems chose McObject's eXtremeDB for in-memory embedded databases running on Wind River's VxWorks real-time operating system as part of an avionics upgrade for the high-profile Panavia Tornado GR4 multirole combat jet.

Figure 3: The In-Memory Advantage
[Diagram: In a traditional database management system, the app's SQL query passes through a query engine and optimizer, then through disk address tables, buffer pools, linked lists and hash functions, and finally disk I/O to reach the database. In an in-memory database, the app's SQL or API query passes through the engine and optimizer straight to app data held at a memory address.]
In-memory technology enables applications to communicate directly with the database, which is now in RAM, slashing response times. Optionally, data could be written to disk for fault tolerance. In contrast, a conventional database must navigate disk addresses, buffer pools and, perhaps most problematic, I/O limitations.

Sixty percent of respondents to our Big Data Survey are somewhat or very likely to invest in technologies to manage big data initiatives within the next year. The goal here is to organize data into a fabric that can be searched, browsed, navigated, analyzed and visualized while adding standardization and scalability. NoSQL plays a complementary role here, as we discuss in a recent report. Perhaps IT could do high-scale/low-cost processing on Hadoop, then employ an in-memory system to analyze the winnowed-down result sets. To that extent, these technologies attack different ends of the data life cycle problem: Hana, for example, could provide the high-speed analytics for a next-gen ETL process built with Hadoop; we go into much more depth on the role of Hadoop in big data in this report.
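That division of labor, bulk reduction first and in-memory analysis on the winnowed result, can be sketched without either product. Below, a plain Python pass over an event stream stands in for the Hadoop/ETL stage, and SQLite's in-memory mode stands in for the in-memory analytics layer; only the summary, not the raw data, is loaded into RAM. The airport-delay data and names are purely illustrative.

```python
import sqlite3
from collections import Counter

# Stage 1: bulk reduction (a stand-in for a Hadoop/MapReduce job).
# Imagine billions of raw events winnowed down to a small summary.
raw_events = [("ORD", 120), ("SFO", 95), ("ORD", 240), ("JFK", 60), ("SFO", 30)]
delay_minutes = Counter()
for airport, delay in raw_events:
    delay_minutes[airport] += delay

# Stage 2: load only the summary into an in-memory database for fast,
# interactive slicing, the "last mile" of analytics.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE delays (airport TEXT, total_min INTEGER)")
db.executemany("INSERT INTO delays VALUES (?, ?)", delay_minutes.items())

worst = db.execute(
    "SELECT airport, total_min FROM delays ORDER BY total_min DESC LIMIT 1"
).fetchone()
print(worst)  # ('ORD', 360)
```

The point of the design is that the expensive, scan-everything work happens once in the batch tier, while the small result set can then be queried repeatedly at RAM speed.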


For now, the business case for in-memory systems can be summed up as the growing use of big data and the changing role of data warehouses: Volume, velocity and variety for big data initiatives are increasing rapidly. While the amount of transactional data continues to grow, the utility of historical data is expanding and with it the need for forecasting through hypotheses and modeling. These initiatives are gaining a higher level of visibility, and executives are demanding faster operational results. Meanwhile, technology advances are creating the opportunity for more data correlation. Concurrently, the purpose and integration profile of an enterprise data warehouse is changing in three main ways:

> Business processes are being aligned with, or integrated within, the database, enabling event-based triggering—and increasing the demand for low latency;
> Causal analysis is driving more forward (leading) indicators from the data warehouse, a scenario dependent on high-volume content; and
> Statistical models are making an impact in more departments, driving demand for high-performance analytics integrated with transactional data.

Fortunately, vendors and open source projects are looking to help IT meet these demands—and, of course, enable us to mine big data.

Technical Advances

The good news for IT: DRAM prices have dropped significantly, to about $15 per gigabyte at publication time, with no bottom in sight yet. As a result, a full terabyte of DRAM is very typical with in-memory systems. And newer DRAM technologies, such as load-reduced DIMM, are offering greater performance than the current widely used RDIMM.

The popularity of 64-bit operating systems is on the rise; these operating systems provide the ability to address terabytes of memory while avoiding swap-in and swap-out overhead. Multicore CPU architectures also speed up processing: Oracle Exalytics can come with four 10-core processors, while SAP Hana has been demonstrated with 4,000 cores. McObject's eXtremeDB has run on a test machine with 80 dual-core processors. While not mainstream, these setups prove the levels of vertical scalability that can be achieved, given the expertise and budget.

Compression algorithms have been getting smarter as well, and low latency at the database layer is now coupled with such high-speed networking options as 10 Gbps Ethernet LANs and InfiniBand (40 Gbps, for Oracle Exalytics) server connections.

Visualization tools on the edge make it easier for business users and IT operations to not just consume targeted information but become actively engaged with the dynamics of business from the business intelligence/business analytics layer to take real-time action.

A key point to remember when considering in-memory systems is that there's more to this technology than just eliminating I/O bottlenecks related to disk handling, as underscored earlier by the list of CPU-heavy overhead involved. Hence, merely converting a disk-based RDBMS to work with a RAM disk (a block of memory treated as disk) doesn't automatically make it an in-memory database.
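SQLite, which supports both file-backed and purely in-memory databases, makes that distinction easy to see; it is a stand-in here, not one of the products discussed. A file database keeps its page handling, journal and fsync machinery even when the file sits on a RAM disk such as /dev/shm, while a ":memory:" database bypasses the filesystem entirely. Timings vary by machine and workload, so no figures are asserted; run it and compare.

```python
import os
import sqlite3
import tempfile
import time

def timed_inserts(target, n=500):
    """Create a table in `target` and time n row-by-row committed inserts."""
    conn = sqlite3.connect(target)
    conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, px REAL)")
    start = time.perf_counter()
    for i in range(n):
        conn.execute("INSERT INTO t (px) VALUES (?)", (float(i),))
        conn.commit()  # commit per row to expose the write path
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

fd, path = tempfile.mkstemp(suffix=".db")
os.close(fd)

# File-backed database: pages, journal and fsync remain in the path,
# even if `path` points at a RAM disk such as /dev/shm.
file_secs = timed_inserts(path)

# True in-memory database: no file, no journal I/O, no filesystem.
mem_secs = timed_inserts(":memory:")

print(f"file-backed: {file_secs:.3f}s  in-memory: {mem_secs:.3f}s")
```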


In support of that observation, a benchmark developed by McObject, a pure-play in-memory database vendor, revealed that moving from disk to RAM disk accelerated reads by about four times and writes by more than three times. But moving from RAM disk to true in-memory improved reads further by another four times and writes by a factor of 420, according to the company.

So Where Does It Fit?

To put this into a practical perspective, we must realize that while the addressability of 64-bit operating systems is theoretically vast, hardware configurations and budgetary pragmatics limit in-memory systems; today, they start at 64 GB RAM and go up to 100 TB in a lab setting. With data volumes in the "few petabytes" range common for large-scale big data initiatives, that puts current in-memory appliances at the edges of these efforts—for now. This isn't to say there's no practical application; we lay out some scenarios in Figure 4.

Figure 4: Where Does In-Memory Fit?
Solution | Description | In-Memory's Impact | Comments
Hadoop | Clustered storage and data processing for large volumes of distributed data, mostly unstructured. | Complements | Last mile for high-speed analytics on summary data.
NoSQL | A suite of products to handle large volumes of key-value stores, documents, blobs and tables for which SQL's structure isn't well-suited. | Complements | NoSQL for low-latency BASE and in-memory for super-low-latency ACID.
Events/streams | Processing huge volumes of in-flight data before it's finally committed to some form (mostly historical) of very limited persistence. Triggered by events from RFID chips, mobile devices, applications, etc. | Complements | Can be the persistence layer for high-speed CEP analytics (summary data) and also has long-term capability to replace CEP with built-in event-driven data. For now, two very different use cases and solutions.
Caching | Low-latency transactional systems with selective caching. | Replaces* | Replaces for super-low-latency ACID but otherwise co-exists for low-latency ACID more broadly.
Data grids | Clusters of distributed caching solutions that mainly run in-memory. | Complements | For distributed low latency, data grids will be more economical (although not as fast as in-memory).
Data: Sreedhar Kajeepeta
* With qualifiers in comments

As for scalability concerns within in-memory's sweet spot of super-low-latency transactional systems and the last mile of high-speed big data analytics, note that there are variations, from 100% in-memory offerings with no disk at all and RAM going to 16 TB for certified Hana appliances, for example, to hybrid systems that support "relaxed durability" of databases with no limitation on available memory.

This technology has established its value in telecom, aerospace and industrial control systems. Now in-memory databases are changing the very notion of the speed, and related business value, that can be derived from database and analytical systems. Let's break down some financial and manufacturing scenarios.

Financial services

Use case: Securities market
Rapid-fire transactions found in securities trading demand near-real-time data, which calls for both in-memory databases and analytics. Projects often involve developing many financial models to support activities such as portfolio management and risk analysis. Traditionally supported by complex event processing or stream computing, these applications can be further speeded along with complementary in-memory technologies in the areas of persistence and analytics.

Use case: Arbitrage
Reliance on speed and near-real-time data is even greater in arbitrage, which capitalizes on market imbalances of various financial instruments, including commodities and currencies.

Use case: Equity finance
Raising capital and managing the borrowing costs thereof require asset managers to have a comprehensive and real-time view of all financing transactions.

Use case: Risk management
The dynamic nature of counterparty credit risk involves high volumes of trading and the need to look at intraday margin calculations in particular.

Manufacturing (but with a broader relevance to many other sectors)

Use case: Sales forecasting and ops planning
Heuristics can be complemented with real-time data to bring the dual advantage of experience and reality to planning and forecasting.

Use case: Demand management
A combination of push (sell your inventory) and pull (make what's in demand) techniques is based on real-time data.

Use case: Supply chain
In-memory can help identify procurement efficiencies, for example. Projects often involve development of many models based on historical and comparative data. The goal is to allow millions of database records to be analyzed with the same speed, ease and simplicity of a small spreadsheet.

Implementation Considerations

When considering big data projects, a fourth "V"—value, besides volume, variety and velocity—is obviously critical. We recommend starting with a granular pilot project, perhaps in marketing or cost control, where value can be clearly articulated. In identifying such a pilot, look for the following characteristics:

> Consider the competitive advantages a process can gain by being more real time. The "customers" for this process could be internal or external, or even partners, such as dealerships.


A process may benefit from receiving data in real time as well as sending it out—if it takes only a day to assemble a car, for example, why should custom orders take eight weeks to be fulfilled?

> For pilot candidates, look at mission-critical spreadsheets (and we still have enough of those in the glass house), but think beyond them. With in-memory systems, you can get the same speed and malleability, but with the convenience of online systems that are always up to date, centralized and actionable.

> Consider current pain points where information demand cannot keep pace with business evolution. Managing complex product mixes or regulations or marketing bundles requires multifaceted analysis to mitigate contract risk while ensuring ongoing profitability.

Figure 5: Vendors and Their Offerings

Pure-Play In-Memory
Vendor/Offering | In-Memory Database? | In-Memory Analytics? | Notes
Exasol EXASolution | Y | Y | Has an appliance version too; analytics through Powerlytics
Enea Polyhedra | Y | | For embedded systems; has a "lite" version for low (1 MB) RAM
IBM solidDB | Y | | Acquired in 2008
McObject eXtremeDB | Y | | Around since 2001; has a dedicated edition for financial services
SAP Sybase ASE | Y | | Original SQL server, v15.7 with SAP; heavily used in securities
SAP Hana appliance* | Y | Y | Launched first for analytics, but has a DB now also**
Lakshya CSQL MMDB | Y | | Real-time applications with single table
Oracle Exalytics appliance | | Y | Launched in 2011
Oracle TimesTen | Y | | Acquired in 2005
Teradata 700 appliance | | Y | Powered by SAS's High-Performance Analytics software
VoltDB (open source) | Y | | Community edition is open source

Hybrid and/or Competing Offerings
EMC Greenplum | Y*** | | With VMware's GemFire, competes with in-memory systems
HP Vertica | Y*** | | Supports hybrid architectures
IBM Netezza appliances | Y*** | | DW in a box; competes with in-memory systems
Microsoft SQL Server 2012 | Y*** | Y*** | With xVelocity engine and PowerPivot; not shipping yet
MonetDB (open source) | Y*** | | A column-store DB; largely held in memory
Teradata Aster SQL-H | Y*** | Y*** | Acquired in 2011; supports hybrid architectures

Data: Sreedhar Kajeepeta
* Hardware partners include Cisco, Dell, Fujitsu, Hitachi, HP and IBM
** Leverages P*Time acquisition of 2005 and TREX and MaxDB technologies
*** Qualified yes (see notes)

It's important to note that while broader road maps for platforms to manage big data are still evolving, there is enough depth and traction in available in-memory systems that IT can fully support a well-defined project, as part of a broader program to either create a new real-time business need or support an existing one. However, it's a good idea to do a detailed request for proposal to ensure that broader goals can be supported—these systems are expensive, and buying into the wrong vendor's vision can be a career-altering experience.

When comparing costs, weigh the total cost of running the business as is with batch reports/spreadsheets and groups of personnel engaged in that workflow. Include lost sales opportunities and savings that could be realized.

Remember, in-memory systems can be run on commodity hardware. For example, Steve Lucas, SAP's global executive VP of business analytics, database and technology, says certified Hana servers from partners such as Fujitsu cost as little as $12,000. In a blog post, Lucas wrote that 95% of enterprises today use between 0.5 TB and 40 TB of data. For this market, he says, at the low end (0.5 TB) the combined cost of hardware and software of enterprise-grade Hana is approximately $500,000.

In addition:

> Group data based on volatility to develop a hybrid storage approach; that will go a long way in demonstrating a sense of responsibility toward capacity planning and configuring the appliance.

> Define analytical models that can be true enablers for decision-making, in support of the tactical and/or strategic goals of the project. The speed with which such models can be iterated and/or created is the real draw for in-memory database and analytics, and must be taken advantage of.

> Consider sources. When coupled with the power of such high-performance data transformation (or ETL) systems as Hadoop, complex data collection and data scrubbing from many disparate sources can be well within the scope of an in-memory project.

> Conduct a proof of concept with an architecturally significant use case; put an emphasis on performance testing and inclusion of all surrounding technologies, such as workflow and UI/visualization tools.

> Look to the cloud for available infrastructure-, platform- and software-as-a-service offerings and consider open source options. Pilot projects, for example, could benefit from available Hana development services on Amazon EC2. Likewise, Hana-based platforms and analytical SaaS offerings will be coming out soon.

The Players, the Hurdles

In Figure 5, where we run down in-memory vendors and their offerings, you will see that there are many qualified yeses. That's a reflection of the early stage that this market is in and of the way players are scrambling to say "me too." Inclusion in our matrix is based more on the fit for a broader high-performance computing setup as opposed to a strict in-memory computing offering.

FAST FACT: 60% of respondents to our Big Data Survey are somewhat or very likely to invest in technologies to manage big data initiatives within the next year.

High-performance analytical (BI/BA) and visually engaging query tools to complement in-memory databases include (but are not limited to) MicroStrategy, the R open source statistical software, Tableau's VizQL visual SQL offering, Tibco's Spotfire and QlikTech's QlikView.

In making a move toward in-memory technologies, lack of budget may be the biggest roadblock. When we asked our Big Data Survey respondents about 10 barriers to successful management of their big data, budget constraints was the No. 1 response, by 13 points (Figure 6). Pilots must be granular and laser sharp, while the business case justification should amortize costs over a period of time to include other milestones in the journey to be a real-time business. And be warned: Standalone TCO arguments won't work unless they are for established real-time businesses. You need to build aggregated arguments for new transformations. There are other issues:

Figure 6: Barriers to Successful Big Data Management
What are the barriers to successful management of big data at your organization? (Multiple responses allowed)
Budget constraints: 57%
Lack of knowledge of big data tool implementation: 44%
Cost and availability of training: 41%
Operational needs or lack of strategic resources: 38%
Higher strategic priorities for IT: 37%
Lack of expertise or experience: 34%
Lack of knowledge of available big data management tools: 31%
Lack of management support: 25%
Lack of other resources (IT, fiscal, etc.): 21%
Difficulty finding and hiring qualified professionals: 17%
Other: 3%
No barriers exist: 6%
Data: InformationWeek 2012 Big Data Survey of 231 business technology professionals, December 2011

> Because of the newness of in-memory technology, there are no shortcuts to testing and learning. Seek a cost-effective means to create training grounds, including leveraging cloud-based and open source systems.

> Where SQL is a need for reasons of legacy support and/or interoperability, latency will be adversely affected.

> Disaster recovery safeguards such as transaction logging and save-points to disk, or use of battery-backed nonvolatile RAM systems, do exist, but again, they will have cost/latency implications based on specific implementations and offerings.
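The transaction-logging safeguard mentioned above can be illustrated with a toy write-ahead log; this is a sketch of the general technique, not any vendor's recovery mechanism. Each update is appended and fsync'd to a log file before the in-RAM structure is updated, so a restart can rebuild the store by replaying the log. That per-write fsync is precisely the cost/latency implication the text warns about.

```python
import json
import os
import tempfile

class LoggedDict:
    """Toy in-memory store with a write-ahead log for crash recovery."""

    def __init__(self, log_path):
        self.log_path = log_path
        self.data = {}
        if os.path.exists(log_path):       # recovery: replay the log
            with open(log_path) as log:
                for line in log:
                    key, value = json.loads(line)
                    self.data[key] = value

    def put(self, key, value):
        # Durability first: force the record to stable storage before
        # acknowledging the in-memory update.
        with open(self.log_path, "a") as log:
            log.write(json.dumps([key, value]) + "\n")
            log.flush()
            os.fsync(log.fileno())         # the latency cost of durability
        self.data[key] = value

path = os.path.join(tempfile.gettempdir(), "toy_wal.log")
if os.path.exists(path):
    os.remove(path)

store = LoggedDict(path)
store.put("position:AAPL", 1200)  # key names are purely illustrative
store.put("position:AAPL", 950)   # the later update wins on replay

recovered = LoggedDict(path)      # simulate a restart after a crash
print(recovered.data)  # {'position:AAPL': 950}
```

Real systems batch log writes, checkpoint to snapshots and truncate the log, or use battery-backed NVRAM, trading complexity or cost for lower write latency.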



The deciding argument might just be the fact that not jumping in now could result in a lost opportunity, if the business were to come up with an acute need for a platform to leverage big data in the near future. This is not something you can do on the fly.

The discipline of developing in-memory systems involves more than just plopping an appliance into the enterprise reference architecture. It's a broader shift to potentially moving all your data into a centralized source to transact with, report from and analyze. The expertise related to this transition will involve laying such a data foundation while baking in the right balance of handling unstructured data and events, as we discussed earlier. As long as the road map and vision for a data foundation that is holistic and aligned with business goals is detailed up front, the transformation can happen in nondisruptive and graduated stages. Make sure you budget for either staff training and certification programs or professional services.

What the Future Holds

One thing is certain: The excitement surrounding in-memory technologies is warranted; these systems will continue to gain traction in mainstream business applications as we move to be more real time, more responsive and more insights-driven.

SAP's execution of the Hana road map is an example of the acceptance and support that in-memory computing is gaining. A few related developments include the following:

> SAP is expected to extend Hana database support to the entire SAP Suite, including ERP, by 2015; the aim is to move Hana from the current analytics play to analytics plus database.

> SAP has enhanced its existing Business Warehouse for conversion to Hana, and the possibility of treating the Hana database as an application engine is also in play. Predictive and statistical engines are candidates for the in-memory platform.

> The latest Hana service pack comes with integration support for Hadoop, opening the door for reading and processing vast volumes of distributed big data, mostly unstructured.

> To demonstrate the system's high-performance computing and scalability features, SAP demonstrated a 100-node Hana cluster, with IBM hardware, that included 100 TB RAM and 4,000 Intel cores.

> As discussed, Hana development services are now available on Amazon EC2, and SAP is also developing a PaaS platform with Hana and new NetWeaver technologies. It's code-named Project River and will support a set of programming environments, including River Description Language, Spring and Rails. It will also come with Hana database-as-a-service.

Speaking of in-memory and cloud, RAMCloud, an initiative at Stanford University's experimental data center lab, aims to create very-large-scale, fault-tolerant and distributed in-memory data clouds of hundreds of terabytes to a petabyte. The result would be to slash latency and deliver five- to 10-microsecond response times from application servers in the same data center. Such efforts will take "in-memory-DB-as-a-service" to new levels of performance, scalability and affordability, leveling the playing field for businesses that want to take advantage of ultrafast in-memory analytics.


Want More Like This?

InformationWeek creates more than 150 reports like this each year, and they're all free to registered users. We'll help you sort through vendor claims, justify IT projects and implement new systems by providing analysis and advice from IT professionals. Right now on our site you'll find:

Strategy: Hadoop and Big Data: The open source Hadoop ecosystem of tools and technologies can help companies tackle the broad problem of big data analytics, but not every project is Hadoop-appropriate. It is important for IT and business professionals to understand the security and privacy concerns around the technology. In this report, we examine where Hadoop came from, how it can be used today and how to determine whether it is the right solution for your company's big data needs.

Research: 2012 BI and Information Management: Our 542 respondents say mobile, cloud computing and, above all, analytics are making their mark within nearly every IT category. That's the case despite the fact that 63% worry about data security in using SaaS/cloud-based BI/analytics and 47% foresee integration issues.

Strategy: Securing the Data Warehouse: Many enterprises are building data warehouses to centralize the ever-increasing information flowing through their organizations. This makes good business sense, but it opens up a slew of security concerns. IT pros can apply many of the security best practices used with databases, but there are new lessons to be learned, as well.

PLUS: Find signature reports, such as the InformationWeek Salary Survey, InformationWeek 500 and the annual State of Security report; full issues; and much more.