September 2015 • datacenterdynamics.com

The Business of Data Centers

THE HEART OF IOT
DATACENTERDYNAMICS • VOLUME 04 ISSUE 07 • SEPTEMBER 2015

Cover blurbs: “Get my bytes to the cool aisle now!” • “Hey, back-up, I’m right on the edge” • “Watch my bits!” • “You guys CRAC me up!”

Contents – September 2015, Vol 04, Issue 07

Our tracks guide your path through DCD magazines and events:

IT + Networks – Understand the implications of IT and network transformation on design and architecture.
App > Cloud – Managing blended infrastructures and delivering critical ICT applications through the cloud.
Design + Strategy – The big picture: organisational strategy and design issues for on-premise data centers.
Security – Ensuring reliability and protecting online data and services from the data center.
Critical Environment – Issues faced by professionals who manage the performance, efficiency and resilience of the critical environment.

Cover Story: The Heart of IoT – Will data centers be capable of handling the volume of traffic the IoT threatens? (p36)

5 Meet the team… …and send us your thoughts
6 Editorial – Peter Judge explains why water ‘as a basic need’ is often overlooked
8 Infographic – The news in figures that you can understand
News – From subterranean tunnels in Norway to explosions in LA
18 Asia Pacific news – Paul Mah on who’s shaping this growing region’s future
20 Apac Awards finalists – As the day of judgment approaches, it’s the final line-up
21 Jon Koomey – Our data centers run our businesses efficiently. Or do they?
22 Latam – Patricia Mármol blasts us off with The Mexican Space Agency
24 DCD Pro – What’s on the agenda in training
25 Open Compute in action – What bare metal products can you actually buy?
28 Ian Bitterlin – Pouring cold water on those wishing to reinvent PUE
30 Clouding the waters – Our global editor Peter Judge gives his views on the PUE storm
32 On the wagon – Do our data centers drink too much? How do we rehab them?
34 DCD Converged Europe – London is all about disruption, up and down the IT stack
39 DCD-as-a-Service – The latest speakers and topics at the Hilton, Chicago, October 11
40 Storage – Max Smolaks on storage market trends and why flash will take over the universe
42 Awards – The latest news about the most important night of our year
43 Dates for your diary – Highpoints of the DCD calendar
46 Viewpoint – Can cloud really work while we allow so many outages to occur?


Meet the team

Subscriptions: www.datacenterdynamics.com/magazine
To email one of our team: [email protected]

Bill Boyle, Global Managing Editor, @BillBoyleDCD – Power Man (Design & Strategy). Covers power, finance, mergers, corporate & telco data centers.
Peter Judge, Global Editor, @PeterJudgeDCD – Green Guru (Critical Environment). Also, open source, networks, telecoms, international news.
Max Smolaks, News Editor, @MaxSmolaksDCD – Captain Storage (IT & Networks). Also, international incidents and data sovereignty.

ADVERTISING
APAC: Jimmy Yu
EMEA: Yash Puwar, Vanessa Smith
LATAM: Daniel Clavero, Santiago Franco
USA: Kurtis Friesen, Jai Wallace

David Chernicoff, US Correspondent, @DavidChernicoff – Former CIO, test lab leader and developer. Our man in Philadelphia gets his hands dirty.
Virginia Toledo, Editor LATAM, @DCDNoticias – Editor of the LATAM edition of DatacenterDynamics. Breaking the moulds and based in Madrid, Spain.
Patricia Mármol, Assistant Editor LATAM, @DCDNoticias – Patricia is immersed in technology and says there is no better job than journalism.

DESIGN
Head of Design: Ross Ellis
Designer: Jess Parker

PUBLISHING DIRECTOR
Jon McGowan

CIRCULATION
Manager: Laura Akinsanmi

Michael Kassner, US Contributor, @MichaelKassner – Our man in Minnesota. Fifteen years’ enterprise and IT writing on technology, science and business.
Paul Mah, SEA Correspondent, @paulmah – IT writer also teaching tech in Singapore. Deep interest in how technology can make a difference.
Tatiane Aquim, Brazil Correspondent, @dcdfocuspt – Our Portuguese-speaking correspondent with an in-depth knowledge of Brazilian public sector IT.

Find us online: datacenterdynamics.com • datacenterdynamics.es • datacenterdynamics.com.br • twitter.com/DCDnews • Join the DatacenterDynamics Global Discussion group at linkedin.com • iPad App on Apple’s App Store @DatacenterDynamics

PEFC Certified – This product is from sustainably managed forests and controlled sources. PEFC/16-33-254 www.pefc.org

UNITED KINGDOM: 102-108 Clifton Street, London EC2A 4HW, +44 (0)207 377 1907
USA: 28 West 44th Street, 16th floor, New York, NY 10036, +1 (212) 404-2378
SPAIN: Calle Alcalá 4, 3ª Planta, 28014 Madrid, España, +34 911331762
SHANGHAI: Crystal Century Tower, 5/F, Suite 5B, 567 Weihai Road, Shanghai, 200041, +86 21 6170 3777
SINGAPORE: 7 Temasek Blvd, #09-02A, Suntec Tower One, Singapore 038987, +65 3157 1395

© 2015 Data Centre Dynamics Limited All rights reserved. No part of this publication may be reproduced or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, or be stored in any retrieval system of any nature, without prior written permission of Data Centre Dynamics Limited. Applications for written permission should be directed to Jon McGowan, [email protected]. Any views or opinions expressed do not necessarily represent the views or opinions of Data Centre Dynamics Limited or its affiliates. Disclaimer of liability: Whilst every effort has been made to ensure the quality and accuracy of the information contained in this publication at the time of going to press, Data Centre Dynamics Limited and its affiliates assume no responsibility as to the accuracy or completeness of and, to the extent permitted by law, shall not be liable for any errors or omissions or any loss, damage or expense incurred by reliance on information or any statement contained in this publication. Advertisers are solely responsible for the content of the advertising material which they submit to us and for ensuring that the material complies with applicable laws. Data Centre Dynamics Limited and its affiliates are not responsible for any error, omission or material. Inclusion of any advertisement is not intended to endorse any views expressed, nor products or services offered, nor the organisations sponsoring the advertisement.

Fresh views on water

Water is a basic need, for people and data centers, but one that can be so obvious it gets overlooked. It’s also an environmental concern as parts of the world suffer droughts. Should data centers worry? Michael Kassner thinks so (p32), but Professor Ian Bitterlin points out that our usage of water is small in comparison with, say, Coca-Cola bottling or the urinals in McDonald’s restaurants (p28).

Data center people have been considering water usage for some while, but with no real sense of urgency. There’s a trickle of interest in the subject now that might grow. A proposal has emerged to include evaporative energy in the warhorse PUE measure (p30), and this has been vigorously opposed by people including Bitterlin.

Water and power are interlinked – you can use water to reduce your electricity consumption, and in the process (says Bitterlin) reduce overall water use. Efficiency levels are shocking, Jonathan Koomey reminds us (p21), and only a consideration of the entire system can improve this.

One way to get at the entire system will be to instrument everything and use analysis to find where things can be improved. This is part of the promise of the Internet of Things (IoT): that by measuring everything we can improve everything. When the IoT hype machine gets going, data center people will have a sense of deja vu: our industry pioneered a lot of IoT concepts (p36) with the fabled approach to DCIM (data center infrastructure management).

All this instrumentation, applied to water distribution systems, will increase efficiency, both within the data center and in the world at large. One of the benefits of the IoT could be that it makes sure we don’t overlook the obvious.

Sidebar: 1,000 m³ – volume of water per year for evaporative cooling of a 1MW load
Pull quote: “Water can be so obvious we overlook it. Perhaps the IoT will ensure we don’t do this.”

• Peter Judge – Global Editor @PeterJudgeDCD


[Infographic: EU Data Center Spend by offering and major country, 2014, total €48.8bn. Charts show spending by offering (server, storage system, networking, infrastructure, software, IaaS/PaaS) as a percentage of total country spend for Germany, France, the UK and the rest of the EU, and each region’s share of total EU spend (shares shown: 38%, 23%, 22% and 17%).]


MAKING THE WINNING MOVE ...requires careful planning. DCD’s Intelligence Division delivers analysis and advice on every aspect of the data center industry to help guide you to the right conclusion.

• Better-informed decisions • Effective planning • Accurate goal-setting • Efficient budgeting

• Regional trend analysis • Market measurement & sizing • Global spend forecasts • Emerging market analyses • Vendor briefings and case studies

DCDI research is available either on annual subscription or as a bespoke consultancy service. Contact us now to discuss your requirements: [email protected] www.datacenterdynamics.com/research

News Roundup

Dell launches scalable unit
Dell has created a Datacenter Scalable Solutions (DSS) line selling optimized x86 servers for data centers in heavy-duty processing businesses such as telcos, web tech and hosting companies, and energy and research organizations.

Lightning hits Google data
Lightning strikes affecting a Google data center in Belgium caused some persistent disk storage customers to lose data, as some storage systems were susceptible to power interruptions at Google’s St Ghislain data center. Google says permanent data loss occurred in less than one millionth of a percent of the zone’s persistent disk space.

Compass eyes southern belles
Compass Datacenters bought rights to build facilities in three new sites, potentially bringing as many as nine of its standard-build data centers to 43.2 acres of land in Georgia, Texas and Ohio. It’s the company’s first foray into southern states.

Labyrinthine tunnels in Norway to house giant data center
A new data center in Norway will have more than a million square feet of space along a maze of underground “streets.”
Built in an old mineral excavation in the Sogn og Fjordane region between the west Norwegian ports of Måløy and Nordfjordeid, the Lefdal Mine, due to open in Måløy in August 2016, could potentially offer 120,000 sq m (1.3m sq ft) of space built into a mountain, and situated within 100 metres of a deep fjord. This will give free sea-water cooling systems, while CO2-neutral energy comes from a variety of local hydroelectric generators.
One of the streets will house three floors of traditional space, where racks and cabinets can be installed. Each of the six levels of the old mine site will be connected by a central access road providing direct access to vacant chambers.
The project has backing from both IBM and Rittal. IBM, its first partner, will move in when the facility goes live, offering resiliency and backup services to its customers, according to Lefdal, while Rittal’s parent, the German-based Friedhelm Loh Group, has signed a lease agreement under which Rittal will supply containers and infrastructure.
A customized container can be fitted-out, shipped to Lefdal, hooked up, and brought online in six to eight weeks, said Lefdal’s marketing director, Mats Andersson. Customers wanting to work with other container vendors can do so, though everyone has to take the infrastructure, which will be Tier III certified.
But others poured scorn on the idea. Andrew Jay, head of data centers at CBRE, said: “It’s a great location, cheap, it’s green and cool. But no one is going to compromise a mission-critical operation to save a few quid on their power bill.”
http://bit.ly/1NK0hjW

5 DATA CENTER BEST PRACTICES
Thermal Efficiency • Risk Management • Network Migration • Power Optimization • DCIM Enablement
Learn how you can benefit on page 14

RELIABILITY HAS A NAME. AND A NUMBER. 4DEGREES.

4DEGREES | CERTIFIED TIER 3 DATA CENTER, ULTRA-RELIABLE AND SECURE FOR YOUR SERVERS.

1-888-447-7225

DATA CENTER 4degrees-colo.com

News Roundup

Facebook tax deal for third site in Oregon too hard to pass up

Local authorities in Oregon have signed off a multimillion-dollar tax deal that will bring a third Facebook data center to Prineville, a city in Crook County with a population of about 9,200 as of the 2010 Census. Apple opened a data center there in 2013 with little fanfare.
The 15-year deal was approved unanimously by Prineville and Crook County authorities, according to Oregon Live. Facebook still has not formally announced it is going ahead, as it has been looking at an alternative site, but an announcement is expected by early next week.
Tax sweeteners for data centers have been a controversial issue, but Prineville clearly sees a benefit. With two large data centers and a “cold storage” archiving facility in Prineville, Facebook has spent nearly $780m in a town of only 9,000 people – 147 of whom work for Facebook. At 487,700 sq ft (45,000 sq m), the new building will be bigger than either of the two existing data centers.
Facebook gets property tax exemptions, which have saved it $30m in three years, according to Oregon Live. In exchange, it pays a flat fee of $110,000 each year for the existing data centers. The new deal will add a further $190,000 a year.
Elsewhere in Oregon, The Dalles in Wasco County is offering tax incentives to Google, while other states are honing their offerings. Arizona is wooing further Apple investments with tax offers, and Philadelphia also has a tax regime designed to encourage new data centers.
http://bit.ly/1NXCV8D

Ireland has high hopes of Facebook €200m facility

Meath County Council has approved Facebook’s plan to build a €200m ($220m) data center in Clonee, County Meath.
The center will be built in two phases over the next 10 years on a 220-acre site about a 30-minute drive from Dublin. News first leaked out when Facebook began recruiting electrical engineers and data center managers in Ireland in May. No start date is set, as the permission could still face objections.
“This is the best news story that I have seen in County Meath in relation to industrial development in my 30 years at the council,” said Brian Fitzgerald, chairman of Meath County Council, speaking on Louth Meath FM radio.
The first phase of the build will comprise eight data halls and office space in two buildings, with a gross floor area of 50,800 sq m, designed to consume as much as 72MW of power. There will also be a new 220kV electrical substation on the site.
The entire campus is expected to employ around 100 staff, thanks to a level of automation already displayed in Facebook’s facility in Luleå, Sweden. However, Facebook has said the site will bring millions of euros of business to Ireland and support thousands of skilled jobs.
Ireland remains a popular location for data centers, with one operator recently asking for permission to build its fifth data center in West Dublin, and Apple spending $930m (€850m) on a massive data center in a forest in Galway.
http://bit.ly/1hidd41

Explosion rocks data centers in California’s City of Angels

An explosion shook the basement of 811 Wilshire Boulevard, causing a fire that was put out by 160 firefighters and which injured two people. The blast is believed to have started in the building’s generator, and the resulting fire caused network and power outages to the surrounding areas of Downtown Los Angeles, leading to problems with nearby data centers.
The fire took out an on-site power station in the building, which briefly left 12 buildings without electricity. At least two data centers lost utility power – the Equinix LA1 data center on West 7th Street, and CoreSite’s LA1 center in the One Wilshire building. However, both switched over safely to generator power until the utility power returned.
Some disruption was experienced by network provider Level 3 Communications, located in the Equinix data center, which also affected its local customers. Those affected at the Equinix site included Internap and its customers, including remote access firm LogMeIn. Customers at One Wilshire seem to have escaped without network troubles.
The blast also caused a few minutes’ break to a Shania Twain concert in the nearby Staples Center in Downtown LA.
http://bit.ly/1ND3ToR


Advertorial: Anixter Battling for Thermal Supremacy in the Data Center

When poor airflow literally leaks money from your business, lines are drawn up and sides are chosen. Find out how proper thermal management can help restore harmony.

Facilities vs. IT. When it comes to budget, efficiency and uptime, these two groups are usually at loggerheads on the right course of action to take – and usually have different mandates and objectives from executives that have the two working against each other.
A frequent source of friction between the two is a common problem for all data centers: thermal management. Facilities groups are typically tasked with delivering cold air as efficiently as possible to the IT equipment that supports the business. IT wants availability and the means to install equipment quickly – regardless of how it might affect the data center’s overall thermal dynamic.
This sets the battle: IT is rapidly adding equipment to support business needs and facilities is rationing the cooling supply. But is there a way to get these two groups on the same side, working together? Yes there is, and it’s simpler than one thinks.
“The conventional thinking is that if there are hot spots in the data center, just throw more cooling at it,” said Dave Wilson, Director of Technology for Data Centers at Anixter. “But we need to focus on the delivery of airflow to increase cooling efficiencies, which includes better airflow management practices.”

The Cost of Bad Strategies
Data center architectures and designs are rapidly changing, and they can affect overall thermal efficiency. Operators are facing more pressure from the business to be efficient and reduce operating expenses while managing an increase in IT capacity needs.¹ So as IT requires more cooling to support its growing infrastructure, facilities has started to look at how to optimize its delivery. One of the reasons facilities is focused on thermal efficiency is because on average cooling accounts for 37 percent of a data center’s energy consumption² – 12 percent of that is just to move the air.
“Part of the problem is bad airflow management strategies, which leaks money,” said Tim Hendricks, Director of Advanced Technology Services for Data Centers at Anixter. “The current strategy in many data centers is akin to having the air conditioning on at home but with the windows open. Turning down the AC when you have leaks doesn’t make sense for a house and certainly not for a data center.”
But there is a way to plug those leaks. By using airflow segregation strategies, taking advantage of free cooling when possible, and adhering to new industry guidelines, it’s possible to improve efficiency, save on operating costs and lower overall PUE. And when it comes to facilities and IT, it’s possible to keep both sides happy by lowering overall costs while keeping servers running at the right temperature to ensure uptime.
The next issue will cover the latest passive cooling and containment strategies and how changing industry practices can get these two sides working and being successful together.

Caption: Thermal management: Aren’t we all on the same side?

1 DCD Intelligence 2014 Global Census Data
2 Source: EYP Mission Critical Facilities Inc., New York

Contact Details – Phone: 1-800-ANIXTER; Email: anixter.com/datacenterdcd

Rackspace to support AWS and Azure

Managed cloud provider Rackspace has confirmed it will offer support for customers of the Amazon Web Services (AWS) cloud, following an earlier announcement that it would resell Microsoft’s Azure public cloud. Rackspace offers its own public cloud, but this business has not been growing, while AWS dominates the space.
In July Rackspace announced it would resell and support Microsoft’s Azure. Support for AWS had been rumored, but was finally confirmed by Rackspace in an earnings call with the CEO, Taylor Rhodes.
Rackspace is “building an offering for customers who want specialized expertise and Fanatical Support on the AWS cloud,” said a company statement. No details were available at press time, but Rhodes said an AWS package would be available before the end of the year, after admitting that growth in Rackspace’s own public cloud service in the second quarter of 2015 had not met the company’s hopes.
The AWS service will likely resemble the Azure service, which Rackspace says fills a gap in the Azure market for support and implementation services. Rackspace is known for its managed cloud and supports customers’ private clouds, as well as VMware’s vCloud and its own public cloud based on the OpenStack open source cloud platform.
AWS initially had “low-touch” support, but this has led to a strong do-it-yourself ethos around the platform’s early users. Since then the sheer scale of the AWS market has called other support options into existence, including official AWS consulting partners such as CapGemini, CSC, Wipro, Accenture and Datapipe.
http://bit.ly/1NXD8bY

Robot opens Switch’s SuperNAP 9 in Vegas

Nevada-based Switch has opened more colocation space in Las Vegas, with local robot star Metal Rebel cutting the ribbon for the SuperNAP 9 site. The new 470,000 sq ft (about 44,000 sq m) colocation center supports up to 50MVA of power and brings Switch’s footprint in Vegas to more than 1.5 million sq ft. It is expected to have a PUE of 1.18.
Metal Rebel was created by the University of Nevada, Las Vegas (UNLV), and can climb stairs, drive a vehicle and use power tools – skills that won it eighth place in the annual US Defense Advanced Research Projects Agency (DARPA) robot challenge in June.
The site has Tier IV design certification and Switch says it will also apply for a Tier IV Constructed Facility Certificate, as well as a Tier IV Gold Operational Sustainability Certificate from the Uptime Institute. Assuming it gets them, that would make it one of only two Tier IV Gold neutral colocation sites in the world, says Switch.
Switch has said its goal is to eventually run all its data centers using 100 percent renewable energy. “Switch was started with the idea that data centers needed to not only be able to handle the scale of the internet but to do so in the most efficient manner possible,” said Rob Roy, who invented Switch’s cooling system, known as Wattage Density Modular Design (WDMD).
Switch has a long-standing relationship with Metal Rebel’s creator, housing the university’s Cherry Creek Supercomputer in its Las Vegas 7 facility. Cherry Creek, built with Intel, ranks among the world’s fastest and most powerful supercomputers.
http://bit.ly/1MR5kys

Vapor and Bloom back low-emission centers

Vapor IO, whose hyper-collapsed data center design is based around cylindrical chambers, has teamed up with fuel-cell provider Bloom Energy to offer low-emission data centers.
Vapor IO offers an open source hardware approach based on wedge-shaped racks built in circular “vapor chambers” that improve density and cooling by convection currents. Bloom provides fuel cells that make “clean” energy from hydrogen or hydrocarbons. Together the companies will build a reference model that couples the two in a modular data center design.
While power distribution is traditionally a separate system from the IT within racks, the two companies will work together on ways to integrate them more closely.
Vapor’s hardware is designed around an open source management interface called the open data center runtime environment (OpenDCRE), which is, in effect, an open source DCIM platform, published by the Open Compute Foundation. Vapor will give Bloom access to CORE, the commercially developed intelligence layer it offers on top of OpenDCRE, so the power generated from the Bloom Energy Servers can be routed more intelligently.
Each Vapor chamber has six racks and draws up to 150kW. Bloom Energy Servers are available in various configurations up to 200kW, which is enough to power multiple chambers at more normal densities.
Vapor emerged from stealth in March 2015, while Bloom has been around for 15 years, during which time it has had $1bn of investment to commercialize fuel cells originally designed for Nasa and intended for space exploration.
http://bit.ly/1fIGljz

Sidebar: $1bn – investment Bloom has received over 15 years to commercialize fuel cells intended for Nasa

News Roundup

Caption: Buildings 2km away suffered severe damage and casualties in the blast. Image credit: Jason Kotteke

Port of Tianjin blast took out supercomputers

A series of blasts at the port of Tianjin, China, damaged a nearby data center and took down Tianhe-1A – once the world’s most powerful supercomputer.
According to information on Sina Weibo, the area serves as a hub for technology companies, with Hewlett-Packard, recruitment site Liepin, online classifieds site 58.com and mobile device manufacturer Ahmed Technology all located in the port’s immediate vicinity.
Chinese internet giant Tencent was forced to close its data center temporarily after the blast managed to fling debris as far as the server rooms. Meanwhile, the National Supercomputing Center in Tianjin had to shut down its machines as a precaution, after suffering damage to the exterior of the building.
The explosions in Tianjin were so powerful they registered as minor earthquakes, and so bright they were visible from space. Apartment blocks located as far as 2km from the site had their windows shattered, more than 100 people were killed, and several hundred were injured. The exact cause of the tragedy is not clear, but evidence collected so far points at an industrial accident caused by breaches of regulations and safety procedures.
In the aftermath of the explosion, Tencent had to take its data center offline, causing disruption to video services. The company said the repairs have already been completed, and the facility is now operating at full strength. No data center employees were injured.
The National Supercomputing Center was also caught in the blast, which shattered its windows, damaged the façade and caused some internal ceilings to collapse. The center decided to switch off Tianhe-1A, the fastest computer in the world from October 2010 to June 2011 (now residing at number 24). Its successor, Tianhe-2, is currently the most powerful machine in the world – a position it has kept for more than two years.
State-owned Xinhua News reports that no hardware was damaged, since Tianhe-1A resides in a reinforced computer room – it was shut down manually due to security concerns.
http://bit.ly/1LyHS7v

VOX BOX / DCD VIDEO

What’s the deal with SKUs?
Hardware SKUs is a term borrowed from the retail industry to describe unique server configurations. Containerized applications and web-scale computing will reduce the number of SKUs present in the data center, leading to the Holy Grail of hyperscale – a facility running with a single SKU.
T J Kniveton, Vice president, Salesforce – http://bit.ly/1En62lO

How will containerized workloads change things?
Linux-based application containers are making virtual machines obsolete. The new approach, pioneered by companies such as Docker, has advantages including application portability, improved resource efficiency and lower I/O latency.
Bryan Cantrill, CTO, Joyent – http://bit.ly/1hicf7S

News Roundup

Netflix closes data centers and goes to Amazon’s public cloud

Netflix will have moved completely to the public cloud this summer, when it shuts its last remaining data center. The company began moving to Amazon’s AWS after a major failure in its data center in 2008, and will now become one of the first major companies to move completely to the public cloud.
Netflix’s customer-facing streaming business has been 100 percent cloud based for some time, the announcement explains, but the process to move the entire business to the cloud has taken eight years, as Netflix’s technology and cloud delivery have matured. The transition has been gradual, with different components of the operation being completely transitioned to the cloud in a planned exodus from the data center. Along with customer-facing streaming, Netflix moved its Big Data platform in 2013, and 2014 saw billing and payment infrastructures make the move.
Netflix is not, however, giving up on its own content delivery network, which it has been deploying over the same period of time to certain service providers. It will continue to install its own servers at those service providers, and will maintain control of the service delivery network outside of the AWS ecosystem. This delivery system is more important than the data center in maintaining a high level of performance to Netflix’s end-user customers, caching video at the ISP site in order to improve the performance of the streaming service and not requiring the network traffic to move outside of the user’s local provider.
Still, with the closure of its final data center, all of Netflix’s IT operations will be cloud based, split between AWS and its own CDN.
http://bit.ly/1hidFza

Equinix lands trans-Atlantic cable

Equinix will deploy and sell fast links on a new $300m cable joining Ireland with New York. The 5,400km America Europe Connect (AEConnect) cable from the West Coast of Ireland to Long Island is being laid by the end of 2015 and will provide more than 52Tbps of connectivity. It is being laid at the same time as the Hibernia Express cable, and both are intended to provide more capacity and lower latency between Europe and America. The route between New York and London is one of the world’s busiest. It is the second largest internet traffic route globally, carrying multiple terabits of peak traffic.
http://bit.ly/1U8JdJL

Google and Amazon covet Tata business

Tata Communications – a subsidiary of India’s largest conglomerate – is reportedly selling most of its data center business, with Google, Amazon and a number of private equity funds all lining up as potential buyers. On offer is a 74 percent stake in the division that owns and operates 44 cloud and colocation facilities, most located in India. Tata Communications – primarily known as a network infrastructure provider – established its data center business to take advantage of the Indian boom in colocation and managed hosting, and spun it off as a wholly-owned subsidiary in 2014.
http://bit.ly/1i557MG

South East Asia’s data centers look forward to 2020

Paul Mah talks to the movers and shakers in a region gearing up for several years of growth

Key trends
Growth – 35 million internet users added in three months
Green data centers – Corporate social responsibility and government support
Public cloud – Traditional data centers transform to the cloud
Connectivity – Direct links using dark fiber are gaining ground

Asia’s internet recently increased by 35 million users in three months (the equivalent of the entire population of Canada), says Joe Kava, vice president of data centers at Google. That’s why he’s announced a second data center in Singapore, barely 18 months after Google opened its first center there. We asked some industry experts to predict what kind of growth this would generate.
Green data centers are growing in the region, says Martin Ciupa of Green Global Solutions. Facebook, Microsoft, Apple and Google are spending billions on energy efficiency, and in Singapore there is a strong government drive to promote green IT – in new-builds as well as retrofits – as a way to cement the city-state’s regional leadership.
Not everyone agrees, though. Simon Piff, associate vice president of enterprise infrastructure research at IDC Asia Pacific, says green data centers have “pretty much stalled” in the Asia Pacific region, apart from in Singapore, Australia and New Zealand.
“In reality, the function of data center management is separate from the department that pays the power bill, in most cases, and so the industry’s attempt to market ‘green’ to the IT community has not hit the target market that cares about power costs,” says Piff.
Ciupa admits that mindsets need to change, but has observed that corporate social responsibility initiatives are becoming more important, and those companies that ignore green initiatives may tarnish their image, or fall foul of government regulations.
Wong Ka Vin, managing director of data center service provider 1-Net Singapore, wants to take green IT beyond increasing energy efficiency. He supports initiatives such as the BCA-IDA Green Mark for Data Centres in Singapore, but wants greater adoption of renewable energy strategies. “Energy efficiency does not equate to being green. We have to look at how we can develop reusable renewable sources.”

Pull quote: “Data centers that are not developing cloud-friendly strategies might be challenged” – Wong Ka Vin, 1-Net Singapore

The public cloud is growing and maturing quickly in South East Asia, with providers such as Amazon Web Services, GoDaddy, DigitalOcean, Rackspace and Linode establishing a physical presence here. “In Thailand, more and more enterprises are integrating cloud strategy to help grow their business,” says Waleeporn Sayasit, a director at TCC Technology, a large carrier-neutral data center operator. “I believe that traditional data centers will soon transform into cloud data centers.”
In its own data centers, TCC is one of the largest operators of hosted SAP applications in the region, with more than 10,000 users. Overall, data center operators do not feel threatened by the cloud, pointing out that it needs to be hosted in some physical data center. But Wong warned that data center operators need to be cloud-ready. “Data centers that are not looking into acquiring cloud infrastructure players, or developing strategies to be cloud-friendly, might be challenged in time,” says Wong.
Interconnected data centers is a final trend, in which operators eschew traditional telecom providers and use external points of presence, including rival data centers. In Malaysia, the high cost of connectivity has already boiled over and prompted Malaysia’s Multimedia Development Corporation to build a high-speed backbone. AIMS Cyberjaya Sdn Bhd has been appointed to connect data centers within Cyberjaya, which is located in the Multimedia Super Corridor region of Malaysia.
Meanwhile, Superloop, which runs fiber optic networks between Brisbane, Sydney and Melbourne, has expanded outside Australia into Singapore, offering services on dark fiber between Global Switch and Equinix data centers that act as carrier hotels in the country. “Superloop sees the rise in ‘cloud’ data center traffic as a key feature for its entry into the Asia Pacific region,” says Daniel Abrahams, CEO of Superloop. The company also has a Unified Carrier License in Hong Kong.
1-Net has established what it calls a “data center corridor” linking 14 sites operated by AT&T, BT, Digital Realty, Equinix, Global Switch, KDDI, Pacnet (now Telstra) and others.
It seems clear that a wave of inexorable trends is sweeping across the region. Multi-year IT contracts and the lead-time required to build a new generation of data centers will give existing players temporary respite. But data center operators must start now or miss an opportunity. There is a data center boom in the Asia Pacific region fueled by internet devices. The only question is, ‘Are you ready?’

Hear more about what the next five years will bring at DCD Converged SE Asia, on 15–16 September, in Marina Bay Sands, Singapore. More information on the multi-track conference program and the expo can be found online.

IT’S OFFICIAL

The 2015 Pan-Asia Awards received over 80 entries from 12 nations – a new record! Congratulations and good luck to all the finalists:

1. Enterprise Data Center Award
2. Internet Data Center Award
3. Service Provider Data Center Award
4. Modular Deployment Award
5. Data Center Critical Environment Team of the Year
6. Critical Environment Future Thinking Award
7. The Energy Efficiency Improver’s Award
8. Data Center Transformation Project of the Year
9. The “Open” Data Center Project Award
10. Cloud Journey of the Year
11. The Data Center Evangelist Award – Judges’ Choice: awarded on the night at the judges’ discretion
12. Award for Outstanding Contribution to the Data Center Industry – Judges’ Choice: awarded on the night at the judges’ discretion

Category sponsors include Huawei, Canberra Data Centres and Equinix. Finalists across the categories include: Huawei Langfang Cloud Data Center, Equinix, Global Switch, IO Singapore, Canberra Data Centres, CenturyLink, Telstra, Netmagic Solutions, Vodafone India, Digital Realty, IBM India, Hitachi Sunway, Infosys Data Centre Services, NTT Singapore, ITC, Nanyang Technological University Singapore, Riot Games (Open Compute Project), Dimension Data Asia Pacific (on behalf of Kathmandu), National Stock Exchange of India, Chinacache Xin Run Technology, StarHub, Integrated Healthcare Systems, Shanghai Huadu Architecture & Urban Planning Co, Beijing UniCloud Innovation Technology, Hong Kong Institute of Vocational Education (Lee Wai Lee) and NEC.

Opinion

Facing the efficiency scandal

Enterprise data centers are shockingly wasteful, says Jonathan Koomey

Twenty-first century data centers are the crown jewels of global business. No modern company can run without them, and they deliver business value vastly exceeding their costs. The big hyperscale computing companies (such as Google, Microsoft, Amazon and Facebook) extract that business value well, but enterprises whose primary business is not computing do not do so well.
If you work in such a company, you know that data centers are often strikingly inefficient, with performance far short of what is possible. And by “far short” I don’t mean by 10 or 20 percent; I mean by a factor of 10 or more.
This waste should be shocking to anyone who cares about financial performance. In a recent study of 4,000 enterprise servers, my colleagues and I found that about 30 percent of these servers were comatose (using electricity but delivering no useful computing) – a result consistent with findings from McKinsey and the Uptime Institute. McKinsey found that typical enterprise server utilization “rarely exceeds” six percent. Many enterprises don’t even know how many servers they have, let alone their utilization figures.
These inefficiencies do not arise primarily from technical issues, but revolve around people, institutions and incentives. They start with fractured management chains that separate budgets for IT and facilities departments. Poor measurement and failure to apply the best technologies led to most data centers underperforming.
What can management do? First, centralize data center operations, with one boss, one team and one budget. Only then will the team focus on the system costs and benefits of any proposed changes.
Second, tie data center performance to business performance, map data center infrastructure costs onto business processes, and use metrics that show the business implications of data center choices. Every part of the business should be able to compare total IT costs with benefits at the project level. Most importantly, calculate (or at least estimate) total costs and revenues per computation.
Finally, the most advanced companies use the power of IT to transform IT. Apply measurement and engineering simulation to optimize data center operations. Standardizing on a few server designs instead of dozens or hundreds will reduce deployment times from months to days. Moving smaller computing users to internal clouds will reduce deployment times from days to minutes.
Modern companies require modern data centers, but transforming existing operations requires senior management attention. Those inside the data center can’t make it happen. Only management can begin transforming data centers from cost centers to cost-reducing profit centers – and that’s a result that everyone can cheer.

Pull quote: “Inefficiencies do not arise primarily from technical issues, but revolve around people and incentives”

Jonathan Koomey’s online class about modernizing data center operations will be available to managers, from October 5 through November 13, 2015, in collaboration with Data Center Dynamics and Heatspring.
http://goo.gl/K4kJG2
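Koomey’s “factor of 10” follows directly from the utilization figures he quotes; as a back-of-envelope illustration (the 60 percent target utilization is an assumption chosen only for the example):

\[ \frac{60\%\ \text{target utilization}}{6\%\ \text{typical utilization}} = 10\times \]

In other words, the same workload could in principle run on roughly a tenth of the servers – consistent with the 30 percent of machines found to be doing no useful work at all.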

Mexico secures cyber space

The Mexican Space Agency has created a technological development center to foster talent in the country, writes Patricia Mármol

Patricia Mármol, Assistant Editor LATAM, @DCDNoticias

As cybersecurity becomes more important, governments and companies are responding with safety systems to deal with DDoS attacks, persistent threats and identity theft. Alongside the United States, Mexico is doing its bit. The Mexican Space Agency, together with the national government, has created a technology development center to prevent possible attacks.
“Rather than protecting the country, the aim is to promote the development of technology for such protection, so that both the government and employers can adopt technologies developed in the center,” explains Hugo Montoya, director of innovation at Agencia Espacial Mexicana (AEM) – the Mexican Space Agency.
AEM was set up in 2010 to develop entrepreneurship and to capitalize on the space sector (the tradition of astronomy in Mexico dates back to pre-Columbus times), including satellites and related services such as ground antennas and data-processing centers for satellite imagery. In 2015, AEM announced that it intends to send a payload to the moon, using the Astrobotic lander that has spun out of Carnegie Mellon University in Pennsylvania.
“Although we are the new kids on the block, we want to contribute to the protection of Mexico from cybersecurity issues and natural disasters,” says Montoya. The agency will set up its security technology center in Torreon, in the state of Coahuila.
The agency will not stand alone in this. It has already begun working with bodies in other countries to share best practices and generate the best possible center for technological development. For this reason, the organization has already visited Spain to start collaborations with various universities, including Complutense

Make IT easy. The TS IT Rack – next level for data centre. More than 100 ideas for your applications. We now offer the TS IT network and server racks in more than 100 different predefined options. Thanks to its innovative and modular construction, which also includes a wide range of accessories, the delivery time for TS IT racks is significantly minimized. Now available in more than 100 predefined options. Click on this link to select your individual IT infrastructure: www.rittal.com/it-configurator or scan the QR code.

Latin America | Mexico

and King Juan Carlos University in Madrid.
The project is still taking shape but will develop in three stages. First, a series of events will identify the talents, and needs, of Mexico’s existing professionals. Then the size and scope of the center will be decided, along with its exact location. AEM expects the center to be operating in 2018, with projects that can be launched on the market.
AEM says the institution will run profitably, using an open laboratory with open source solutions: “We see good profitability and productivity, with an open collaboration platform,” says Montoya.
The Mexican tech market is familiar with computing languages such as Python, Ruby and Perl, all of which are in use within open source software development for satellite systems from various countries.
The security center will need its own data center, though, which Montoya ranks as important as the human talent the center will recruit: “The rest is an architectural shell,” he says.
This data center is still on the drawing board, but Montoya expects it to meet or exceed the design standards set by the Uptime Institute and be certified with Tier III reliability. It will be built through a public tender process and will be modular and scalable in design, so by the end of 2016, AEM should have agreed on the dimensions for its power and cooling systems.
The rough draft design is currently about 1,200 sq m (13,000 sq ft), including an auditorium, a training room, an administrative section, and a network and security monitoring center (NOC/SOC). The agency currently has a small data center, based on servers in office space, since it currently has only administrative data.
The core of the technology development center will be security, but it will also foster the skills of young enthusiasts and enterprises to develop projects, including big data, scientific computation and artificial intelligence, and accelerate economic development through new specializations. It will also help to boost the country’s national standing by ensuring much of the production is carried out in Mexico: “The idea is to have sovereignty over certain critical systems in the country.”
And, finally, the center aims to support Mexico’s talent: “Our idea in the space agency is to contribute to the benefit of society and put solutions developed by Mexican talent in the service of the government and industry,” Montoya adds. The center will also aim to export products to the international market and contribute directly to Mexico’s economy.
Despite having definite goals, AEM will have to be a light touch, not judging but encouraging developments so as “not to hinder the development of disruptive technologies,” says Montoya.
The bottom line is to strengthen competition through innovation. “The center will facilitate the maturation and completion of technology projects,” he says.

AEM: Key facts
Founded: 2010
Director General: Francisco Javier Mendieta Jiménez (appointed 2012)
Headquarters: Mexico City
Security center: projected to open in 2018
Data center size: 1,200 sq m (expected)

Hugo Montoya of AEM will speak at DCD Converged Mexico, 6-7 October


PRO

200+ Online training modules in 2014
10,000 Professionals gained the DCS Credential
6,000+ People attended classroom training
93% Of our students would take another course
38% Organizations only invest in staff training after system downtime
55+ Years old – average age of the datacenter engineering community
80% Of downtime is due to human error
41% Employees at companies with inadequate training programs plan to leave (Source: American Society for Training & Development)
60% Of data center operators are concerned about an industry-wide skills shortage (Source: DCD Global Census)
$20 million – the true cost of an hour of downtime to your business

Want to increase staff productivity and protect your bottom line? Find out how today: www.dc-professional.com | [email protected]

Flexible hardware switches limber up

The Open Compute Project defined bare metal commodity switches for data centers. David Chernicoff looks at the industry response

David Chernicoff, US Correspondent, @DavidChernicoff

Love it or hate it, the Open Compute Project (OCP) has been making waves in the data center world. And when Facebook decided to make its own high-performance data center switches, it put network giants such as Cisco and HP on notice.
But times they are a-changing. It has been two years since Facebook announced the goal of developing an OS-agnostic top-of-rack switch for data centers, which would be shared through the OCP. And it’s been a year since the actual arrival of the TOR switch (codenamed Wedge), along with a Linux-based OS to run it (codenamed FBOSS).
The Wedge TOR uses the same modular microserver architecture that Facebook was already deploying for the standard OCP server in its data centers. This means Wedge switches can slide seamlessly into Facebook’s existing software deployment, provisioning and management system.
But just because something works for Facebook doesn’t mean it will work for everyone, and there are certainly other open switches in the OCP ecosystem. A selection of vendors, including Accton, Broadcom, Intel and Mellanox, have also provided reference designs that were accepted as open switches by the OCP.

Facebook calls its open modular switch platform “6-pack”

IT + Networks

On the software side, Big Switch Networks contributed Open Network Linux to the OCP design (formally accepted in March 2015), and Cumulus Networks offered the Open Network Install Environment (ONIE) and ACPI Platform Description (APD) technology to simplify the development of software for these switches.
The basic idea of ONIE and APD is to enable a bare metal network switch environment. APD allows the creation of BIOS-accessible information that can be accessed by the operating system to dynamically generate all of the necessary control interfaces. ONIE defines the installation environment that combines a boot loader with the Linux kernel and Busybox, and provides the environment for the installation of the compatible OS of choice. Because it is Linux-based, it allows the switches to be managed, provisioned and deployed in the same way that users currently manage Linux server hardware.
Additional software specifications have appeared as well, with Big Switch Networks, Dell, Mellanox and Microsoft submitting their switch abstraction interface (SAI) to the OCP. SAI is complementary to ONIE, with SAI abstracting the FPGA switch silicon to allow developers to write to a single specification instead of needing to tweak the code for different brands of silicon.
Broadcom also submitted a specification, currently in the final stages of approval, called the OCP Open Switch, based on the existing StrataXGS Trident switch technology (see panel).
In March 2015, a design specification was accepted by the OCP for next-generation 100GbE switches and a cost-optimized 40GbE design jointly submitted by Accton Technology Corp and Edge-Core Networks.
Fast-forward a year, and in 2015, what were once white papers and reference designs are now actual products in the marketplace. Facebook has gone from focusing on just TOR switches to building a switch capable of replacing spine and leaf switches within the data center, using the same basic technology as the Wedge TOR switch. Facebook calls this open modular switch platform “6-pack” and published full details on the device early in 2015.
Designs and specifications mean little if the IT department can’t go out and purchase a product. And while a look at the OCP networking designs page shows only two accepted hardware products, both from Accton, drilling down will show that other vendors have hardware built to more recent versions of the design specification, which are already in production.
There are also devices that are not part of the specification process but are simply being built to the model. These are being submitted to certification at the University of Texas San Antonio OCP Certification and Solution Laboratory. Though there has been some controversy about the facility and its procedures, vendors are selling equipment that has been certified by the facility as OCP-compliant.
Edge-Core Networks, for example, has four certified networking products tested by the lab for compliance. In March 2015, Edge-Core was also the first vendor to release a commercial product based directly on Facebook’s TOR design, the Wedge-16X, which is being manufactured by Accton. With the Wedge modular design also being the core concept for Facebook’s 6-Pack leaf and spine switch, it is a reasonable presumption that Edge-Core will continue down this path of development if the company determines that there is sufficient demand for the product.
Even unexpected vendors have joined up with the OCP networking group. At the beginning of the year Juniper began shipping the OCX 11000 switch based on OCP design, running the Junos operating system and built in partnership with Alpha Networks, one of the early adopters of the OCP networking spec.
Networking giant HP has accepted the reality of OCP networking and is working with its long-time partner Accton to bring HP-branded OCP-compliant switch products to the market. Both Accton and Alpha Networks already have OCP-compliant switch hardware available for purchase, as do Mellanox and Inventec. Edge-Core has introduced a line of bare-metal switches that come preloaded with ONIE to allow the customer to load the compatible switching operating system of their choice. Original design manufacturers (ODMs) such as Quanta are also building OCP-compliant hardware for unnamed customers.
While the OCP standard has a lot of flexibility and can fit in as a solution for a significant number of data center situations, it is not suitable for all environments. Few companies have the same type of data center needs as Facebook or Google. Most data centers don’t have huge numbers of servers performing the same tasks, but are still suitable for the deployment of OCP-compliant switching. Cloud support, hybrid SDN networking and CDNs all seem to be a good match for OCP hardware.
Where it doesn’t yet fit is in environments that require significant fault-tolerance. There is a reason why certified fault-tolerant hardware is costly and very specific in terms of what software and OS is supported. The cost of building and guaranteeing the level of availability demanded by users with these requirements is at odds with the basic premise of OCP, where the hardware architecture is a commodity item.

OCP switch features
Key elements of the OCP switch specification, as presented by Broadcom and Interface Masters:
• Open rack and enterprise rack networks 1U-compliant leaf and spine switch design
• Leaf switch configurations: 48x10GbE + 6x40GbE ports or 48x10GbE + 12x40GbE ports
• Spine switch configurations: 32x40GbE ports or 96x10GbE + 8x40GbE ports
• Supports ONIE and multiple off-the-shelf network OS options
• x86 control plane CPU with server-class Linux OS and tools
• Available options for other MIPS/PPC CPUs
• Support for optional data plane co-processor based on Broadcom XLP432 for L4-L7 network functions acceleration

Opinion

As plain as the nose on your face…

Why are we even talking about adding water usage to PUE? It’s already covered! Ian Bitterlin explains

I see that Emerson US has proposed a new KPI that “captures the energy used in water consumption.” With power usage effectiveness (PUE) widely abused and little-understood, it is hardly likely a new KPI will find an audience other than in a comedy routine. In fact, all the energy used in water consumption within a data center (including the embedded energy of city water) is already captured within PUE – if only people would take it seriously.
Four years ago, in a white paper, I showed that water efficiency gets worse as PUE improves. I called this WUF (water usage factor) instead of the eventual Green Grid term WUE (water usage effectiveness).
I went on to show that consuming more water locally reduces overall water consumption from power station to data center, as electrical generation in thermal stations uses vast quantities of water. An evaporative system uses 1,000 cubic meters per MW of load per year – equivalent to about 20 domestic houses – and reduces the pPUE from a target 1.4 to 1.08. If you harvest rainwater, you avoid the embedded energy proxy, equivalent to about 0.008 in pPUE. All of the energy for pumping, spraying and so on is included in the PUE. So, there is no need for a combination metric – just a need to formalize the embryonic WUE.
And there we have an interesting problem. The Green Grid has given up the rights to PUE to the international standards committee drafting ISO/IEC SC39 KPIs for resource effective data centers, but it has not let go of WUE yet.
As a member of The Green Grid, I chaired a group to develop WUE a bit further, in my mind to prepare for a standard. I (along with my co-chair) published draft documents and expected the 40+ members of the WIG to comment. There was only one response other than from my co-chair. Does anyone care about water consumption? I think the heart of the problem is comparative consumption.
Coca-Cola has a good reputation for minimising water consumption but still uses 2.5 liters of potable water to produce one liter of fizzy pop. If Coca-Cola reduced consumption by less than one percent, the saving would be enough for all the data centers in the world to be indirectly adiabatically cooled (except in humid regions, of course). McDonald’s waterless urinals each save about the same water that 1MW of adiabatically cooled data center would consume.
As a society we consume vast quantities of water, but the consumption in data centers is minuscule in comparison with the total and has the added advantage of reducing electrical demand. We don’t need to combine PUE and WUE; we just need both to be used separately, and correctly.
I think there could be a water metric KPI that includes impact on the utility. Let’s call it WCF. Because “woof” sounds daft.

Pull quote: “Consuming more water locally reduces overall water consumption from power station to data center, as electrical generation in thermal stations uses vast quantities of water”
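Bitterlin’s own figures give a sense of the trade-off. As an illustrative calculation using only the numbers quoted above (and assuming the 1.4 and 1.08 pPUE values are annualized averages for a steady 1MW IT load, an assumption made purely for the example):

\[ (1.4 - 1.08) \times 1\,\mathrm{MW} \times 8{,}760\,\mathrm{h/yr} \approx 2{,}800\,\mathrm{MWh/yr} \]

of electricity avoided, in exchange for roughly 1,000 cubic meters of water evaporated on site – about 0.36 m³ of water per MWh saved. Since generating each of those avoided MWh in a thermal power station would itself have consumed water, this is the arithmetic behind the claim that spending water locally can reduce water use overall.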


PUE: clouding the waters

Is it really a good idea to add a water factor to the power efficiency metric? Peter Judge looks into it

Peter Judge
Global Editor
@peterjudgeDCD

Since the power usage effectiveness (PUE) rating was created in 2006, a number of people have tried to improve on it. But the latest effort has stirred up some controversy by trying to extend it to include water use.

PUE was created to help data centers reduce their carbon footprint by cutting their consumption of electricity, which is normally generated from fossil fuels. To help reduce this, PUE offers an indication of how efficient a data center is at delivering electrical energy to the IT equipment it contains. It has a simple formula: the total energy used by a facility, divided by the energy that is delivered to the IT racks.

Obviously enough, any energy used outside the IT kit will increase the top line of the fraction and increase the PUE, taking it further away from the ideal theoretical figure of 1.0. So achieving a better PUE has become a matter of choosing less energy-hungry infrastructure for tasks like cooling and power distribution.

But there is more to efficiency than electricity usage, say members of The Green Grid – the non-profit industry group that promotes PUE. The group created a separate metric for water usage effectiveness (WUE), which is simply the ratio of water used to IT power.

WUE has not achieved any real traction in the market, but water usage has become a concern, particularly in parts of the world where water is expensive, or which suffer from droughts (see p32).

Meanwhile, concentrating on PUE could actually be pushing data centers to consume even more water, according to a recent blog post by Jack Pouchet of Emerson Network Power: "Although it was clearly launched as a tool to be used wholly within the confines of a single facility, PUE has nonetheless become the de facto universal metric for comparison shopping when evaluating data centers. This behavior has led to numerous unintended consequences, not least of which is the staggering increase in the use of water by data centers."

Data centers have been moving away from mechanical cooling systems, which use a lot of electricity and comparatively little water, to outside air cooling systems, which are often boosted by evaporative cooling.

The move is only possible in countries where clean water is readily available, and where the climate allows cooling by evaporation, but the effect on PUE has been remarkable. Data centers that use outside cooling have seen their PUE drop from around 2.0 to 1.2 or less.

The effect on PUE is quite legitimate, because evaporative cooling is a natural process driven by the difference in vapor pressure between the liquid and the air around it. It does not need electricity to drive it, and it is not responsible for carbon emissions. Moving to evaporative cooling cuts the carbon emissions associated with a data center.

What about the water use? Pouchet's suggestion for "PUE revised" (PUER) adds in a factor for water usage, the water equivalent energy (WEE), and would thus increase the PUE of data centers that use a lot of water.

There is some justification for this, and Pouchet's idea has met with enthusiasm from some who see it "levelling the playing field" for data centers in warm and humid climates, where evaporative cooling won't work, or in areas where water is in short supply and the method is unaffordable. PUE is a metric created by the Northern Hemisphere, for the Northern Hemisphere, say the critics.

But the WEE factor has been controversial. Pouchet uses the energy required to evaporate the water, which is 2,257kJ (0.627kWh) per liter. While it is true that this energy is "used" in evaporating the water, it has come from available low-grade heat and has not incurred any electricity usage or carbon emissions. Robert Tozer of Operational Intelligence describes this as a "thermodynamic flaw" in PUER, and others have levelled similar criticisms of false energy accounting. Essentially, it takes an item that doesn't feature on the energy balance sheet and adds it to the debit side.

If PUER caught on, it would certainly have a significant effect on efficiency figures. A 1MW data center with evaporative cooling might have a PUE of 1.1, but it would have a PUER of 1.4. If it uses indirect evaporative cooling, its PUER could go up above 2.0.

But will it catch on? It will be put to The Green Grid, where Pouchet has been a long-term participant, for evaluation: "We are moving forward with this via The Green Grid processes," he says.

But PUE has become well established over nearly 10 years, and has even reached the stage of being published as a draft international standard (DIS 30134-2) by ISO – the International Organization for Standardization. Adding the WEE factor would make the measure more complex and undo the work of creating that international standard.

Moreover, Ian Bitterlin of Critical Facilities Consulting (p28) says that water is already included in PUE. Behind the simple PUE formula, The Green Grid and other bodies such as ASHRAE have published a lot of information on how to calculate all the equivalent energy for parts of the data center, including the water used in an evaporative cooling system. "Read both V2 of TGG and ISO 30134-2 and you will find the proxy kWh/unit tables," says Bitterlin. "The energy for evaporation or pressurisation for adiabatic spraying has always been a part of the facility overhead power."

The bottom line is that Northern Hemisphere data centers do indeed get a better PUE figure than those in other climates. However, it is an unavoidable fact that they also have a lower carbon footprint. Water and power are connected in a complex way. Mixing the two runs the risk of clouding the issue.
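For readers who want to see how those numbers fit together, here is a minimal sketch of a PUER-style calculation, assuming the proposal simply adds a water-equivalent energy term to the top of the PUE fraction. The 0.627kWh-per-liter figure is quoted in the article; the annual water volume is an assumed value chosen so the output lands near the 1.1-to-1.4 example, and the function names are ours, not part of any standard.

```python
# A minimal sketch of the "PUE revised" (PUER) idea, assuming the water term is
# simply added to the numerator. The water volume is an illustrative assumption.

KWH_PER_LITER_EVAPORATED = 0.627   # 2,257 kJ per liter, as quoted in the article

def pue(facility_kwh: float, it_kwh: float) -> float:
    return facility_kwh / it_kwh

def puer(facility_kwh: float, it_kwh: float, water_liters: float) -> float:
    water_equivalent_kwh = water_liters * KWH_PER_LITER_EVAPORATED
    return (facility_kwh + water_equivalent_kwh) / it_kwh

it_kwh = 1_000 * 8_760            # 1MW of IT load for a year
facility_kwh = it_kwh * 1.1       # a facility running at PUE 1.1
water_liters = 4_200_000          # assumed annual evaporative water use

print(f"PUE : {pue(facility_kwh, it_kwh):.2f}")   # 1.10
print(f"PUER: {puer(facility_kwh, it_kwh, water_liters):.2f}")  # ~1.40
```

The same water bill, counted as energy that was never drawn from the grid, is enough to move the headline figure by a third – which is why the proposal is so contentious.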

PUE timeline
2006 – PUE proposed
2007 – Promotion by The Green Grid
2008 – Google claims PUE of 1.21
2010 – WUE launched by Green Grid
2012 – PUE starts progress to ISO standardization
2015 – ISO DIS 30134-2 published

The WUE metric
Following the success of its power usage effectiveness (PUE) metric, in 2010 The Green Grid defined a metric to help discuss the amount of water used by data centers. WUE, or water usage effectiveness, is measured in liters/kWh and is defined as the site's annual water usage, divided by its IT equipment energy. The WUE metric should include both water used on site, and water used off site; for instance, in producing the electricity used by the data center. WUE has not been widely used in the industry, and is only rarely quoted, but water use has become a more significant factor for data centers, leading to calls to find a way to incorporate water into the PUE metric (see p28).
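As a rough sketch of the definition in the panel above, the snippet below computes WUE two ways: site water only, and site water plus the water used off site to generate the facility's electricity. All of the input figures (load, PUE, water volumes, generation intensity) and the function names are illustrative assumptions rather than values from the panel.

```python
# Sketch of the WUE definition above. Inputs are assumed, illustrative values.

def wue_site(site_water_l: float, it_energy_kwh: float) -> float:
    """Site WUE: liters of water used on site per kWh of IT energy."""
    return site_water_l / it_energy_kwh

def wue_with_source(site_water_l: float, facility_energy_kwh: float,
                    gen_water_l_per_kwh: float, it_energy_kwh: float) -> float:
    """WUE including water used off site to generate the facility's electricity."""
    generation_water_l = facility_energy_kwh * gen_water_l_per_kwh
    return (site_water_l + generation_water_l) / it_energy_kwh

it_kwh = 1_000 * 8_760             # assumed 1MW of IT load for a year
facility_kwh = it_kwh * 1.2        # assumed PUE of 1.2
site_water = 3_000_000             # assumed liters of on-site water per year
gen_intensity = 1.8                # assumed liters per kWh of generated electricity

print(f"Site WUE:   {wue_site(site_water, it_kwh):.2f} L/kWh")
print(f"Source WUE: {wue_with_source(site_water, facility_kwh, gen_intensity, it_kwh):.2f} L/kWh")
```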

Do data centers drink too much?

A typical data center drains an Olympic-size swimming pool every two days. Michael Kassner investigates

Michael Kassner
US Correspondent
@MichaelKassner

Gold has increased seven percent in value since the mid-1990s. Real estate jumped nine percent. Water, not normally considered an investment, beat both, increasing 10 to 12 percent.

Now add shortages to the ever-increasing cost of water, and data centers known for using huge quantities of water are being looked at with a certain amount of disdain. Is this justified?

James Hamilton, vice president and distinguished engineer at Amazon Web Services, saw the problem coming in 2009. In his Data Center Efficiency Summit presentation, Hamilton mentioned that conservation is not just about power. It's about water consumption; in particular, evaporation and blow-down losses.

Evaporation and blow-down losses are by-products of a common method used to cool data centers, where hot exhaust air exiting from electronic equipment is cooled by passing it through an air/liquid heat exchanger. The liquid coolant


also passing through the exchanger picks up the heat on its way to cooling towers, another form of heat exchanger that uses water evaporation to remove heat from the liquid coolant.

Besides losing water through evaporation, data centers using cooling towers also lose water during blow-down. The liquid portion of the cooling system and cooling towers build up sediment; cooling efficiency suffers if this is not removed. So, on a regular basis, the system purges a certain amount of water holding the sediment. These two operations are the reasons why data centers use as much water as they do.

The Green Grid, a source of the power usage effectiveness (PUE) metric, launched a measure of water use called WUE (water usage effectiveness) in 2010 and, although Facebook reported a WUE figure in 2012, it has not been used widely (see box, The WUE metric).

The issue of water use seemed to go quiet, until Drew Fitzgerald penned Data Centers and Hidden Water for the Wall Street Journal. Then things got interesting.

In the article, Fitzgerald focused on data on one US state – California – which he obtained from 451 Research. California is home to 800 data centers (more than any other state), and each year they consume enough water to fill 158,000 Olympic-size swimming pools. And, lest we forget, California is in its fourth year of a severe statewide water shortage.

Hamilton recently acknowledged Fitzgerald's WSJ article on his Perspectives blog and reiterated his stance, saying: "Water consumption is the next big natural resource issue for data centers after power. I think it's still true today and expect that water consumption will continue to need to be managed carefully by all data center operators."

Using swimming pools as an analogy may be a nice visual, but it does not equate well when trying to determine just what it all means. So, the first step becomes determining how much water is needed to fill 158,000 Olympic-size pools. Thankfully, Marc Andreessen figured that out – 104,280,000,000 gallons.

Rather than just accept Fitzgerald's amount at face value, let's compare his total to what Hamilton mentioned in his presentation – that a typical (15MW) data center uses 360,000 gallons of water per day. If one takes a second and reflects on that number, it is somewhat mind-boggling that 360,000 gallons of water (half the water in an Olympic-size swimming pool) pass through a data center each day.

Time for some simple math: multiply 360,000 gallons by 800 (the number of data centers), then multiply the results by 365 (days in a year). The answer is 105,120,000,000 gallons – close enough (you will see why in a second).

Now let's try to gauge the impact that data centers have on the overall water usage in the state. Information needed to help determine that can be found in the 2014 US Geological Survey's (USGS) Water Use Estimates for California. The report states: "In 2010, Californians withdrew an estimate of 38 billion gallons per day." Just to get an idea of how much that is, the USGS's report states: "Water use in the US in 2010 was estimated to be about 355 billion gallons per day."

How much of that is used by data centers? We need to divide 104,280,000,000 gallons by 365 days to get the amount of water used each day by all 800 California data centers. That number is 285,698,630 gallons. Next, let's figure out the percentage: dividing 285,698,630 gallons by 38 billion gallons comes to 0.7 percent – and Hamilton's statistics agree. It seems that California's data centers use 0.7 percent of the water consumed in California.

How does that compare? To understand whether a bit less than one percent is significant or not when it comes to state-wide water usage requires us to find out who else uses the state's water and how much. Referring back to the USGS's California water usage report, we find that irrigation tops the list at 60.7 percent (23,056 million gallons per day), followed by thermo-electric power generation at 17.4 percent (6,601 million gallons per day). One wonders, looking at those numbers, why Hamilton and others are so alarmed. Combined, all of California's data centers only use one-fourth of one billion gallons of water per day.

In the grand scheme of things data centers' water usage is minor, but leaders in the industry are concerned. As well as Hamilton, others have spoken out, including Lisa Jackson (who led the US Environmental Protection Agency until 2013 and is now Apple's environmental director), and Joe Kava, Google's data center operations executive. They view wasted water as inefficient operation, bad for the environment, and ultimately bad for us. I suspect they would feel that way if the daily amount was even less than it is.

"This is an area (water conservation) where the industry will keep innovating," says Hamilton, "and I expect to see results along the line as we have seen with power, where compute usage has gone up dramatically but power consumption has not."

In less than a decade data center operators have addressed the power issue and taken responsibility for how much electricity they use, with most of the major data center operations moving towards using 100 percent renewable power. In another 10 years, we may see similar changes around water use.
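The two loss mechanisms Kassner describes can be turned into a back-of-the-envelope estimate. The sketch below assumes, purely for illustration, that a 15MW facility rejects all of its heat through wet cooling towers and concentrates the water four times before purging it; the result comes out in the same order of magnitude as Hamilton's 360,000-gallon figure, which is why cooling towers dominate a data center's water bill.

```python
# Rough, illustrative estimate of cooling-tower water use from evaporation and
# blow-down. Load, cycles of concentration and the assumption that all IT heat
# goes through wet towers are simplifications for illustration only.

LATENT_HEAT_KJ_PER_KG = 2257      # energy needed to evaporate ~1 liter of water
LITERS_PER_GALLON = 3.785

def daily_water_gallons(heat_rejected_mw: float, cycles_of_concentration: float) -> float:
    heat_kj_per_day = heat_rejected_mw * 1000 * 86_400       # MW -> kJ per day
    evaporation_l = heat_kj_per_day / LATENT_HEAT_KJ_PER_KG  # liters evaporated
    blow_down_l = evaporation_l / (cycles_of_concentration - 1)
    return (evaporation_l + blow_down_l) / LITERS_PER_GALLON

# Assumed: a 15MW data center, water concentrated four times before purging.
print(f"{daily_water_gallons(15, 4):,.0f} gallons per day")
```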

'Disruption' takes center stage at DCD Converged

This year's DCD Converged Europe will cover the full IT stack. Here's what's in store in London this November…

DCD Converged Europe returns this year with an expanded ecosystem. With new features, topics, speakers and interactive sessions, DCD Europe is set to connect the entire IT stack, debating everything north and south of the rack.

The data center is in a constant state of disruption. In the physical infrastructure, software-defined networking (SDN) is disrupting networking, flash is shaking up storage, and hypervisors have transfigured how we buy servers. For managers, DevOps and lean business models are changing attitudes and application strategies. And, of course, cloud and open source are everywhere.

Against the backdrop of constant disruptions, Europe is emerging as a global tech hub, bringing forward IT infrastructure requirements that create a vast array of opportunities across the region for global industry players.

DCD Converged Europe will tackle these new disruptive technologies, highlight what's on offer from the European market, and connect 3,000-plus senior business, operations and technology representatives from across the entire IT stack.

Major topics 'north of the rack' that will be covered this year range from the rapid proliferation of software-defined infrastructure and how it will shape enterprise data center strategy, to the changing nature of security threats in ever-more distributed critical environments. 'South of the rack,' we will explore how critical infrastructure requirements are changing, especially given the trend towards more hybrid solutions, and what DevOps add to this equation.

"DCD Converged has transformed over the past four years to become a truly holistic event that covers the whole IT stack, as well as the mission-critical services and engineering element of the industry. We are creating a collaborative environment, where the many professional disciplines that touch the data center within large-scale operators and end users can come together to find cross-stack solutions," says George Rockett, CEO and co-founder of DCD Europe.

This year has so far been the year of the hybrid cloud, and it doesn't look as if it will slow down any time soon, as the adoption rate among enterprises continues to experience a significant boost. However, as the hybrid cloud is on course to becoming a mainstream solution within the next few years, it inevitably leads to complex architectures, emphasizing the need for a cloud-centric data center strategy capable of supporting hybrid IT environments.

One of the world's most recognized brands – Coca-Cola – will be represented at

"DCD is a great ideas platform, providing up-to-date information on technology and best practice in cloud environments" – Teoman Buyan, Coca-Cola


Leading speaker line-up:

TEOMAN BUYAN Coca-Cola

MARTIN HEISIG SAP

TROELS OERTING Barclays

DORON KEMPEL SimpliVity

This year’s event will feature some of the DAVID NELSON industry’s leading cloud technologists Boeing

DR. JR REAGAN
Deloitte

Where you can follow the conversation: @DCDConverged #DCDEUROPE

the event by its EMEA CIO and global head of infrastructure, Teoman Buyan, who will be discussing the company's ambitious journey to the cloud. "We are aligning our strategy to heavily leverage cloud platforms as much as possible to lower costs and provide flexibility. While security and data concerns prevent us

from adopting the public cloud across the board, hybrid cloud models represent the best model," says Buyan.

The event will also feature some of the largest users of cloud services and leading cloud technologists, including executives from Amazon Web Services, Boeing, Deutsche Bank, Deutsche Börse Cloud Exchange, Google, Mergermarket and SAP.

PAUL WILSON
Bristol is Open

ROB SHERWOOD
Big Switch

Stacking IT, an all-new knowledge component to our conference tracks, will focus on the open-sourcing of the data center and the potential disruption of moving from proprietary to non-proprietary frameworks for technology development and acquisition. Discussions around cloud, IaaS and other outsourced data center services will feature prominently across the whole conference program, with relevance to different stakeholder communities.

PIERS LINNEY
Outsourcery

BOB PLUMRIDGE
SNIA

LANCE SMITH
Primary Data

DCD Europe 2015
18–19 November 2015
ExCeL London, Prince Regent
www.dcdconverged.com/converged/europe

THE HEART OF IOT

Everyone is talking about the Internet of Things. Data centers are already using it, explains Rupert Goodwins

Regular or charred?

Cover Feature

30bn Number of IoT devices by 2020

Time for toast yet?

The next industrial revolution, we're told, is the Internet of Things (IoT). Smartfridges will order smartfood, while your smartwatch will tell you when the next smartbus is due.

The idea is that we add automation and control to the way our machines work together. As this cuts across the normal compartmental barriers, new insight, efficiencies and opportunities will drive new business models and create the magical more-from-less that's the perennial promise of IT to its masters.

Data centers will be at the heart of all this, marshalling and analyzing the huge new data flows that power the IoT – and to do this, they will need a lot of new engineering to balance their own efficiencies, costs and capabilities. A lot of basic engineering awaits, and a lot of people are keen to sell data center companies new hardware and services to help support the transition.

The data center itself is a candidate for IoT techniques. Quite aside from the normal considerations of monitoring and controlling client data flow, storage requirements and access capabilities, the machinery of the data center can be intelligently linked to provide efficiencies that reduce operational expenditure and create new client services.

Data center infrastructure management (DCIM) is a well-known product sector that has been providing an embryonic IoT within the facility for a while, as it learns to mesh information from many different sources within the data center. So far, DCIM has tended to focus on automation and alerts rather than data analysis. This is where work in IoT thinking can feed back into DCIM.

Normalization is very important. One of the challenges for IoT is that while many machines and existing intelligent controllers generate the right operational data, it's in a proprietary or uncommon format. Normalization means translating all of that into a common format before storing it, and that's essential for any long-term analysis. There are a plethora of industrial protocols – Modbus, BACNet, SNMP, WMI being just some of the more common – that


need a common transport and translating to a common nomenclature before finally turning the actual data into common units.

For example, one power management unit might report data as OUTPUT_CUR, another as DCRAIL1, with the first being an integer representing amps and the second a floating point number with tenths of an amp resolution. Within a particular DCIM suite or the manufacturer's own software, this doesn't matter, but to compare the two either in real-time or over an extended operational period, these must be stored with the same name in the same units.

Why bother? Well, there are at least two reasons. Taking an IoT approach will allow you to understand and control your operations through a common set of tools, which should be able to cope with future changes. It will also be useful to your customers, who have an interest in knowing the efficiency of their operations. Having a common, deep data set, you can easily provide much more information as a service to them, or provide tariffs that reward efficiency while keeping margins. There are already signs of this in the data center market – American colo company RagingWire gives its customers access to a wide range of internal data through its N-Matrix service, with visualization tools to help map their usage of its resources and plan ahead – but IoT will create more opportunities for everyone to develop these ideas.

Once a lot of normalized data is available, new models suggest themselves. You can relate power usage to changes in your workload by combining server reports with electricity meter values or thermal changes, and then use this to drive automated load amalgamation within your data center, or even transfer loads to other sites.

At one extreme this lets entire sections of your equipment be idle, while at the other it can provide early warning of the need for more provisioning. Long-term trend and correlation knowledge is the only way to do this efficiently, and that requires IoT techniques for large data set compilation and analysis.

By bringing more of the machinery of the data center into the overall management system, other savings are possible. Ambient cooling can improve power usage effectiveness (PUE), but it needs filters on the fans that pull in the outside air. And these clog: as regular inspection is expensive, they tend to be replaced on a regular cycle, whether they need it or not. If the fan units report speed or current draw, dirty filters are easily detected as RPM, or power requirements start to rise – no inspection needed, but much better resource management.

This is also an example of lifetime management, which becomes much simpler with an IoT approach, which not only reduces costs but has significant safety and uptime ramifications. In the same way that hard disk error-monitoring catches not just instances but patterns of failure, more in-depth knowledge of UPS battery conditions and thermal cycling within cabinets can constrain risk and reduce costs.

One of the reasons why IoT is being promoted so heavily is that vendors expect it to provide a single software and hardware integration platform for vertical industries – healthcare, transportation, energy, retail and, yes, IT itself – which have traditionally developed and guarded their own infrastructure fiefdoms. This provides the same economy of scale for industry as the internet has done for information.

Thus, in all areas of industry infrastructure the next generation of smart devices, including those common in the data center, will start to talk IoT protocols, in much the same way that all pre-internet networking standards were subsumed or rendered extinct as the IP-based systems took over.

These standards are already being introduced. Last month, GE and Pivotal announced the development of open-source support for IoT protocols such as MQTT, CoAP and DDS in the Cloud Foundry consortium. Cisco is actively developing MQTT, CoAP and XMPP, and IBM, well, it invented MQTT, while Intel has it at the core of its IoT thrust. On one level, these standards simply do the jobs that older, proprietary protocols did, but this generation of protocols are explicitly designed to work over the internet, to interoperate, to be supported by entire industries, and applicable to the widest range of uses.

The Internet of Things certainly isn't about toothbrushes ordering toothpaste or monitoring your central heating from a beach in Rio de Janeiro. Its first and most abiding impact will be in bringing industry into the internet age.

In many ways, data center management got there first – but there is still a risk that it might miss the next big wave.

You're the strong silent type?

Standards: a round-up

MODbus – Machine-to-machine industrial control protocol, commonly found in switching or single-variable sensor applications.

BACNet – Building automation standard for access control, heating/cooling, fire and security.

SNMP – Simple Network Management Protocol. Used to control and monitor IP network devices and ancillary equipment such as routers, servers and UPS.

WMI – Windows Management Instrumentation. Microsoft protocol for linking Windows-based management systems together, with very wide adoption in third-party management tools.

MQTT – Message Queue Telemetry Transport. Device data collection protocol aimed at telemetry and remote monitoring of large networks of small devices.

XMPP – Extensible Messaging and Presence Protocol. Originally designed for instant messaging, it's proving popular in the Internet of Things as a way to remotely address devices simply and pass structured messages.
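As a concrete illustration of the normalization step described above, the sketch below maps the article's two example fields onto one canonical metric and unit before storage. The mapping table, record format and device names are assumptions made for the example, not part of any DCIM product or IoT standard.

```python
# Minimal sketch of telemetry normalization: vendor-specific field names and
# units are mapped onto one canonical name and unit before readings are stored.

from dataclasses import dataclass

@dataclass
class Reading:
    device: str
    metric: str     # canonical name
    value: float    # canonical unit (amps)
    unit: str = "A"

# vendor field -> (canonical metric, multiplier to convert the raw value to amps)
FIELD_MAP = {
    "OUTPUT_CUR": ("output_current", 1.0),   # integer amps
    "DCRAIL1":    ("output_current", 0.1),   # tenths of an amp
}

def normalize(device: str, field: str, raw_value: float) -> Reading:
    metric, scale = FIELD_MAP[field]
    return Reading(device=device, metric=metric, value=raw_value * scale)

print(normalize("pdu-a", "OUTPUT_CUR", 32))     # 32.0 A
print(normalize("pdu-b", "DCRAIL1", 318.0))     # 31.8 A
```

Once every reading carries the same name and unit, comparing devices in real time or trending them over months becomes a straightforward query rather than a translation exercise.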

October 27–28, 2015 | Hilton Chicago

REGISTER NOW! as-a-service THE OUTSOURCED INFRASTRUCTURE SERVICES TRANSFORMATION EVENT

2 DAYS | 55+ SESSIONS | 70+ SPEAKERS | 100+ EXHIBITORS | 1000+ ATTENDEES

FEATURING Lex Coors John Hughes CBE Telecity Group

Joe Weinman Digital Strategist, Author

Oliver Jones Chayora

Maximilian Ahrens, Deutsche Börse Cloud Exchange
Eli Scher, Continuum Data Centers
Steve Hebert, Nimbix

Tracks: APP > CLOUD | DESIGN + STRATEGY | CRITICAL ENVIRONMENT | IT + NETWORKS

SECURITY + RISK
#DCD2015 | www.dcdconverged.com/services

Max Smolaks
News Editor

For years, the data center storage landscape has remained fairly simple: you used 7200rpm (7k) hard drives for most of your daily workloads. Demanding tasks asked for high-performance 10k and 15k drives, which were later replaced by flash memory. Barring exotic solutions for especially paranoid organizations, archives were kept on magnetic tape.

@MaxSmolaksDCD

Today, the industry has mostly gotten to grips with flash, but the overall growth in storage performance has pushed equipment manufacturers to dedicate more resources to the development of even faster ways to read and write information. If you thought the transition to flash was messy, complex, full of far-fetched promises and underwhelming delivery, wait until you see what's coming next.

To keep up with the performance of processors – which, as we know, doubles approximately every two years – servers of the future will need fast, non-volatile memory to occupy the position between NAND and DRAM. DRAM – DDR4 in particular – is really fast, but it's a massive power hog and can't store information once the machine is turned off. Flash can store data without using any electricity, but it is too slow for in-memory databases, and while the latest chips last considerably longer than their ancestors, even enterprise-class flash is only good for about 30,000 read/write cycles before the memory cells wear out.

With this in mind, we list four of the most promising non-volatile memory technologies, two of which are set to appear in enterprise storage devices next year.
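To put the roughly 30,000 read/write cycles mentioned above in context, here is a back-of-the-envelope endurance estimate. The drive capacity, write amplification and daily write rate are assumed values for illustration; only the cycle count comes from the text.

```python
# Back-of-the-envelope flash endurance estimate. Capacity, write amplification
# and write rate are illustrative assumptions.

PE_CYCLES = 30_000   # program/erase cycles per cell, as quoted in the text

def lifetime_writes_tb(capacity_tb: float, write_amplification: float) -> float:
    """Total host data that can be written before cells wear out (TBW)."""
    return capacity_tb * PE_CYCLES / write_amplification

def years_to_wear_out(capacity_tb: float, write_amplification: float,
                      writes_tb_per_day: float) -> float:
    return lifetime_writes_tb(capacity_tb, write_amplification) / writes_tb_per_day / 365

# Assumed: a 1.6TB enterprise drive, write amplification of 3,
# and a heavy 10TB of host writes per day.
print(f"{lifetime_writes_tb(1.6, 3):,.0f} TB of host writes")
print(f"{years_to_wear_out(1.6, 3, 10):.1f} years at 10TB/day")
```

Under those assumptions the drive survives a few years of punishing writes – enough for many workloads, but it explains why endurance figures in the millions of cycles are attractive for the in-memory tier.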

Upcoming memory at-a-glance

Phase Change Memory – Made by: IBM, Intel, Micron, Samsung. Expected in 2016.
ReRAM – Made by: Viking Technology, SanDisk, Micron, Sony, Panasonic, Crossbar, HP, SK Hynix. Expected in 2017.
3D Xpoint – Made by: Intel and Micron. Expected in 2016.
Racetrack memory – Made by: IBM. Expected – we have no idea, but it will be worth waiting for.

PCM (Phase Change Memory) – PCM cells consist of two electrodes on either side of a chalcogenide glass. By applying heat, the glass can be switched easily between an amorphous (low conductivity) and a crystalline (high conductivity) state, changing its optical and electrical properties to record data.

PCM chips last much longer than flash – between 10 and 100 million write cycles. They are also resistant to radiation, unlike flash, which is unsuitable for many space and military applications.

Phase-change memory has been proven to work, but currently does not meet the density and cost requirements for enterprise applications. Intel previously predicted it could become cheaper than NAND memory.

PCM is being developed by IBM, Intel, Micron and Samsung, and enterprise storage devices based on this technology are expected in 2016.

ReRAM – Just like PCM, most types of ReRAM don't feature transistors. Instead, a memory cell consists of three layers: a metallic top electrode, a dielectric

IT + Networks
The race for memory

High-performance storage contenders are on the starting blocks. Max Smolaks says there will be a shake-up in 2016

switching medium, and a bottom non-metallic electrode.

Rather than trapping electrons like DRAM and NAND, this memory type uses high voltage to either form or break a conduction path through the dielectric medium, thus changing its resistance.

ReRAM consumes less energy than any of its competitors. It also produces less heat than PCM, which means it's easier to build larger-capacity storage devices.

Hamid Shokrgozar, president of Viking Technology, told DatacenterDynamics that the manufacturing process for ReRAM is relatively simple and the components are expected to be very cost-effective. He added that microcontroller chips with embedded ReRAM are already on the market, used in consumer products such as home security systems, smoke alarms, and portable blood pressure monitors.

ReRAM is being developed by Viking Technology, SanDisk, Micron, Sony, Panasonic, Crossbar, HP and SK Hynix. It will appear in high-performance enterprise storage devices in 2017.

3D Xpoint – Pronounced 'crosspoint', this is a very recent addition to the alternative storage roster.

According to Intel and Micron, 3D Xpoint relies on layers of switch and memory cell materials that can permanently change their state. The company has denied that the underlying technology is based on PCM or a form of ReRAM, claiming it has invented a completely new type of memory – something that remains the subject of intense speculation online.

What we do know is that 3D Xpoint features no transistors and offers individually addressable memory cells. The technology will be used to produce PCIe SSDs under the Optane brand, followed by a new class of yet-unnamed DIMM storage devices. 3D Xpoint is a proprietary technology owned by Intel and Micron, and the partners are planning to launch the first Optane devices some time next year.

Racetrack Memory – Racetrack Memory is being developed at IBM by Professor Stuart Parkin, a world-renowned physicist who, in the 1990s, invented spin-valve technology that enabled 1000-fold growth in the capacity of hard drives.

Racetrack Memory writes magnetic information onto nanoscale wires made out of a special alloy, each storing hundreds of bits. The bits are then transported along the wires to the read-write heads using 'spin polarized' electric current, produced by a transistor.

Racetrack memory is still in the research phase and may take longer to reach commercialization than PCM or ReRAM, but the potential performance, endurance and density characteristics here are very impressive. Earlier this year, Parkin and his team published a paper which confirmed that the project is firmly on track – no pun intended. But exactly when it will produce something that can be put inside a server is anyone's guess.

This is by no means a complete list. It misses such beautiful specimens as Magnetic RAM (MRAM), Spin-transfer Torque RAM (STT-RAM), Ferroelectric RAM (FeRAM), and probably a few others.

The use-cases for faster non-volatile memory already exist, and certain technologies envisioned about 30 or 40 years ago are about to reach maturity. In the next two years, your storage choices are guaranteed to get a lot more diverse. At the same time, we see software-defined equipment helping to unite storage devices into abstract tiers, based solely on their performance characteristics. Taking this into account, next-generation storage devices will be easy to manage, but that doesn't mean they will be easy to purchase. We are about to enter another 'hype cycle', and considering the number of competing technologies, this one is going to get rough. God help us all.

100k – Number of write cycles supported by new generation memory types

2015: IT'S GETTING SERIOUS

Several categories are spoken for, but there’s still time to put your name where it belongs – at the top of the list!

SPONSOR AN AWARD TODAY!

For more information visit: www.datacenterdynamics.com/awards

HEADLINE SPONSORS

CATEGORY SPONSORS

PRO

PATRON SPONSORS

DCD Community
When and where to catch up with DatacenterDynamics around the world

Conference Snaps What You Said

DCD Internet Panel discussion
Stacking-IT in full swing

Michael Mudd
The Open Computing Alliance
There is a lot of concern about the Cloud somehow being less secure than on premise data hosting. If you are describing the dominant full service providers such as AWS, Microsoft or Google, or Cloud DC hosting services such as Equinix, BT or NTT and others, then this is a myth.

Overview of DCD Internet Sharing knowledge at DCD Internet

DCD Converged upcoming events

DCD Europe – ExCeL London, Prince Regent – November 18-19, 2015
DCD as-a-Service – Hilton Chicago – September 27-28
DCD SE Asia – Marina Bay Sands, Singapore – September 15-16
DCD Colombia – Centro de Convenciones Compensar – September 23, 2015

DCD Community
We have gone out to the DCD community to find out what you have been doing over the summer and what you are looking forward to taking part in during the next few months

George L Williams
Cummins Power Systems
The show is a great investment, not only from a vendor perspective or as a speaker, but also for networking and peer interaction. The open vendor area and access to meeting areas within the vendor area all promote a more relaxed atmosphere to conduct business, as well as gather information.

Stevan Plote
BTI Systems
DCD provides another view of the key topics in the industry. It is different from the standard set of conferences we normally get emailed and contacted about. I highly recommend this alternative view to the critical topics being addressed in the industry.

TJ Kniveton
Salesforce.com
[Server] consistency and making sure the software can easily control the fleet of servers is the reason why SKUs* are important to us.
* SKUs (Stock-Keeping Units) – a term borrowed from the field of inventory management.

Tanja Lewit
Kentix US
You spoke my language – Open Source. Open API IoT… we are a new Open API IoT product, so the future as discussed is now!

Sherrie Brown Littlejohn
Wells Fargo
My hope is that with cloud computing, infrastructure becomes a less integral part of what it is that we do, and the developer can now concentrate on not worrying about the [infrastructure] environment that the application lives in but what can the application do for customer's businesses.

Events Calendar

DCD Awards
ASIA PACIFIC – September 16, 2015 – Marina Bay Sands, Singapore
LATIN AMERICA – October 6, 2015 – Club Ragga, Mexico City
BRAZIL – November 10, 2015 – Buffet Dell'Orso, São Paulo
US & CANADA – December 1, 2015 – Capitale, New York City
EMEA – December 10, 2015 – Lancaster London Hotel, London

Research
OUT NOW
REPORT: Metropolitan Hub Series: US West Coast
REPORT: Global Data Center Market Overview and Forecasts 2014-20
COMING SOON
Colocation Provider Investment: Co-opting The Cloud For Future Growth (September 2015)
Baidu, Alibaba, Tencent Growth: Implications for Data Centers Globally (September 2015)

New Webinars From The Leaders In Data Center Technology
LIVE CUSTOMER ROUNDTABLE: OPTIMIZING CAPACITY
Tuesday, September 8, 2015, 12.00pm (EST), 5pm (BST)
SPEAKERS:
William Bloomstein, Director of Strategic & Solutions Marketing, CommScope

Ciaran Forde, VP of Global Data Center Business, CommScope
Scott Holgate, Director of Project Management & Infrastructure Design, University of Montana
Matthew Larbey, Director of Product Strategy, VIRTUS Data Centres

The biggest challenge facing many data centers today? Capacity: how to optimize what you have today. And when you need to expand, how to expand your capacity smarter.

Henk ten Bos
Ageas Insurance Company (Asia) Limited
The DCD Forum in Hong Kong was very informative. The sessions contained good sharing of vendor solutions and experiences of customers. Four different tracks enabled everyone to prepare a customized program. The DCD app added a nice engaging and interactive element to the conference.

Learn from the experts about how DCIM and Prefabricated Modular Data Centers are driving best practice in how capacity is managed and optimized.
www.datacenterdynamics.com/webinars/commscope-webinar

Scott Walker
Jones Lang LaSalle
DCD in Hong Kong for me was a great chance to network with peers, and seeing the new technology and trends in the market was very interesting.

Viewpoint

Planning is a skill

It's all very well planning our future in the cloud, but there is one thing worrying me at the moment – our international skills shortage. So while corporate IT spending is still on the up-and-up, there is a skills shortage.

In the next two years, most analysts agree, the burgeoning number of new data center technologies, from the much-anticipated renaissance of data center infrastructure management (DCIM) to the spread of software-defined networking (SDN) and network functions virtualization (NFV), will mean that the IT technical workforce in the USA alone needs to expand by 50 per cent by 2016 just to keep pace.

If you look at the skills wars that took place in San Francisco from 2006 to 2009 between Google, Apple, Intel Corp, Adobe and others, which have just ended in a $415m settlement, it is obvious that globally we are not producing enough skilled people. You can say it is merely the operation of the market economy, but since this particular battle erupted the poaching frenzy amongst these companies has, this year, reached new levels.

Competition for jobs is so intensive that Silicon Valley employers tempt top programmers from rivals with signing bonuses – from Apple, typically starting at $250,000 for Tesla staff prepared to jump ship. When I was in the Valley two months ago I was told that Oracle was offering some rival staff signing bonuses of £1m. Severe shortages are bad for employers.

We need to take a leaf out of the German education system, which is carefully structured to ensure that people in the system are given the education which they need, both as individuals and as members of a bigger society. This means that there is an element of planning which allows educationists to work with government to ensure that German society can avoid major skills shortages.

In the recent Cisco Global Cloud Index Study, 60 per cent of data center operations said that lack of qualified staff was one of the major issues they face.

The free market is fine – but even Margaret Thatcher eventually came to the conclusion that The Chicago Boys allowed the market too much freedom – we need elements of planning, and the present skills shortage is showing us that.

Bill Boyle – Global Managing Editor
@billboyleDCD

