
April/May 2017 datacenterdynamics.com

Why the human factor is still the central issue

10 beautiful data centers: Here are some facilities that manage to get their work done, and still impress us with their looks

Finding new frontiers: Building a data center is hard enough. Building one in Angola is even tougher

Re-shaping servers: They looked solid and unchanging, but now servers are being remolded by the creative minds of the industry

Contents April/May 2017

ON THE COVER
18 Powered by people

NEWS
07 Top Story: Vantage bought by investment consortium
07 In Brief
14 NTT to develop data centers for connected cars

FEATURES
24 Tying down the cloud
27 New data center frontiers
37 Re-shaping servers

OPINION
23 Data centers are fired by a human heart
46 Max Smolaks is a ghost in a shell

REGIONAL FEATURES
14 LATAM: Open source IoT protects Mexican bank
16 APAC: Indonesia ups its game

DCD COMMUNITY
40 Dreaming of Net Zero energy: The best discussion from DCD Zettastructure, for your delectation
42 Preview: DCD>Webscale and EnergySmart: Find out what we have up our sleeves

EDITOR’S PICK
30 Top 10 beautiful data centers
Data centers don’t have to be boring sheds. These ten facilities paid attention to their looks, and we think it paid off. Nominate your favorite for a new DCD Award!

Issue 21 • April/May 2017

From the Editor

HEAD OFFICE
102–108 Clifton Street
London EC2A 4HW

Tel: +44 (0) 207 377 1907

Let's hear it for the humans!

Amazon's S3 storage service went down for several hours early this year, taking large parts of the public-facing Internet with it. The cause was given as "human error."

But what is this "human error" we talk about? An Amazon staffer mistyped a command and deleted virtual servers handling index files. Amazon has now set safe limits to how much capacity a staff member can remove at one go. I'd say the human error was in the original design of the system.

In data centers, we collaborate with ever-more-complex and intelligent hardware. This month (p18) DCD puts the spotlight on this symbiotic relationship.

Could beauty help humans live and work in data centers? A good working environment could help keep staff motivated to handle unprecedented change. This month we feature ten of the most beautiful data centers we know (p30). This year's DCD Awards will include a special prize for data center design, and this month's feature is a first step to a short list, so tell us your favorites.

We aren't looking just for architectural excellence. We want data centers whose physical features reflect the heart and soul of the people who build them.

Software is less visible, but it's changing things, as software-defined data centers (SDDC) come closer to reality. We can build automated services based on pools of compute, storage and networks - see the SDDC supplement that ships with this magazine. The next step is to match these services to business requirements, and deliver what might become known as "business-defined data centers."

Hardware is not standing still meanwhile. After many years of standardization, different shapes and designs of servers are emerging (p37), says Dan Robinson.

And networks have become a crucial tool for data centers, as colocation providers bundle connections to make their own ecosystem stand out amongst rivals. Martin Courtney (p24) finds how these providers are tying down the cloud.

Builders battle spiders, floods, power cuts and armed raiders to deliver data centers to emerging markets (p27). We believe that these facilities get more users online, and help drive human growth in the process.

Peter Judge
DCD Global Editor

"We want data centers to reflect the heart and soul of the people that build and operate them"

22 percent of power outages are caused by human error (Ponemon/Vertiv study, 2016)

MEET THE TEAM
Peter Judge, Global Editor @Judgecorp
Max Smolaks, News Editor @MaxSmolax
Sebastian Moss, Reporter @SebMoss
Tanwen Dawn-Hiscox, Reporter @Tanwendh
David Chernicoff, US Correspondent @DavidChernicoff
Virginia Toledo, Editor LATAM @DCDNoticias
Celia Villarrubia, Assistant Editor LATAM @DCDNoticias
Paul Mah, SEA Correspondent @PaulMah
Tatiane Aquim, Brazil Correspondent @DCDFocuspt

DESIGN
Chris Perrins, Head of Design
Fay Marney, Designer
Holly Tillier, Designer

ADVERTISING
Yash Puwar, Head of Sales
Aiden Powell, Global Account Manager

FIND US ONLINE datacenterdynamics.com datacenterdynamics.es datacenterdynamics.com.br twitter.com/DCDnews | Join DatacenterDynamics Global Discussion group at linkedin.com SUBSCRIPTIONS datacenterdynamics.com/magazine TO EMAIL ONE OF OUR TEAM [email protected]

© 2017 Data Centre Dynamics Limited. All rights reserved. No part of this publication may be reproduced or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, or be stored in any retrieval system of any nature, without prior written permission of Data Centre Dynamics Limited. Applications for written permission should be directed to Jon McGowan, [email protected]. Any views or opinions expressed do not necessarily represent the views or opinions of Data Centre Dynamics Limited or its affiliates. Disclaimer of liability: Whilst every effort has been made to ensure the quality and accuracy of the information contained in this publication at the time of going to press, Data Centre Dynamics Limited and its affiliates assume no responsibility as to the accuracy or completeness of and, to the extent permitted by law, shall not be liable for any errors or omissions or any loss, damage or expense incurred by reliance on information or any statement contained in this publication. Advertisers are solely responsible for the content of the advertising material which they submit to us and for ensuring that the material complies with applicable laws. Data Centre Dynamics Limited and its affiliates are not responsible for any error, omission or material. Inclusion of any advertisement is not intended to endorse any views expressed, nor products or services offered, nor the organisations sponsoring the advertisement.

PEFC Certified: This product is from sustainably managed forests and controlled sources. PEFC/16-33-254 www.pefc.org

4 DCD magazine • datacenterdynamics.com

News Roundup

NTT to develop data centers for connected cars Toyota and NTT have struck a collaboration agreement to develop global technology infrastructure for connected cars.

Schneider launches Micro Data Center (DC) Xpress At DCD Enterprise New York, Schneider Electric introduced a micro data center range which ships in two to three weeks, with a variety of IT equipment pre-installed.

Microsoft donates $465m worth of free cloud services in a year
Microsoft says it provided $465 million worth of free services to 71,000 non-profit organizations and 350 research universities in the past year via its Microsoft Philanthropies charitable arm.

Human error knocks out Amazon Web Services
In late February, an AWS team member mistyped a command when debugging the S3 billing process, and accidentally removed crucial subsystems, causing internal - and external - chaos. 33 of AWS’s own services were impacted, along with countless cloud-based applications and websites.

Vantage bought by Digital Bridge-led investment consortium

West Coast wholesale colo provider Vantage Data Centers has been bought by a consortium of three investors. Infrastructure company Digital Bridge led the acquisition of Silver Lake’s company, with backing from two pension funds: Public Sector Pension Investment Board (PSP) and TIAA Investments.

“Vantage will be the wholesale colocation brand of Digital Bridge,” Vantage CEO Sureel Choksi told DCD. Digital Bridge already owns a data center company, but Dallas-based DataBank is focused elsewhere. Vantage is a wholesale provider in Tier 1 markets, whose customers normally use 1MW or more at a time, filling a suite or a whole floor, while DataBank deals with smaller customers, said Choksi.

“DataBank is a retail-focused colocation and interconnection managed services provider, operating largely in Tier 2 markets,” he said. “DataBank’s customer base typically deploys a small footprint in retail colocation space.”

Vantage has four data centers on its flagship campus in Santa Clara, Silicon Valley, with a total of 51MW, and extensive building plans, which will continue unaffected, Choksi told DCD.

“There will be no change to our current development plans,” he said. “We will continue to build out as fast as we can due to customer demand. This acquisition gives us the opportunity to expand into new markets, given the deep pockets of three investors, who have approximately a trillion dollars under management.”

Choksi will remain in his post, while former Digital Realty CEO Mike Foust, an advisor to Digital Bridge, will join the Vantage board of directors, along with Raul Martynek of Digital Bridge.

http://bit.ly/2nhhK9E

Vox Box

How do we solve the talent gap?
The amount of growth in this industry is huge, and we’re looking for people with all kinds of different experience. But the amount of people coming in from universities is shrinking. Students are going into business, they are going into computer science, and they are going into programming. We’ve been looking for people with experience in different industries, and bringing them in to look at our problems.
Dean Nelson, Head of Infrastructure, Uber Compute
http://bit.ly/2oxJmsr

Are lithium-ion batteries really safe and viable?
A lot of you are thinking “I don’t even want to touch Li-ion batteries in my data center! I’ve heard about Galaxy Notes and hoverboards catching on fire.” It’s a different tech. It’s a chemistry that gives you higher energy density. If you have space or weight distribution issues, Li-ion might be for you. The initial cost is higher than for a VRLA battery, but the TCO is compelling.
Peter Panfil, Vice President, Global Power, Vertiv
http://bit.ly/2nLwpOj


Facebook officially announces massive Sarpy, Nebraska data center

In the final weeks of 2016, the Papillion City Council approved a major data center project in Sarpy County, Nebraska on a 146 acre lot. But the identity of the data center’s owner was kept a secret - or at least it was meant to be, with DCD finding a paper trail outing Facebook as the company behind the scenes.

Now, the social media giant has finally come out of the shadows. At a press conference at Papillion City Hall attended by Governor Pete Ricketts, State Senator John Murante, Papillion Mayor David Black and Tom Furlong, VP of infrastructure for Facebook, the company officially revealed it was behind the project all along.

It plans to open two 450,000 sq ft (41,800 sq m) data center halls, as well as a 70,000 sq ft (6,500 sq m) administrative building. Construction is expected to take about 18 months. Facebook’s Furlong said that the data center campus has been in the works for years, while Governor Ricketts added that it will create hundreds of (temporary) jobs. An earlier proposal put the number of construction jobs at 150.

The Sarpy data center is the second Facebook has announced this year, coming a few months after it confirmed it plans to build a facility in Odense, Denmark.

http://bit.ly/2oy9qrm

Contractor reveals Apple’s $50m Nevada data center plans

Apple is planning a $50 million data center codenamed Project Isabel at the Reno Technology Park in Nevada, where it already has one data center, and another in construction, according to reports.

A contractor filed an application to build the new 373,000 sq ft (35,000 sq m) facility earlier this month - and then promptly withdrew it on the same day, according to Fortune. The application can be resubmitted. Apple first built Project Mills on the Reno site, in 2012, and has applied to build a new data center codenamed Project Huckleberry there. No other details are available.

http://bit.ly/2njQ9Wg

Switch opens the Pyramid campus in Michigan

American cloud and colocation provider Switch has opened the first phase of what’s expected to become the largest data center campus in the Eastern United States, located in Gaines Township near Grand Rapids, Michigan. Switch’s Grand Rapids facility was designed to Tier IV Gold standards and is powered by electricity from 100 percent renewable sources. The first phase of the project has delivered more than 225,000 square feet (20,903 sq m) of white space, located inside the pyramid building that previously housed the headquarters of Steelcase, at one point the largest office furniture manufacturer in the world. The campus is set for rapid growth, and will eventually offer 1.8 million square feet of white space and up to 320MW of power capacity, at a cost of approximately $5 billion.

http://bit.ly/2obvkR8


The Kingdom of Bhutan opens first government data center

The Kingdom of Bhutan has opened a 2,500 square foot (232 sq m) data center at the Bhutan Innovation and Technology Centre in the Thimphu TechPark.

The facility, designed to Tier II standards, already has 22 government services running on it, which take up 60 percent of the current storage capacity of 50 terabytes.

The Nu120 million ($1.8m) facility was funded by India, under its Project Tied Assistance initiative. For the latest PTA, India has committed Rs28,000,000,000 ($430m) in assistance, funding 83 projects, including the e-Governance Program and national broadband master plan that this is a part of. Tied aid is the controversial practice of giving financial aid that must be primarily spent on goods and services from the country giving the aid.

http://bit.ly/2n9QZES

Google announces three new cloud regions: California, Montreal and the Netherlands

Google confirmed plans to launch three new cloud regions - in California, Montreal and the Netherlands. It currently operates six regions, but plans to have more than 17 locations ‘in the future.’

Each region will have a minimum of three zones - geographically diverse data center locations. The search giant did not give a timeline for the rollout of the regions, but it did open a €600 million ($635m) data center in the Netherlands last year.

This year it plans to launch new regions in Mumbai, Singapore, Sydney, Sao Paulo, London, Frankfurt, Finland and Northern Virginia. Brian Stevens, VP of cloud platforms, said: “These new regions will deliver lower latency for customers in adjacent geographic areas, increased scalability and more disaster recovery options.”

http://bit.ly/2mJp8Ox

DCD News Online Your daily fix from the industry



Aggreko to power mystery off-grid Irish data center for two years

Glasgow-based power and temperature control company Aggreko will supply 14MW of gas-generated power to a new data center in Ireland until it moves to the main utility grid network in 2019.

The company behind the data center has not been named, but DCD understands that the client has fast tracked a project, causing it to need power ahead of schedule. Aggreko will also supply an additional 4MW as contingency power for when repair and maintenance is undertaken, for a total of 18MW.

“Data centers are being constructed at such a speed that in some countries the local infrastructure just cannot keep up with demand,” Billy Durie, head of European sector and account development for the company, said. “Temporary gas-generated power makes perfect sense for data centers that need to be operational before a connection is available from the local power supplier, or simply where there is not enough capacity from the grid.”

http://bit.ly/2nwbBs1

Peter’s random factoid Facebook currently consumes around 7.5 quadrillion instructions per second. That’s one million instructions every second for every single person alive.



Chinese CDN ChinaCache to sell its data center business

Chinese provider of content delivery network (CDN) and cloud services ChinaCache is to sell a 79 percent stake in its data center business, ChinaCache Xin Run Technology, for RMB221.2 million ($32.1m) in cash before fees and expenses.

Xin Run will be bought by Tianjin Shuishan Technology, Shanghai Qiaoyong Equity Investment Fund Management Co., and Tianjin Dingsheng Zhida Technology Co, with the companies acquiring 47.67 percent, 26.33 percent and 5 percent of Xin Run, respectively.

“We believe the transaction will enable Xin Run to achieve growth and unlock value, which may further benefit our shareholders. After the transaction, ChinaCache and Xin Run may explore business partnership opportunities to provide our enterprise customers with premium total solutions,” Song Wang, Chairman and CEO of ChinaCache, said.

In addition to being CEO of ChinaCache, Song Wang controls both Tianjin Shuishan and Tianjin Dingsheng. Tianjin Shuishan, which will become the largest owner of Xin Run, will borrow money from Shanghai Qiaoyong or its affiliates to finance its part of the acquisition.

http://bit.ly/2oyU9T0

BASF picks HPE to build 1 Petaflop chemical research supercomputer

The world’s largest chemical producer, BASF, will work with Hewlett Packard Enterprise to build a supercomputer for industrial chemical research at the company’s Ludwigshafen headquarters.

BASF previously turned to HPE for its data center needs, in 2015 outsourcing its two Ludwigshafen data centers to the US IT company, and transferring roughly 100 jobs.

“The new supercomputer will promote the application and development of complex modeling and simulation approaches, opening up completely new avenues for our research at BASF,” Dr. Martin Brudermueller, BASF CTO, said.

The supercomputer will be based on the latest generation of HPE Apollo 6000 systems, and feature Intel Xeon processors, Intel Omni-Path Fabric and HPE management software, with an effective performance of more than 1 Petaflop.

http://bit.ly/2mQf0EZ

Ford to build $200m Michigan data center

Ford plans to build a $200 million data center in Flat Rock, Michigan, its second new facility after its previously revealed Dearborn, Michigan data center.

Details on the size and contents of the data center are lacking, but Ford said that it expects its data storage requirements to increase from its current 13 petabytes to more than 200 petabytes in 2021.

This expected jump in storage requirements comes as Ford tries to rebrand itself as a mobility company, as its core automobile business comes under threat from Silicon Valley’s self-driving endeavors and the rise of ride sharing.

In 2016, Ford and Baidu invested $150 million in Velodyne, a manufacturer of lidar (light-based ranging) sensors used in self-driving cars. The company then announced that it planned to have a fully operational self-driving car - without a steering wheel - by 2021, although it is likely the car would be used only in certain city centers and see its speed limited.

Software for the car will be developed in part by Argo AI, an artificial intelligence company founded by the ex-heads of Google’s and Uber’s self-driving divisions. Last month, Ford announced that it would invest $1 billion over the next five years in Argo, becoming majority shareholder. “There’s a war for talent out there,” Ford CEO Mark Fields said at the time.

http://bit.ly/2nynyxp


Facebook refreshes its servers at OCP Summit

Facebook has announced a complete refresh of all of its servers, and shared details of the new range with the Open Compute Project, the open source hardware group which it established in 2011. The four models include a storage server, two compute servers and a specialist appliance designed to train neural networks.

The Bryce Canyon storage server has a 20 percent higher HDD density and a fourfold increase in compute capability over its predecessor, Honey Badger.

The Yosemite v2 compute server is intended for scale-out data centers. It’s designed so that the sled can be pulled out of the chassis for components to be serviced, without having to power the servers down.

Tioga Pass is a new compute server with dual-socket motherboards and more I/O bandwidth for its flash, network cards, and GPUs than its predecessor Leopard.

Big Basin is the server that trains neural networks - it replaces Big Sur, offering more memory and processor power.

Vijay Rao, Facebook’s director of technology strategy, said: “Open hardware increases the pace of automation. It makes it possible for everyone to work at the speed of software.”

http://bit.ly/2mbCwrw

Microsoft shows ARM data center servers

Microsoft has announced support for ARM-based servers in the data center, including them in the latest stage of its Project Olympus open servers, and even porting Windows Azure to run on ARM.

ARM-based chips from Qualcomm and Cavium are in two of Microsoft’s new Olympus “universal motherboard” designs, available as open source, Microsoft’s general manager for Azure hardware Kushagra Vaid said. The company also ported Windows Azure to run on ARM - but only for internal use, rather than for external customers.

“Microsoft has accelerated the timetable for cloud data center ARM processor adoption,” commented Paul Teich, principal analyst at Tirias Research, to DCD. “We are still six months before Cavium and Qualcomm deliver these chips to market, and this will get other cloud giants interested. It completely scraps the IDC and Gartner forecasts for ARM servers - and it’s a good move for Microsoft.”

Microsoft’s Project Olympus, its open source blueprint for hyperscale server hardware, was first announced at DCD Zettastructure last year.

http://bit.ly/2okfH6J

Nielsen’s TV audience ratings delayed by data center outage

The data center of broadcast analytics specialist Nielsen suffered an outage in mid-March, causing ongoing delays in TV ratings for programs aired on a number of networks over a weekend.

The facility, located at the company’s Global Technology and Innovation Center on 1 Nielsen Way, Oldsmar, Florida, initially reported an unspecified “technical issue” at 9am ET on Sunday. The company then told clients it had indeed suffered a blackout at 11am ET, but the data center power was back on.

Data was collected despite the outage, but the systems had to be rebooted in order to process and generate the ratings, causing the delays. Nielsen is yet to explain how a power issue could disrupt its servers, which should have had backup power in place.

Ratings trickled out days later, but the delivery remained slow and intermittent. Although delayed TV ratings aren’t particularly dramatic for network ad deals, which are negotiated based on data collected up to a week after airing, the incident does not bode well for the company, which has been criticized for failing to adapt to modern viewing habits. According to The New York Times, $70 billion in advertising dollars is traded in the United States every year solely based on Nielsen’s ratings, but networks appear willing to leave the company behind if it fails to step up its game.

http://bit.ly/2obzh8a

5,600
The number of US federal data centers still remaining after seven years of consolidation, down from over 10,000 (Government Accountability Office)

http://bit.ly/2nmVfRB

The project at a glance

Team
• Rogelio Garcia Cabrera, master of computer science
• Arturo Benitez Mejia, master of innovation and renewable energy
• Dzib Jesus Jose Sanchez, mechatronics engineering
• Diego Sabino Hernandez, electronic systems engineer

Development period
• January 2015 to July 2016

Load monitored
• 650kW

Data center
• Facility Grupo Salinas in Mexico City

Hardware used
• Arduino and Raspberry Pi

Advantages
• Modularity, sustainability, low cost, high speed of deployment, no licence required

Open source IoT protects Mexican bank

Grupo Elektra-Banco Azteca is using Raspberry Pi and Arduino to predict and avoid equipment failures. Virginia Toledo finds out more

Banco Azteca is one of Mexico’s largest banks, with operations in six Latin American countries. It was created in 2001 as a financial services subsidiary for the retail firm Grupo Elektra, which was set up in 1950 by Hugo Salinas Rocha.

To reduce the risks of failure in the bank’s data centers, Grupo Elektra-Banco Azteca’s vice president of systems Manuel Gonzalez came up with the idea of a monitoring system which could detect dangers and warn of possible failures. The bank built a detection system in-house, and launched it first in the Grupo


Salinas data center in Mexico City in early 2015. This project was completed in July 2016, and the bank plans to roll it out to more data centers.

The system is intended to assess the quality of energy delivered, the quantity of energy consumed and other metrics such as temperatures across the site.

“One of our main requirements was that these new systems were not independent, but quite the opposite. They must be able to be integrated into a single solution, so that in the future, one system can give us information about as many factors as possible,” says Rogelio Garcia Cabrera, director of data centers in Latin America for Banco Azteca.

The bank analyzed several market-leading data center infrastructure management (DCIM) products but none of them met its specific requirements, so the company decided to create its own tool. The result is not a traditional DCIM app, but one tailored to the company’s needs.

The project’s main innovation was to use open source software, and cheap high performance system-on-a-chip (SoC) microcontrollers and microcomputers - specifically the Arduino and Raspberry Pi. Open source is becoming more popular in the industry since it allows for constant improvement, while knowledge and new developments are shared with anyone who wants to use them.

Garcia Cabrera’s team developed modules which could be connected to the mission-critical equipment, in order to provide an early event detection system. The system developed by the team monitors energy delivered in real-time, determines the energy consumption of hardware, and gives an indication of the amount of wear in the cooling systems. It also offers a look at temperature levels and humidity inside the site.

“The advantage of a homogeneous system like this is that it allows us to determine how long a condenser has been overheating,” says Garcia Cabrera. The condenser may be overheating for some time before it causes a temperature rise in an area within the facility. The useful life of the condensers will be affected by the length of time they are running too hot.

Implementation of the system is quick. In just a few minutes, sensors to measure current, voltage, temperature and humidity can be installed, connected to processor cards, and start transmitting information to a database and a real-time display panel.

It is also easy to remove and repair sensors and cards, because of their low cost. The cards and sensors are easy to buy in electronics stores and, if necessary, the group could make its own cards and be independent of suppliers. This is all due to the work of the open source community which publishes the schematics for free, in an official repository.

The arrangement of sensors and circuit boards is not static, as it allows the group to integrate more advanced cards without modifying the sensors. For example, information can be transmitted using GPRS, but by simply changing the communications module, it can be upgraded to 3G technology.

This structure also makes the system more secure than commercial systems on the market, its creators say. There has been a surge of attacks based on unsafe Internet of Things (IoT) devices, but the bank’s early warning and monitoring system can be kept safe because it is possible to update the firmware remotely. As a new security breach is identified, it can be solved and corrected long before all the devices are compromised.

“This information gives us a new opportunity to not only detect early, but to predict the behavior of the infrastructure,” Garcia Cabrera says.

At the moment the system is being introduced to more and more equipment in the Banco Azteca’s facility, and the information gathered will allow the team to identify patterns of behavior. This data, combined with prediction algorithms, can provide an accurate model that can foresee equipment failures before they happen. This will lower maintenance costs and help avoid unscheduled downtime. Eventually, the system could be fine-tuned using the tools of advanced analytics, big data and machine learning.

“Data center managers do not need to be passive consumers of products”

With this kind of development, data center managers do not need to be passive consumers of products anymore, but can start to develop technology according to their needs, says Garcia Cabrera: “This will be considered one of the most important applications of the IoT. This change is taking place in the most advanced data centers around the world, and a project like this will bring on the so-called Fourth Industrial Revolution.”
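The bank's own code is not public, and the article describes the logic only in outline. As a rough illustration of the kind of early warning rule Garcia Cabrera describes - tracking how long a condenser has stayed above a safe temperature - here is a minimal sketch in Python, a natural fit for the Raspberry Pi side of the system. The class name, the 45°C threshold and the readings are all invented for this example; a real deployment would feed in measurements from the Arduino sensor cards rather than a hard-coded list.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class OverheatMonitor:
    """Tracks how long a sensor reading has stayed continuously above a threshold."""
    threshold_c: float
    _over_since: Optional[float] = field(default=None, repr=False)

    def update(self, timestamp_s: float, temp_c: float) -> float:
        """Feed one reading; return seconds spent continuously overheating."""
        if temp_c > self.threshold_c:
            if self._over_since is None:
                self._over_since = timestamp_s  # overheating just started
            return timestamp_s - self._over_since
        self._over_since = None  # back to normal, reset the clock
        return 0.0

# Simulated condenser readings, one per minute (timestamp in seconds, temp in °C)
monitor = OverheatMonitor(threshold_c=45.0)
readings = [(0.0, 40.2), (60.0, 46.1), (120.0, 47.5), (180.0, 48.0), (240.0, 41.0)]
print([monitor.update(t, temp) for t, temp in readings])
# [0.0, 0.0, 60.0, 120.0, 0.0] - the condenser ran hot for two minutes, then recovered
```

The cumulative durations this produces are exactly the kind of signal the article says feeds the bank's database and prediction models: a condenser that keeps clocking up overheating time is wearing out, even before any room temperature alarm trips.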

Indonesia ups its data center game

An upgrade to the power infrastructure and new financial data regulations could be just what the country needs, says Paul Mah

With an estimated population of 263 million, Indonesia is the fourth most populous country on Earth and the largest economy in Southeast Asia. Data centers there have some ground to make up, but they are expanding rapidly.

This is due to developments such as e-commerce, on top of more frequent Internet use, and upcoming trends such as the Internet of Things (IoT) driving demand for reliable colocation. In 2014, some 15 percent of the first two floors of NTT’s Jakarta 2 data center was in use. Now, the fourth floor of the eight story building is currently being filled up.

Facilities built in Indonesia now are more reliable, and operators are working hard to meet booming demand while complying with regulations, although sourcing reliable power and trained staff remain key problems, according to Alvin Siagian, vice president and director of NTT Indonesia.

“Data centers built in this country prior to 2009 are Tier I and Tier II. However, companies are starting to understand the benefit of data centers. Businesses that were previously immune to frequent downtimes are starting to realize they can no longer avoid or pretend to avoid the importance of data centers,” said Siagian.

Another consideration is stricter requirements for the financial sector, thanks to the implementation of Government Regulation 82 of 2012 (PP82/2012), which prohibits financial data from being kept outside the country without prior approval.

The regulation calls for each vertical industry to adopt its own set of rules. In November 2016 the Financial Services Authority of Indonesia introduced its own sub-regulation called IT Risk Management, which adopts PP82/2012 in its entirety. This will increase the demand for services from local data center operators.

Paul Mah, SEA Correspondent


“Data centers [serving the financial sector] must also abide by banking rules and must meet IT compliance rules. A lot of providers are not prepared,” he concluded.

Indonesia’s power supply is unstable, with rolling blackouts caused by shortfalls. Uneven electricity distribution was seen as the main challenge in Indonesia by an earlier IDC data center index of the Asia Pacific region. With a country-wide undersupply and an electrification ratio of just 74.4 percent, the energy situation is compounded by Indonesia’s complex geography, with thousands of islands, and increased consumption that surged by as much as 50 percent in the decade up till 2013.

The government is working to reduce energy and fuel subsidies to free up more of the budget for infrastructure - and now has an ambitious expansion plan to build 291 generation plants, over a thousand substations, and 47,000km of new transmission and distribution lines, according to a report by the Lowy Institute for International Policy.

Siagian hopes power privatization could ease matters. At present, state-owned utility Persero-Perusahaan Listrik Negara (PLN) remains the sole purchaser of power output and has a monopoly on the transmission system. In the meantime, providers have options in the form of independent power stations, such as DCI Indonesia’s JK1 data center, which has access to the power generating plant in Cibitung Industrial Estate.

Costs in Indonesia can be deceptive, says Siagian. Thanks to the comparatively low cost of land and building in Indonesia, it can look a lot cheaper on paper to build a self-operated data center than to go down the outsourced route. In fact, merely tallying up the land and building cost does not offer a realistic representation of the TCO (total cost of ownership), said Siagian. This is because such an approach leaves out the cost of ensuring uninterrupted power, the cost of hiring skilled staff, and the business cost of outages.

The importance of hiring professionals with the right operational experience cannot be overemphasized. Drawing upon his decades of experience in the IT field, Siagian noted: “Data centers used to revolve around M&E (mechanical and electrical infrastructure) in the past. But running a critical data center today, we need to understand the IT side of it. Where do you put your server, how do you scale it up [or] scale it down?”

There is also room for outsourced data center operators to improve, too. “Some of your applications, or lines of business, don’t need a high level of uptime. You need to know how you benefit your client so that you can scale up and down,” he said. Siagian also pointed to private cloud infrastructure such as hyper-convergence, with its power dense hardware, and said: “It is about scaling up, and yet being able to maintain the uptime.”

It is clear that Siagian is passionate about hard deliverables such as uptime. NTT Indonesia is a member of the Association of Data Center Providers (IDPRO), which is currently working with the government to help design a framework for standards and service level agreements (SLAs) that can be widely used. “I [would] rather work by offering an SLA, and standing by it. Tell me what your penalty is, and let me prove it to you,” said Siagian.

Indonesia
Capital • Jakarta
Population • 263 million
GDP • $940 billion
Energy • Indonesia exports coal, gas and oil. It has good resources in renewables, including wind, hydro and geothermal power.
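Siagian's point that land-and-building comparisons are deceptive is ultimately arithmetic. The sketch below uses entirely hypothetical figures (invented for illustration, not taken from NTT or this article) to show how the comparison flips once power, staff and outage costs are counted:

```python
# Illustrative sketch of Siagian's TCO argument. Every figure below is
# hypothetical - invented for comparison, not taken from NTT or DCD.

def self_build_tco(years: int) -> float:
    """Full cost of a hypothetical self-built facility over a horizon."""
    land_and_building = 5_000_000     # the number that looks cheap "on paper"
    power_infrastructure = 2_000_000  # UPS, generators, fuel for an unstable grid
    staff_per_year = 400_000          # scarce skilled operational staff
    outage_cost_per_year = 250_000    # expected business cost of downtime
    return (land_and_building + power_infrastructure
            + years * (staff_per_year + outage_cost_per_year))

def colocation_tco(years: int) -> float:
    """Cost of renting equivalent colocation capacity instead."""
    colo_fees_per_year = 1_200_000    # the provider carries power, staff and SLA risk
    return years * colo_fees_per_year

# On paper: $5m of land and building beats $6m of colo fees over five years.
# In full: the self-build comes to $10.25m once the hidden costs are counted.
print(self_build_tco(5), colocation_tco(5))
```

With these made-up numbers, the self-build looks cheaper if only the land and building are tallied, and costs substantially more once the running costs Siagian lists are included.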

The lights are going out in data centers - so why is there anyone still at home?

Data centers designed for reliability will only actually operate at a high reliability if they are staffed by people who are able to handle the demands, says Saville-King: “You can have a facility that is certified for its Tier IV design, but it’s at risk based on the people - unless you invest in training and preparedness and responsiveness.”

Data centers are now so fully automated that they run almost unattended. They don’t need people anymore, and their basic environmental needs seem to have diverged from those of human beings. Do data centers still need people in them?

For day-to-day operations, IT hardware is better off in an environment that isn’t designed around people (see On life support). It’s also cheaper to build facilities out of town, and close to sources of power. But data center operators say they still need warm bodies.

“No matter how much investment you make in construction, if you didn’t invest in addressing the human part of the operation, you could devalue the whole investment from day one,” warns Paul Saville-King, managing director of CBRE Data Center Solutions, a division of the specialized real estate firm CBRE.

One simple factor to encourage good people is to be where they want to be. Data centers work better when they are close to people, says Andrew Fray, UK managing director of colocation provider Interxion: “The only way I can explain it is people exert a gravitational pull. In this world of amazing technology, it’s still important to have human contact.”

Fray’s urban data centers couldn’t be closer to people. On London’s Brick Lane, the key amenities are easy travel, excellent curry restaurants, and 24 hour bagels in the same street. It seems food is a factor attracting business from the comparative sterility of nearby Docklands.

Even Slough, the out-of-town hub which was decried as a soulless wasteland by John Betjeman and celebrated by David Brent, has one big draw for humans: its proximity to Heathrow Airport.

620k people work in data centers (DCDi, 2015)

Peter Judge Global Editor


Advocates of the cloud dismiss these colocation facilities and their customers as “server huggers,” people who are fixated on their IT at the expense of their real business. But there are industries that still see benefit in installing and configuring their own equipment - and some organizations where regulations or policies actually demand it.

IT is still changing fast enough that hardware upgrades are sometimes required. And systems running remotely will still have to be audited. This will mean actual physical checks on what is in the racks.

So data centers still have to be designed for human beings to access. A large webscale facility may be virtually automatic, with batch-driven hardware upgrades, but there’s still a handful of people there.

Colocation spaces have more bustle. The space provider, and any networks and cloud operators, will need to come in, while end users will need to tend the IT equipment in their racks. With people coming and going every day, a colo campus needs complex security - so that each set of users can always access the kit they need to get to, but everyone’s equipment remains private. Biometric access is normal at all levels, from the site down to corridors, halls, cages and individual racks.

A popular colo will host a virtual community, with business partners making fast links between servers located in the same site (so-called “East-West” traffic). Meeting rooms and office facilities can make this physical, giving executives and IT staff a place to plan their collaborations.

On life support

IT systems don’t want the same things as people. They generate heat, and it’s far more efficient to let that heat warm the aisles up, rather than waste energy cooling it down. Downstream of a rack of air-cooled servers, the temperature should be distinctly uncomfortable for a human being.

Servers don’t need light to see by, or kitchens, and the only plumbing they need is for any water cooling the more advanced systems might be using. As cloud systems have become more automatic, some data centers have moved to lights-out operation, shedding much of the life support humans demand.

Electronics are better off without oxygen, which makes fire and corrosion possible. In an extreme example, Microsoft’s Project Natick has tested a micro data center operating underwater. The rack is in a pod that’s too small for a person to get into, even if it had an airlock, and it’s full of nitrogen. On the sea bed, Natick operated without human intervention for months on end. Any servicing and repairs had to wait until it was hauled up to the surface.

There are issues to deal with, however. Data centers are designed around the

"No​ ​matter​ ​how​ ​much​ ​ investment​ you make ​in​ ​ construction,​ ​if​ you don't address ​the​ ​human​ ​part​ ​ of​ ​the​ ​operation, you could ​devalue​ ​the​ whole ​ investment from day one,” Paul Saville-King, CBRE

needs of servers, so anyone entering that space will have to watch their step. Raised floors and equipment can be a hazard, as well as the presence of high voltage circuits.

There should be induction courses. You may be fully aware of the risks, but if a data center lets you in without a reminder of the ground rules, be wary. A health and safety violation by a less-experienced visitor or staffer could lead to an outage for your services.

Despite this, human factors programs are still the exception, warns Saville-King. Training should not just be on the technologies, but also cover “soft skills,” he says. These include communications skills and assertiveness, both vital when a problem develops and cooperation is needed to deal with it. “We must develop people that can make good decisions under pressure,” he says. “They must be properly trained, repeatedly drilled, and given practice at working under varying conditions.”

2015: the year when IT outnumbered facilities (DCDi)

The overall job is changing. The sector is still expanding rapidly, but the number of people required to operate a data center is going down, with thirty people or less running a huge webscale facility. “A lot of people build and commission a data center. Surprisingly few people work inside it,” says Peter Hannaford of specialist staff consultant Datacenter People.

Data center companies need C-level executives and sales people, but recruitment there goes in fits and starts, he says. Around 60 percent of jobs are for operational engineering staff. Thanks to the continued growth of data centers, there’s high demand, and staff are being drafted in from other critical environments, such as oil and gas and the military. But while demand is steady, the required skills are changing, observes Hannaford: “There’s a new breed of data center technician, or data center engineer, with multiple skills.”

There are two major problems with recruiting people for data centers, according to Dean Nelson of Uber Compute. Firstly, the people who would be great for the job aren’t aware of data centers, and aren’t applying. And secondly, the skills are not fixed. If you hire someone with deep skills suitable for today’s technology, they will become much less useful when that technology becomes obsolete, which could happen at any time.

“Most people don’t know this industry exists,” says Nelson, “and we don’t have the university exposure, so people aren’t excited about jobs in this field.” Instead of data centers, students are going into other jobs in business, computer science, or programming.

Nelson’s answer to the need for human skill is to professionalize. He set up Infrastructure Masons (see Are you an

"Most people don’t know this industry exists, and we don’t have the university exposure, so people aren’t excited about jobs in this field,” Dean Nelson, Infrastructure Masons / Uber Compute


Infrastructure Mason?), a professional body which aims to encourage everyone in the data center sector to up their game, with an awareness of the responsibility they have for the world’s data and its infrastructure.

Are you an Infrastructure Mason?

Dean Nelson, of Uber Compute, says Infrastructure Masons is a professional organization for the people that make the digital economy work. “We build the underlying infrastructure that enables the Internet of everything. We are trying to bring this community of people together to collaborate.”

The profile of a Mason is about experience, economics and stewardship. Experience relates to the Mason’s years of experience in hardware, software and infrastructure. The economics factor depends on how much capital expenditure and operating expenditure a Mason has been responsible for during his or her career. The stewardship factor is about the number of industry groups the Mason has been involved with or had leadership positions at.

Collectively the founders of the group have hundreds of years of experience, and have handled many billions of dollars of capital. The group wants to encourage more people to reach this level of experience.

“This is where infrastructure professionals connect, grow and give back,” says Nelson.

Join the Infrastructure Masons here: http://imasons.org/join

The people they are looking for may not be in data centers right now, says Nelson: “We’ve been looking for people with experience in different industries, and bringing them in to look at our problems.”

Sometimes people from adjacent industries bring invaluable experience, as Christian Belady, head of cloud infrastructure strategy at Microsoft, found when he hired a senior executive from the dairy industry. Unlikely as it may sound, her experience in handling a supply chain for a perishable - but valuable - commodity was directly applicable in data centers, and her experience in an environmental plant also came in useful.

Mechanical and electrical systems used to have their own staff. Now the roles are combined in one person, and the work is becoming more IT related, for two reasons. “It is easier to learn about mechanical and electrical systems than it is to learn the IT bit,” says Hannaford. And the physical infrastructure is now monitored and controlled by data center infrastructure management (DCIM), or a building management system (BMS), effectively placing the building under IT’s control.

“When the service goes down, the power guy will think if there is something wrong with the power, and the mechanical guy will think about the cooling. The IT person is probably bridging the two.” If you only have one person in your network operation center (NOC), it should be a generalist with IT skills, who can identify the problem and call in appropriate mechanical or electrical experts.

So data center people all need IT skills now. “If you're looking at an infrastructure job because coding gives you the willies, there's bad news,” says commentator Dan Rosenbaum in a blog sponsored by HPE. “You've got to get with the programming if you're going to have a career.”

As well as tech, those soft skills are needed, he continues: “If you think a data center job is a good place to hide from people, the news is even worse, because soft skills are increasingly important as well.” It’s such a potentially diverse role that smart firms take people with the right


intellect and behavior, says Hannaford: “You can change their experience and skills. You can’t change anyone’s intellect. Behaviors you can modify, but it’s not easy. In theory you could hire someone with no experience.”

Eventually a lot of this will be automated, and the number of operational staff may shrink even further. The construction staff could also decrease, as firms adopt a modular cookie-cutter approach where identical rack systems are produced in a factory-like environment.

Support and maintenance will increasingly be automated and provided by non-human systems, or by remote humans interacting through augmented reality or virtual reality, “projecting their consciousness” and operating robots where they are needed, says Saville-King.

You might expect that fewer humans will increase the reliability of the system, as human mistakes are programmed out of it (see To err is human). But this might not be the case, warns Saville-King: “If the resilience is built into the technology, then the sort of incidents people respond to will shift to the more critical end of the scale.”

One can imagine a situation where the facility runs itself, and bored operators wait for an incredibly rare instance where a decision is required. Given the importance and the rarity of those instances, how can they be trained and motivated? “People will be even more important in the leaner models of the future,” Saville-King predicts.

None of these scenarios are the end for people in data centers, as a standardized process will stagnate. “If you keep doing what you did before, there will never be any innovation,” says Hannaford. “We rely on people to innovate.”

Data centers have already changed from how they were ten years ago, and more changes are afoot. Some facilities are throwing out features like raised floors and 600mm floor panels. “If a smart engineer designed a data center from scratch, it would look totally different to what we have,” says Hannaford.

Humans are driving that level of change, and humans are needed to handle it. Data centers will always need creativity.

To err is human

The presence of humans implies the possibility of human error. Even the most reliable systems will eventually fail due to human error, after 150,000 to 200,000 hours.

This figure comes from the classic “Managing Risk: The Human Element” (2008) by nuclear scientist Romney Beecher Duffey and aviation regulator John Walton Saull. The figure can be reduced by human ingenuity applied to the system to minimize the dangers of error.

The physical design of a data center should include basic things like electrical safeguards, and clearly labelled emergency features, and the operator needs to make sure anyone in the building understands these.

A junior staff member was the last out of one facility, and found that his swipe card didn’t work at the exit. Hoping it would open the door, he pressed a button. It turned out to be the emergency shutdown. He was left trapped in a dark building full of idle equipment that was losing money for customers.
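For scale, the Duffey-Saull figure of 150,000 to 200,000 hours converts to calendar time as follows (a trivial calculation, assuming the system runs continuously):

```python
# Converting the Duffey-Saull figure into calendar time, assuming a
# system that runs continuously (24 hours a day, 365 days a year).

HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def hours_to_years(hours: float) -> float:
    """Operating hours expressed as years of continuous operation."""
    return hours / HOURS_PER_YEAR

# 150,000-200,000 hours is roughly 17 to 23 years of continuous running -
# well within the planned service life of many facilities.
print(hours_to_years(150_000), hours_to_years(200_000))
```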

Higher up the stack, human error can blight software design and operation. When the Amazon Web Services storage function, S3, failed for four hours in February, it caused a cascade of failures in web services that rely on S3. The cause was ultimately revealed as “human error:” an admin tidying up some unused virtual servers had mistyped a command and deleted vital resources, leaving S3 unable to function.

Of course, the real human error was in the design of the system that made that command possible, allowing admin tasks to have more privileges than they need - and outside of AWS, those websites and services that rely on S3 have designed in a single point of failure. That human failure cost businesses an estimated $150 million.
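The design lesson - that a routine admin command should not be able to take out more capacity than is safe - can be sketched as a guard rail. This is an illustrative toy in the spirit of the limits Amazon later imposed, not AWS's actual tooling; the names and the 10 percent threshold are invented:

```python
# Hypothetical guard rail: refuse any single command that would remove
# too large a fraction of a server pool. Illustrative only - the names
# and the 10 percent limit are invented, and this is not AWS's tooling.

class CapacityGuardError(Exception):
    """Raised when a removal request exceeds the safety limit."""

def remove_servers(pool, to_remove, max_fraction=0.1):
    """Return the pool with servers removed, if the request is safe."""
    if to_remove > len(pool) * max_fraction:
        raise CapacityGuardError(
            f"refusing to remove {to_remove} of {len(pool)} servers in one step")
    return pool[:-to_remove] if to_remove else pool

pool = [f"index-{i}" for i in range(100)]
pool = remove_servers(pool, 5)    # fine: within the limit
# remove_servers(pool, 50)        # would raise CapacityGuardError
```

The point is that a mistyped number then fails loudly at the tool, instead of silently deleting the resources a service depends on.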

Opinion

Data centers are fired by a human heart

Whatever happens to the technology, human relations are still central to innovation in the data center, says Andrew Fray

Andrew Fray
MD UK, Interxion

With the world mesmerized by new advances like the Internet of Things and cyber-physical systems, it is easy to forget that people, not technology, are the real driving force behind innovation.

Long before we had the cloud, robotics, or software-defined networking, rapid urbanization ignited an innovation explosion by concentrating varied industries, people and ideas into one place. 18th century London became the center for commerce and creativity because the city was a rich mix of different outlooks, skills and approaches. As the number of people, businesses and opportunities in London boomed, the city attracted even more talent and investment, creating a gravity well of exponential innovation.

In London today, I see colocation data centers as ‘digital cities,’ which have become a focal point for 21st century innovation. By uniting diverse digital business in a common marketplace, colocation enables innovative people to capitalize on rocketing data creation and connectivity. Customer communities are emerging around these shared challenges, as the sheer ‘gravity’ of dense data, connectivity and human expertise draws companies towards the center.

With a myriad of different businesses linked together inside a single facility, colocation lets organizations develop new ideas more easily and quickly. For the price of an optical fiber cross-connect, businesses can share data, capabilities and services with a ready-formed hub of potential suppliers, partners and customers. Colocation creates a rich marketplace platform where we can do what we all love to do: innovate.

Data centers are not simply somewhere to store your IT kit or your data; they rely on trusted human relationships to map out strategies that can flex and adapt with our rapidly changing world. Inside the data center, access to the right people is just as important as access to the right technologies. Automation is on the rise, but skilled, clever staff remain central to a high-quality experience.

With more than 60 percent of data center operators concerned about a lack of suitably qualified people, colocating in urban areas provides access to scarce skills. Out-of-town colocation can offer upfront cost-savings, but can also sever vital relationships.

In an increasingly global world, intellectual gold is still being mined in city coffee shops and restaurants, just as it was in the 18th century. Even if the robots start to take over, innovation will still be driven by clever disruptive minds gathered at the center, not the periphery. While data centers provide us with a technical platform, it pays to remember that the data center has a human heart.

Tying down the cloud

Data center companies hope that private/hybrid interconnects will guarantee them a place in the cloud services ecosystem, says Martin Courtney

The slow but sure migration of enterprise applications and services into the cloud is having a significant impact on demand for data center capacity and colocation, and operators are being forced to rethink their business models. Many have identified the interconnects which link one data center to another as a key area of growth – the chance to occupy a critical position in the cloud service chain which leaves them less likely to be bypassed in favor of direct enterprise connections into facilities owned and operated by the likes of Amazon Web Services, Microsoft, Google and IBM.

Martin Courtney
Freelance analyst

Equinix Cloud Exchange is a case in point. It is a platform which effectively replaces the direct virtual private network (VPN) links which customers might make. Instead, Equinix hosting customers get a direct link to the cloud provider using AWS Direct Connect or Microsoft Azure ExpressRoute, for example.

Cloud Exchange is touted as taking much of the complexity and management overhead out of those direct relationships, particularly where a single customer might have multiple direct connections to many different cloud service providers. It also delivers a performance advantage compared to routing traffic from the customer directly to the CSP via the Internet, said Mike Winterson, managing director of Equinix services, with an additional layer of security providing extra protection against distributed denial of service (DDoS) attacks.

Cloud Exchange does appear to be gaining some traction with customers, currently handling over 1,500 direct connections involving hundreds of thousands of physical connections, ports and switching equipment. For the moment, it handles direct connections into AWS, Microsoft Azure, IBM SoftLayer, Google Cloud and Oracle FastConnect, as well as ServiceNow, WorkDay and Salesforce.com, but Equinix is planning to add other providers in the future.

Some would argue that puts Equinix firmly in the role of a cloud broker. But rather than moving towards being an aggregator itself, the company is working with other service providers and systems


integrators that want to offer Cloud Exchange capabilities under their own brand. One example is Beeks Financial Cloud, a UK-based company offering various IaaS, colocation and connectivity services to the financial sector, with Equinix estimating around 300 are deployed globally addressing additional markets such as healthcare and insurance.

“We are unlikely to move to a world where [Equinix] can address huge numbers of companies in the enterprise space, so we need to develop relationships with others who can use the lego bricks from AWS and network service providers,” said Winterson.

Digital Realty launched its Service Exchange brokerage late last year in partnership with Australia-based software defined networking (SDN) interconnection provider Megaport. Like Cloud Exchange, Service Exchange is aimed at customers looking to implement large scale hybrid cloud deployments which span multiple cloud providers and locations in different countries, but need a middleman to deliver secure interconnects between public and hosted private cloud architecture, better control, and more dynamic provisioning and configuration for the kind of hybrid networks they need.

Service Exchange provides direct access to the major IaaS players, including AWS, Microsoft and Google, and is working with additional SaaS players to bring their applications into the fold. Digital Realty chief technology officer Chris Sharp (who was formerly responsible for cloud strategy at Equinix) says cloud service providers themselves can use data center companies as a channel to reach more enterprise customers, saving them time on go-to-market initiatives and helping them deliver private cloud services quickly.

“We never want to look like a cloud broker, that is negatively viewed by all parties because the broker gets caught in the middle of whatever deal they have negotiated,” he said. “But we spend a lot of time on the service processes to make all these clouds interact.”

Data4, based in France, is also aiming squarely at enterprise demand for hybrid IT architecture, targeting early customers in financial services and manufacturing industries with high performance computing (HPC) requirements that scale out private cloud workloads to AWS’ public cloud platform on demand.

“We are positioning ourselves at the convergence point between the two – a place where customers can develop and build their own hybrid IT securely and economically and have access to a portfolio of multiple [cloud-hosted applications and] services,” said Data4’s head of marketing Loic Bertin.

To that end, Data4 signed a partnership deal with third party cloud provider InterCloud in January this year. The agreement is designed to give Data4 customers direct access to hundreds of cloud service providers on a global basis, all facilitated via a single virtualized data center fabric spread across Data4’s 14 hosting facilities across three sites in France, Italy and Luxembourg.

InterCloud’s Cloud Delivery Platform currently connects to over twenty IaaS providers, including AWS, Microsoft Azure, Google Cloud Platform and IBM SoftLayer, and in excess of thirty SaaS vendors, including Salesforce.com, Office 365, Box, ServiceNow and Cisco-owned Webex. Again, Data4 does not provide the cloud resources itself but delivers the interconnects between the data center and the cloud gateway using existing points of presence (POPs).

“Of course, the CIO could buy a direct connection to each data center where the required cloud resources are hosted, but that is quite complex and he or she would have to speak to many different providers,” added Bertin. “They would have to sign a dedicated contract relative to each connection, in each country.”

All three companies acknowledge that continuing incursions from super scale cloud service providers are impacting their traditional colocation business, but see opportunities for peaceful coexistence and indeed mutual success through subtle repositioning.

“We had a serious look at whether cloud was friend or foe about three to four years ago, from the board level down,” said Equinix’ Winterson. In the end, he concluded that this is a major change and we are all along for the ride: “The IT industry goes through hype cycles on a regular basis – the ASP market 16 years ago was a precursor to the cloud that never took over.

“But cloud is different, you cannot stop it, you just have to transform with it or you will be sitting very uncomfortably in a few years’ time.”

Why data centers want a seat at the hybrid cloud table

Analysts predict significant expansion of hybrid cloud, and/or the implementation of hybrid IT initiatives that rely heavily on different types of cloud architectures, amongst public and private sector organizations over the next few years.

In its report ‘Market Trends: Cloud Adoption Trends Favor Public Cloud With a Hybrid Twist,’ Gartner outlines its belief that increased use of multiple public cloud providers, combined with growth in various types of cloud services, will create a multicloud environment and a need to coordinate cloud usage by utilizing hybrid architecture, for example.

Elsewhere, 451 Research (‘2017 Trends in Cloud Transformation’) predicts that CIOs will accelerate their use of AWS and other public cloud services. These will be tied into a ‘blended cloud strategy’ that does not lock companies into a single vendor or hosting location, but matches application requirements, workloads and service requests to the best provider and data center venue.

Serious obstacles may delay those hybrid initiatives, however, not least of which are integration challenges, application incompatibilities, the absence of management tools and common application programming interfaces (APIs), and limited vendor support.

If Equinix, Digital Realty, Data4 and others offering cloud exchange or brokerage services can convince IT departments they can help overcome those problems, their own transformation from colocation and hosting providers to indispensable enterprise cloud infrastructure partners looks assured.

$91.7bn: predicted size of the hybrid cloud market in 2021 (MarketsandMarkets)

> Webinars | 2017

ON DEMAND Making the Case for Lithium-Ion Batteries in the Data Center

Speakers:
Peter Panfil, VP Global Power, Vertiv
Tony Gaunt, Senior Director, Vertiv Asia
Tom McKinney, Director, Forsythe Data Centers
Stephen Worn, CTO, DCD

If you missed our recent webinar with Vertiv that was packed full of insight, best practices and key takeaways, then don’t hesitate to watch the recording. An essential listen for data center managers globally.

Watch here: http://bit.ly/2oDZvgt

ON DEMAND Is Hybrid IT the new holy grail?

Speakers:
Adam Levine, CCO, Data4
Bruce Taylor, EVP, DCD

On 8 March, Adam Levine from Data4 and DCD explored the decisions needed to ensure your company is hosting your IT assets in the right place.

To make hybrid IT work effectively requires a level of computing capacity, connectivity, efficiency and resilience that most enterprise data centers simply cannot reach without prohibitive investment.

Watch here: http://bit.ly/2n6DVEg

For more information please contact [email protected]

Design + Build

Finding new frontiers

Building a data center is hard. Building a data center in Angola is harder, says Sebastian Moss

Sebastian Moss
Reporter

As companies around the world fight for business opportunities in established emerging markets such as India or China, others are looking further afield. But those wishing to do business in frontier data center markets will face issues including harsh climates, limited infrastructure, and poor levels of security.

“If you do a project in Chad, it's 50°C (122°F), you need to have the right type of guys on site,” says Tomas Rahkonen, CTO of Flexenclosure, a firm which specializes in setting up data centers in tough environments. “You need to have people who are used to solving problems under those conditions.”

Flexenclosure has built its prefabricated modular data centers in countries such as Chad, Angola and the Ivory Coast. The key to pulling this off, he says, is planning, long before the equipment is shipped to its destination. “There's thousands of details in a data center, and you can't fix stuff easily at the site. If you've missed something, forgotten something, done some bad planning, or not brought enough materials, it's difficult to go down the road and get it.”

Sometimes there aren’t even roads, and the modular facilities have to trek across continents to get to their destination.

“With our delivery in Chad, we had four and a half meter wide modules that went through half of Africa,” Rahkonen says, talking about the country’s first data center, a $6m, 374 sq m (4,025 sq ft) installation. Flexenclosure's global marketing director, Nick Arvanitis, adds: “A module on the back of a truck can get through some pretty rough terrain.

“But the challenge of getting somewhere as far away from a seaport as Chad is, isn't just infrastructural, it's also security. Particularly in sub-Saharan Africa, it's another level of challenge that one encounters in developing worlds.”

1,700km: the distance the modules traveled from Douala, Cameroon to N’Djamena, Chad

Rittal also makes modules, and its director of international products, Jason Rylands, agrees: "There can be security concerns. In things I've done in places like Papua New Guinea you need to make it bullet proof just because the local people will take shots at stuff."

[Photo: Flexenclosure's facilities in Chad, Paraguay and Angola]

Transport companies and telcos often turn to security firms to protect their investment, but armed guards can do little against the elements.

"It's not that it's very hot or dry there, or it's terribly humid there," says Flexenclosure's Arvanitis. "In many countries it can actually be extremely hot and dry at times, and then extremely wet with monsoon rains and flooding, and then extremely humid."

For Chad's record heat levels, the company "specifically designed the cooling system to cope with the high daily temperatures," CEO David King says.

By contrast, the Caribbean is particularly humid, "and that can cause condensation," Rahkonen says. "You don't want hot outside air hitting cold air." Because of this, the company prefers "to do penetrations to the modules for cables from below."

He adds: "We usually have the modules on raised plinths, because there's quite commonly flooding risks as well."

And then there are the seismic issues - "we have to construct the place to survive an earthquake," Rahkonen says.

"Also insects," Rylands adds: "Insect infestations, you get some really interesting insects that like to nest in certain things."

Even if the data center is ruggedized, reinforced and ready to go, areas with unstable grids can be a challenge.

"They will need higher capacity UPS batteries, so actually lithium-ion batteries are being adopted in those sites," says Victor Cheng, VP and GM of Power System BG, Delta Electronics. "You have to charge and discharge every day basically, and lithium-ion batteries are okay, if you don't completely discharge them. But lead-acid is not very fit for too many cycles of charge and discharge, so emerging markets turn to lithium."

Rahkonen says it can be that the facility "pretty much runs constantly on diesel because the grid is too bad, like in Liberia."

But before a facility can even begin to literally burn cash away, one has to raise the funds to finance the development.

"You can have entrepreneurs in the countries with a business idea, but perhaps they lack a bit of cash," Rahkonen says. "So there is quite often a need to be good at working with the credit agencies and helping out with financing."

Help is exactly what a lot of companies in these markets may need, as "their capex is denominated in US dollars, and the local currency is going down against the US dollar everywhere, so the cost is actually increasing," Zhilei Zou, president of Huawei's carrier business group, says, referencing the surging dollar.

Despite this temporary difficulty, the market is growing rapidly, and will continue to do so, Wei Peng, VP of CBG at Huawei, believes - and in response, the Chinese networking giant set up a specialized emerging market department at the end of 2016.

But while the market may have its own unique challenges, people everywhere are the same. "As a country develops and more people become middle class," Rittal's Rylands says, "they want phones and Facebook, and it's putting a massive strain on the networks."

Last year, Kenya's 5.3 million users only needed about seven or eight racks, says Rahkonen: "It's quite small compared to developed markets, but the driver is there."

That changes things. "Suddenly it's a Western world spec, because if Microsoft wants to put their stuff in a data center in Africa, they don't want to compromise with their service quality, they want the redundancy, high ceiling heights, and whatever they require as part of their typical colo specification."

He adds: "It's actually these anchor clients that shape the spec of the data centers. That's the market logic."

In that market, telcos are leading, says Rahkonen. They seem to be the most excited, seeing a new high-growth market. And those telecoms companies have had the luxury of watching their Western counterparts fail to capitalize on colocation.

"I think the opportunity is out there for them to take this role and become the colo, because their brand is so strong and it's a technology brand," Rahkonen says.

"The big names like Interxion and Equinix and Digital Realty - they're not known in these markets, but Zain is known, Tigo is known, Millicom is known, and so forth."
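Cheng's point about daily cycling can be put in rough numbers. The sketch below uses assumed, round cycle-life figures for illustration only (they are not vendor specifications, and real cycle life depends heavily on depth of discharge); the function name is ours:

```python
# Illustrative sketch of why daily cycling on a weak grid favors lithium-ion:
# the battery bank cycles roughly once a day, and chemistries differ sharply
# in how many charge/discharge cycles they tolerate. Cycle-life figures are
# assumed round numbers for comparison, not vendor data.

def service_life_years(cycle_life: int, cycles_per_day: float = 1.0) -> float:
    """Years until a battery reaches its rated cycle count."""
    return cycle_life / (cycles_per_day * 365)

LEAD_ACID_CYCLES = 500     # deep-cycled lead-acid wears out quickly
LITHIUM_ION_CYCLES = 3000  # Li-ion tolerates many more daily cycles

lead_acid_years = service_life_years(LEAD_ACID_CYCLES)    # roughly 1.4 years
lithium_years = service_life_years(LITHIUM_ION_CYCLES)    # roughly 8.2 years
```

On these assumptions, a lead-acid bank cycled daily would need replacing several times over the life of the facility, which is the economics driving the switch Cheng describes.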



Top 10 beautiful data centers

Data centers should look good - and this year, DCD Awards will introduce a new Design category, which honors facilities that enhance their environment. Here are ten of the best-looking data centers we know about. If you know other sites that match these, tell us! [email protected]

1. Peak experience - The Switch Pyramid, Michigan
All of Switch's data centers have a trademark exterior and interior style based on patented power and cooling systems. The Pyramid site near Grand Rapids also includes an adaptive reuse of an iconic building. The seven-story steel-and-glass structure was originally created by Steelcase as a design center. Switch has opened a 225,000 sq ft (21,000 sq m) data center built into the lower two floors. With other buildings alongside, Switch will create a campus with up to 320MW of renewable energy, 1.8 million sq ft (170,000 sq m) of data center space, and 435,000 sq ft (40,000 sq m) of disaster recovery office space.

2. Villain's lair - Bahnhof Pionen
Bahnhof's data center in Stockholm pioneered the creative data center movement when it opened in 2008 in a former nuclear bunker. Pionen's design is consciously based on a James Bond bad guy's crib, and makes several references to the 1970s movie Silent Running. Built by Jon Karlung and briefly the home of Wikileaks, it has backup power from diesel engines designed for submarines, and features waterfalls, a salt-water fishtank, and plants growing under simulated daylight.




3. Building altar-ations - Barcelona Supercomputing Center
Opened back in 2005, the Barcelona Supercomputing Center is built in a former 19th century church. The Torre Girona was rebuilt after the Spanish Civil War and is now part of the campus of the Polytechnic University of Catalonia. Now it holds the MareNostrum supercomputer, a joint venture built by IBM and the Spanish government. For a time it was one of the world's fastest machines. It may not hold that claim anymore, but it's still one of the best looking.

>Awards

Beauty is subjective, and these architectural gems might be eyesores to you. Do you like rigid functionality, decorative embellishment, or clever repurposing? As part of our campaign for better looking facilities, we are calling on the DCD Community to decide which sites look the finest. There will be a public vote in this year's DCD Awards, for our first ever prize for data center design.


4. Google's Data Center Mural - Google, Oklahoma
In Google's Data Center Mural project, launched in 2016, murals representing data center activity are painted on the facilities' outside walls. In Oklahoma, artist Jenny Odell found man-made features in Google Maps' satellite images, gathering views of swimming pools, circular farms, water treatment plants and salt ponds. Working from giant cradles, 15 painters used 400 colors, transferring the images to the wall using a chalk tracing technique similar to that used by Michelangelo in painting the ceiling of the Sistine Chapel in Rome.

5. Storage chest of data - Naver
Naver, Korea's leading web portal, has a data center at the foot of Mount Gubong in Chuncheon, Gangwon Province, which stores its customers' online content. It's called Gak, after Kyujanggak, the royal library of the 18th century Joseon Dynasty, where Buddhist documents are stored on wood blocks. The building incorporates traditional design elements, as well as cutting-edge environmental techniques such as reuse and recycling of rainwater.


6. Under the mountain - Green Mountain, Norway
A retired NATO ammunition store at Stavanger was reimagined as a 146,000 sq ft (13,600 sq m) data center by Green Mountain. The Tier III underground site is inside the mountain, leaving the landscape unspoilt. The site has abundant green energy from two hydroelectric sources. Year-round cold water at 8°C (46°F) from the fjord below cools the servers, using a duplicated circulating system for reliability. The site also has a strong aesthetic: inside, tunnels are carved from solid rock, and outside is a green and peaceful mountainscape above the fjord.

7. Factory reconditioned - NGD, Wales
NGD's Newport data center opened in 2010 in a former LG semiconductor plant, a property that had stood vacant for more than a decade. The data center space has been expanded repeatedly within the building. All the power needs are supplied by the nearby Dinorwig hydroelectric facility.


"Data centers have a job to do, but there's no reason they shouldn't look good as well. Tell us your favorite data center so our Awards can recognize the world's best facilities!"

George Rockett, DCD CEO and co-founder

[email protected]

8. Towering ambition - Amsterdam Data Tower
Digital Realty took over Telehouse's AMS 1 shortly after it was completed in 2016, opening it as the Amsterdam Data Tower. The 72m-tall building has 5,000 sq m (54,000 sq ft) of data space on 13 floors, and 9MW of power. The building uses outside air and groundwater for cooling, and also stores warm water underground. It was designed by Rosbach Architects of the Netherlands.

9. A view from the gallery - Salem Chapel, Leeds
AQL built its headquarters in Salem Chapel, the only surviving 18th century dissenting chapel in Leeds, UK. Opened in 1791, with seating for 1,000 people, the chapel closed after more than 200 years in 2001. AQL turned the ground floor of the chapel into colocation data center space with a glass roof. Above that, the chapel's balcony was refurbished as a conference auditorium. The British government launched its "Northern Powerhouse" program there in 2016, but for most people it is better known as the place where the Leeds United football team was founded.

10. Grass on the roof - LuxConnect, Bettembourg
Luxembourg is a popular location for reliable data centers. LuxConnect's DC1.3 has multiple Uptime certificates, giving users the option of Tier II, Tier III or Tier IV reliability. The 59,000 sq ft (5,500 sq m) building uses free cooling, and has access to all-renewable power. It has steel mesh walls which double as a Faraday cage for security, and a grassed roof to reduce environmental impact.


Advertorial: MDEC

The changing shape of data center markets

Data center hubs need to develop flexibility and scalability in order to remain an attractive investment proposition

Much debate has focused on 'hub' markets as major cloud, colocation and managed service players seek out new sites across the world from which to access and service customers. Hub markets have been characterized by a level of data center asset development and of investment disproportionately large in comparison to the IT requirements of the local economy. They present an economy which supports and draws from IT, and which enables them to attract regional head offices. They offer very high levels of skills and connectivity, and have an active policy of supporting and attracting foreign investment. The shift in economic weight from the established towards 'emerging' markets has positioned hub cities such as Singapore, Hong Kong and Dubai as portals into South East Asia, China and the Middle East respectively. A hub market is not a data center park or campus (although these may form part of the hub), since these are local development sites, not complete economic entities in their own right.

Changes in the global configuration and role of data centers have altered the profile of the hub. The global cloud providers with their hyperscale facilities are able to establish hubs well away from densely populated areas in order to benefit from available space, sustainable sources of energy, connectivity and incentives for investment. The growth of very large, power-hungry hyperscale data centers to support the growth in demand from colocation, cloud and other data center services has created significant growth in markets such as Ireland, the Nordic countries and US states in and adjacent to the Rockies. Thus the location of the hyperscale data center has moved from the urban to the outer city and then to the wilderness, and this trend will continue as these mega-facilities increase in number.

[Chart: The Growth of Mega-Facilities - Estimated Global Capacity (GW) by Size]

This change has in part been enabled by the increased number and reach of fiber connections. Interconnectivity - access to multiple network providers, to the multi-cloud, to dark fiber, cross-connects and meet-me rooms etc. - is now enabling hubs well beyond urban areas.

Yet the role of the traditional hub market will not disappear. Hub markets are urban in order to provide the necessary economic momentum to support a data center sector, and it is no accident that hub cities tend also to be economic hubs at the intersection of trade routes. Local, urban data centers will be required to house highly latency sensitive applications, such as synchronous replication, and also where a particular location is required for compliance with data residency regulation.

However, hub cities tend now to be following the patterns established by other urban centers. The growth of data centers in London has been outside the M25, along the motorways connecting to the rest of the UK, while the asset base in New York has shifted from the five boroughs to less densely urbanized areas of the Tri-State. There are various reasons for this: the availability and price of suitable space; sites that are easier to secure and protect outside the city; legislation that may restrict central city data center development (such as storage of diesel, for example); greater competition for key resources (power, water, IT skills); higher air pollution; older buildings and infrastructure; and, increasingly, the fact that housing a full data center (as opposed to a micro or edge processing unit) in a central city location is unnecessary.

Hub cities have therefore developed hinterland zones, and not necessarily in the same legislature. For Hong Kong, this is the Pearl River Delta in Mainland China, while major developments in Benelux have moved outside the Amsterdam urban area. Singapore is supported by the Iskandar development zone in Johor, Malaysia, immediately to its north. It is likely that in the future hub cities will need to develop or have access to real estate in which to expand in order to maintain their premium IT position. It is obviously important for the hinterland to offer a similar quality of design and operation to the hub, to offer suitable connection into the hub, and a legislative environment consistent with it, in order that the two sites can offer suitable levels of integration and latency for the transfer of data, for back up, for service activation, for network security and for efficient portfolio management.

In the longer term, the role of the hub city and the remote hyperscale data centers will be reinforced as data processing moves towards the edge of the network. The 'edge' will tend towards the greater number of people and connected devices in urban areas, while the core cloud facility will remain away from the data sources. The proximity of the core facility will not be an issue, since one of the key drivers of edge computing is to reduce latency by processing and acting on data locally and immediately. As network speeds evolve with the development of technologies such as silicon photonics, the requirement for urban hubs to be supported by connected hinterland zones will increase also.

Contact Details
Phone: +603 8314 1854
Mobile: +6019 665 6588
Email: [email protected]

Servers + Storage

We made some servers - think you can do better? Let us know @DCDnews

Re-shaping servers

They looked solid and unchanging, but now servers are being remolded by the creative minds of the industry, says Dan Robinson

Dan Robinson, Freelance

Servers are the linchpin of the modern data center, but the server market is in flux at the moment, as trends such as cloud computing, social media and analytics are changing the demands on compute power, and there is a growing need for greater energy efficiency and flexibility.

The past year has seen something of a slowdown in server shipments worldwide, with recent data from IDC indicating a decrease of 3.5 percent to 2.55 million units in the fourth quarter of 2016, while Gartner said that worldwide server shipments grew by just 0.1 percent during the entire year.

This has been attributed to reasons such as economic uncertainty and some customers re-evaluating their hardware provisioning criteria. Another factor is the uptake of public cloud services, with some organizations moving service provisioning to the cloud instead of running it on their own systems, leading to them requiring fewer servers, while the large service providers operating those clouds need more.

"The dynamic of hyperscale being the segment of customers that are doing more purchasing and building more capacity is true to an extent," said Adrian O'Connell, research director for data center infrastructure and management at Gartner. However, he added that enterprises currently still represent the largest part

of the server market, and the move does not indicate a mass migration to public cloud, more a gradual drift.

"It's more a case of choosing which workloads might be going into the cloud and looking at how to introduce more efficiency in terms of your on-premise infrastructure. So it's a steady shift into the cloud rather than a sudden swing from one to the other," he said.

These hyperscale customers tend to be running somewhat different hardware to other customers. The stereotype image of a data center crammed with row upon row of racks stuffed with standardized "pizza box" rack-mounted systems is not far from the truth, but the differences are on the inside.

"The hyperscale buyers like AWS, Azure, Google and Facebook, they tend to buy these more commodity-oriented systems, that have the extraneous features that you typically get in an enterprise server stripped out of them," said O'Connell.

This means discarding any frills such as embedded lifecycle controllers to deliver a bare bones system that is standardized and keeps costs to a minimum. However, this does not mean these are second rate, as they are typically stuffed with large amounts of memory and storage, and the latest processors.

Enterprise customers, in contrast, tend to operate a more mixed environment, and thus have a more varied estate of servers comprising rack and blade systems, and even some tower chassis servers.

Blade servers can trace their roots back to early attempts to pack more compute power into a given space. Typically, this is accomplished by tightly packing multiple server modules or 'blades' into an enclosure that provides the power, cooling and connections to the rest of the infrastructure. Each enclosure then mounts into a data center rack. However, while blade systems account for about 20 percent of overall server spending according to Gartner, there is little or no standardization in their design. This means that enclosures produced by one vendor cannot be fitted with server blades from another vendor, leading to a risk of lock-in for the customer.

Meanwhile, another format that has been growing in acceptance over the past couple of years is density-optimized servers, which combine some of the features of the blade and rack approaches. These typically provide multiple server nodes in a 2U or 4U rack-mounted chassis, often with the flexibility to mix and match different modules to meet customer requirements. For example, Dell's PowerEdge FX offers customers a range of compute and storage modules, including two-socket and four-socket server nodes, plus one that comprises four separate nodes in a single module for high density configurations.

However, it is the rack-optimized format that still accounts for the largest share of the server market, typically around half of all systems sold. "The 1U, 2U, 4U rack-optimized systems, they are still a pretty big chunk of the market, so a lot of enterprises, and a lot of the more traditional service providers, the tier 2 and 3 service providers, are still buying quite a lot of rack-optimized systems," O'Connell said.

Open rack designs are like blade servers done better: shared power and cooling, with no lock-in

Hyperconverged infrastructure (HCI) systems are a small but growing part of the overall server market. These are appliance-like systems that use internal direct-attached storage in a large cluster of nodes to create a shared pool, and are designed to scale by simply adding more nodes.

Hyperscale customers, however, are able to use their bulk purchasing power to specify custom designs that meet their exact requirements. The Open Compute Project (OCP) started by Facebook, for example, provides a set of specifications that large customers can take to an original design manufacturer (ODM) for manufacture. A similar scheme, Open19, has been started by LinkedIn, aimed at meeting the needs of smaller operators and enterprises, while Microsoft has developed its own specifications that are also available via the OCP.

Ironically, some of the specifications produced by the OCP have been likened to blade servers but done better, because they stipulate shared power and cooling for all the systems in a single rack, while the specifications are not tied to one vendor. However, the vendors offering these often do not provide the same kind of pre-sales and post-sales support that customers expect of a traditional server manufacturer, according to O'Connell. Again, this may not be an issue for those hyperscale operators.

Finally, one of the surprise winners in the current server market is Huawei, which saw a 64 percent increase in shipments for the fourth quarter of 2016 compared with the same period a year earlier, according to Gartner's figures. This saw it overtake Lenovo to become the third largest server vendor worldwide, thanks to aggressive pricing, but also its existing sales channel partnerships.

"Huawei has the advantage of its networking business and the routes to market it has established with that, and the reputation it has thanks to those, and proven capabilities around support and after-sales service," said O'Connell.

ARM and Power: taking on Intel-based servers

Data centers are currently dominated by x86 systems, but this architecture has been seeing renewed challengers of late, from chips based on IBM's Power and the ARM architecture.

The formation of the OpenPower Foundation opened up IBM's architecture to enable partner vendors such as Tyan, Supermicro and Wistron to build and sell their own systems based on Power processors. These are claimed to be able to deliver higher performance for demanding applications than Intel-based alternatives, as well as offering more performance at a given price point, according to IBM.

Meanwhile, Microsoft surprised everyone by showing off Windows-based ARM servers at the OCP Summit in March. These are for Microsoft's internal use only, but demonstrate that it is feasible to run Microsoft's cloud services on ARM systems.

Gartner pours cold water on the notion of any upset happening soon, however. "When it comes to ARM or OpenPower, our position for a good few years has been one of there being a lot of potential, but it being no more than potential right now," said O'Connell.

"Is the price, performance, or power issue sufficiently compelling? It isn't good enough to just have parity with [x86], there has to be a significant reason for users to move their infrastructure away to this alternative thing, and we're not convinced there is this compelling benefit yet," he added.


Dreaming of Net Zero energy

Peter Judge, Global Editor

Can data centers cut their reliance on the grid, or even become net contributors? Peter Judge heard the arguments

Data centers have been cutting their energy for some time. Can they go off-grid and become independent - or even net contributors to the grid? Last November, DCD Zettastructure hosted a debate - and here are the highlights.

The idea isn't new: "A long time ago I surveyed 100 data centers for Enron, with a view to making them contribute power back to the grid using their generators," said energy expert Professor Ian Bitterlin. "It turned out not to be practical then, because the generators weren't rated for that job, and the company didn't want to use its generators to help the grid - when the generators were there to protect them from the grid."

That point of view is still widespread, but now the idea has been proven. The generators at Microsoft's Cheyenne site are run by the local utility and serve both Microsoft and the surrounding community.

The need is increasing. "We're switching off coal, which is 30 percent of our supply," said Emma Fryer, of trade body TechUK.

"The days of the passive consumer are coming to an end," said Russell Park, of the Distributed Energy team at Centrica (the parent of British Gas). The Centrica team is "dedicated to developing demand side response and distributed energy," he said. "We've diverted some £780 million ($973m) from generation into distributed energy."

Data centers are key to this, because they have high-grade generators ready to use, which could significantly help the grid. However, the facilities are also critical, so they will not do anything which poses significant risk to their services.

"We spend between $30 million and $40m on electricity in EMEA every year," said Doug Balchin, director of critical infrastructure and utilities at network firm Level 3. "At the moment we have about 60MW of backup power systems."

Utilities in various countries offer demand reduction schemes, whereby a facility can elect to switch over to its backup system at times when the grid is likely to be heavily loaded. The benefit is that extra capacity is not needed, and wasteful fossil base load generation is not required.

To take part in these programs may require an investment in more suitable generators, said Russell: "This is also an opportunity to invest in those assets and pay for that investment through a return on either revenues or cost savings, if a generator is coming towards the end of its natural life." The scheme can have a payback of three years or less, he said.

Running generators with a load occasionally is also good for them, so the schemes can generate other benefits for participants. "Running them on load is good for them, running off load is bad for them," said Bitterlin.

However, there are risks: "If you're running on the grid you're running through a transformer back to the grid," Bitterlin said. "If you have a short circuit event you may lose your data center."

Russell agrees: "There are very, very few investments without risk, and I'm not suggesting for a moment that we risk the operational integrity of anybody's data center. The due diligence that we would undertake would ensure that the work we're doing would not impact on a backup generator. That's priority one." He says the data center operator keeps control over the generators. "The customer gets paid just for having it available," he said. "If we do use it we can operate it from the virtual power plant which we have in our headquarters in Windsor."

Using the on-site generator can allow companies to take part in the utility's Triad scheme, which affects pricing based on the demand in the three busiest half hours of the year, said Balchin. "Our colleagues are already operating the scheme in North America, so it's not a brand-new concept for Level 3. We're expecting to run our engines no more than 70 hours a year."

The actual value of such schemes needs close scrutiny, said Prof Bitterlin: "The data center never runs at full load, so unloading the data center from the grid to reduce the grid's load only frees up a proportion of the facility's capacity. You may have 10MW of generation on site, but if your load is only 4MW then taking yourself off the grid is only giving the grid 4MW of relief, whereas running it in parallel with the grid and back-feeding onto the grid gives the grid 10MW."

A lot of this has been a phony war so far. Some data centers in London have belonged to similar schemes for years, and never been called on to donate their power. "They've had rewards and never had to take the risk, because the global rumors of the grid's death have been exaggerated for quite some time," said Prof Bitterlin.

"A search engine might take the risk of demand reduction; for a bank the reward is negligible" - Prof. Ian Bitterlin

In the end, each possible participant has to assess the risks and rewards, said Bitterlin: "I can imagine a search engine thinking the risk-reward was well worth paying for because they didn't even care about the 20-minute outage, whereas a bank would think the reward was negligible in relation to the risk."
40 DCD magazine • datacenterdynamics.com Power + Cooling

> Energy Smart

June 19 2017 San Francisco

Meeting the digital infrastructure energy challenge We heard from:

Doug Balchin, Director, critical infrastructure and energy, Level 3
Ian Bitterlin, Consulting engineer & visiting professor, Critical Facilities Consulting
Andrew Donoghue, European research manager, 451 Research
Emma Fryer, Associate Director, techUK
Russell Park, Head of energy & sustainability solutions, British Gas

Demand for power is growing while our impact on the world becomes ever more clear. This leadership summit will get to the heart of the most crucial issue for data centers and digital infrastructure. DCD>EnergySmart brings together the supply and demand side, in a summit where grid power executives and data center leaders meet under one roof, to understand the radical changes to our infrastructure. Before the DCD>Webscale event opens in San Francisco, the EnergySmart summit will hear and shape a debate which extends from the silicon chip to the hydro power plant. Disaggregated data centers will feed from adaptive energy grids. Both will be re-invented to meet the needs of the edge, and both will exploit and deliver the Internet of Things revolution. DCD>EnergySmart is the first in a series of summits around the world. www.dcd.events

Regulations may have to catch up with the scheme, however, said Fryer: "The minute you move to elective generation, you start falling foul of air quality control legislation." Running diesels will be seen to be a bad thing, even if they are feeding the grid and avoiding other emissions elsewhere. Using them for disaster recovery is allowed, but feeding the grid is seen to be a matter of choice: "I haven't won that argument with [the regulators] yet."

The argument isn't a done deal either, because small generators on data center sites are arguably a lot less efficient than the plant in the centralized stations they displace, said Bitterlin: "When a diesel engine burns fuel oil, it's 33 percent thermally efficient. When you burn gas in a combined cycle turbine it's nearly 60 percent efficient."

We can still hope for a technology solution, according to analyst Andrew Donoghue of 451 Research: "Battery storage is now becoming far more viable. We've done a lot of evaluations and it's now a very compelling argument to attach additional batteries to balance the grid." Although batteries are still expensive compared with the amount they can store, they could level some peaks of demand, and even a small difference there could reduce the generation capacity required.

"Storage is the Holy Grail," said Balchin. "Solar power is fantastic during the nice sunny day, but not in November cold evenings."

Another end-game might be small edge facilities, said Bitterlin: "It's easy to foresee a time when data centers don't take any energy at all. It's simply 'follow the edge.' Use 100kW data centers, never any bigger, and embed them in hotels, hospitals, and office complexes around the city. Take all the waste heat direct from the chips and use it. A hotel can absorb 100-150kW of hot water on a 24 hour basis. The energy goes in, the hotel gets free heat, and the mini edge computer center runs at zero cost."

It has to be edge, because heat can't be transported, he said: "The grade of heat is so poor that you can't export it further than a hundred yards."

The answer will involve a combination of all these ideas. "There's a palette of technologies which are available to all of us, and every situation requires a different hue," said Russell. "I think there is an uncomfortable marriage between government, the National Grid and the suppliers - if you can have a marriage for three people. But we know we have got to come up with a solution."

The Big Discussion on data center energy took place at DCD Zettastructure in London, Nov 2016.
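The peak-leveling effect the panel describes is easy to sketch in a few lines. The toy model below is not drawn from any speaker's figures; the demand curve, battery size and discharge rate are invented for illustration.

```python
# Toy illustration of battery peak-shaving: a battery charges during low
# demand and discharges at the peak, so the grid (or on-site generation)
# only has to cover a flattened load curve. All numbers are invented.

def shave_peaks(demand_kw, battery_kwh, max_discharge_kw):
    """Return the grid load per hour after the battery covers peaks."""
    threshold = sum(demand_kw) / len(demand_kw)  # aim at the average load
    stored = battery_kwh
    grid_load = []
    for load in demand_kw:
        if load > threshold and stored > 0:
            # peak hour: the battery supplies the excess, up to its limits
            discharge = min(load - threshold, max_discharge_kw, stored)
            stored -= discharge
            grid_load.append(load - discharge)
        else:
            # off-peak hour: recharge the battery from spare grid capacity
            charge = min(threshold - load, battery_kwh - stored) if load < threshold else 0
            stored += charge
            grid_load.append(load + charge)
    return grid_load

hourly_demand = [60, 55, 50, 70, 95, 120, 110, 80]  # kW over eight hours
flattened = shave_peaks(hourly_demand, battery_kwh=40, max_discharge_kw=30)
# the peak that generation capacity must cover shrinks from 120kW to 110kW
```

Even this crude model shows Donoghue's point: a modest battery trims the peak, and it is the peak, not the average, that sets the generation capacity required.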

> Webscale | San Francisco

EVENT PREVIEW

Webscale and EnergySmart: the digital factory of the future

June 19-20 2017 // Marriott Marquis, San Francisco

In the zettabyte-scale era, the data center is the new engine of the economy and electric power becomes the new oil

Hyperscale digital infrastructure has got a new name - business scale - to best describe the ubiquity with which webscale technology architectures touch nearly every aspect of modern economic life. Digital infrastructure transformation comes with a transformation of electrical energy - increasingly clean, carbon-neutral, smart, and automated. These two realms of infrastructure systems are profoundly intertwined.

The two-day DCD>Webscale Conference and Expo, June 19-20, at the San Francisco Marriott Marquis has two central ideas: webscale cloud enterprises like Google, Microsoft, Amazon, HPE, IBM, eBay, Facebook, Salesforce, LinkedIn, Oracle and others will drive the new economy; and smart, digital "Energy 4.0" will make it possible.

"Every business today is somewhere on the webscale transformation journey," says conference chair Bruce Taylor. "Some are already truly digital-first, Internet-facing, network-edge and cloud-native. They are increasingly software-defined and data driven; incorporating AI and machine learning into both their external business development and their internal, full-stack infrastructure management.

"This conference is all about where this class of enterprises is headed, and about what others at an earlier stage in the digital transformation of their businesses can learn from the leaders."

Transforming the world of making stuff
Day 2 - the Webscale Summit on Tuesday, June 20 - kicks off with a bang. Autodesk is iconic in architecture, engineering and construction, from data centers to giant airplanes and just about anything else, and today the software giant serves just about everyone who makes stuff, bringing AI and machine learning in the cloud to its global customers. There likely isn't a layer of the entire digital infrastructure cake that Autodesk's advanced tools and generative design data don't touch, transforming the design/build experience with AR/VR immersive visualization of generative 3D building design/build computation.

Lloyd Taylor is Autodesk's senior executive responsible for the company's cloud infrastructure, which delivers the "Future of Making Things," moving data from design through to implementation. "It's the application of industrial IoT to products or building operations or through to the actual manufactured product, such as the iPhone," says the computer scientist. "We have the ability to run full product and building ops simulations through AI/ML techniques beginning at the design phase."

Following Lloyd's keynote, data center journalist Rich Miller, founding editor of Data Center Frontier, leads a panel of AI and machine learning experts on how this high form of software-defined smart technology is being applied to transform the modern hyperscale cloud data center beast.

Energy 4.0 - Clean, smart, secure, always-on
Day One, Monday, June 19 is dedicated to the digital infrastructure ecosystem of electrical energy - the "oil of Industry 4.0" - without which the digitization of the global economy is impossible. The electrical energy industry is undergoing the same scale and complexity of smart, digital transformation as is the data center itself.

DCD>EnergySmart is convening a full-afternoon collaborative leadership conversation, bringing together the primary energy and power stakeholders for the new economy. "For the first time ever, we're asking the people who carry the weight of responsibility for the future," says George Rockett, DCD's founding CEO. "We want to gather a broad representation of the executive and technical professional leadership from the supply side and the demand side of the new industrial economy, along with policymakers and regulators to explore common smart technology interests."

What do data center power engineering and management professionals need to know, and what can they offer to the electrical energy community? The roundtable topics will cover power utility generation, smart energy network distribution, no-carbon renewables, regional and national smart grid transmission, microgrids, co-generation, demand response, storage and battery advancement, through the facility meter to the power path within the data center.

"For the purposes of this conversation, we're dividing the electrical energy universe into two segments," says Taylor. "North of the meter is everything on the public and private electric utility generation/transmission end; and south of the meter includes everything in the downstream power path within the data center." It's in this full path, he says, that smart power data technologies, such as the industrial IoT, big data and predictive analytics, are now being applied to improve power reliability, security and efficiency. "Cloud services giants like Google and Microsoft are like big petri dishes for what's possible in wringing carbon out of the power grid in the time of climate change," he says.

Helping to bring together the knowledge resources for this DCD>EnergySmart leadership roundtable series are such organizations as the Electric Power Research Institute (EPRI), The Green Grid and the Business for Social Responsibility (BSR) sustainability organization, among others. EnergySmart is a limited-seating invitational executive roundtable series. Apply now to join it.

IoT Talent Consortium and the Infrastructure Masons
Two other influential groups will co-produce workshops on June 19. In the afternoon, the IoT Talent Consortium - an industry group including Cisco, General Electric, Global Knowledge, Microsoft, Rockwell Automation, Disney, MIT Sloan School of Management, Pearson Workforce Readiness, New York Academy of Sciences, the State of Illinois, and IQNavigator - brings its sole focus on discovering and developing the talent that the IoT industry will require in the future.

Faculty for this unique IoT career workshop includes MIT Sloan School Associate Dean Peter Hirst; Rockwell Automation's Connected Enterprise development director Elizabeth Parkinson; Gordon Feller, director, Cisco HQ; and Patty Burke, director of innovation and leadership solutions, Center for Creative Leadership. Registration for IoT Talent also gains all tutorial participants access to all of the DCD>Webscale conference on June 20.

Infrastructure Masons, founded by Uber Compute's Dean Nelson, will meet with Silicon Valley data center executives for brunch to consider big future challenges to the digital infrastructure industry.

University of Southern California's Dr. Julie Albright, who stunned the opening keynote audience for DCD>Enterprise in New York in March with her revelations about the impact of social and mobile media on society and culture, will lead a number of conversations over the course of both days. "That's one of the most profound talks I've heard in a very long time," said Christian Belady, who leads Microsoft's global infrastructure and strategy, after Albright's New York keynote.

IBM, Splunk, CBRE, Schneider Electric, Intel, LinkedIn, Microsoft, Spotinst, Rittal, Google, NetApp SolidFire, TicketMaster/LiveNation, Switch - these are just some of the brand names across the full spectrum of the digital infrastructure ecosystem bringing their A-game expertise to the DCD>Webscale conference and expo, to share their knowledge of how the hyperscale cloud factory will evolve toward the future of the global economy.

www.dcd.events

> Community

Events
> Middle East | Dubai
May 9 2017
Upgrading the region's critical data infrastructure for IoT, cloud and smart cities

Training
Data Center Cooling Professional: London, UK, April 19-21 2017
Data Center Power Professional: London, UK, April 24-26 2017
Data Center Design Awareness: London, UK, April 24-26 2017

"The dilemma for all IT organizations is not whether to change the way they do things, but which of the new paths to take. [This] conference program embraces the need for a much more holistic discussion about the future of data center and cloud infrastructure that I am sure will help people make better informed decisions."
Kfir Godrich | BlackRock

Training
Data Center Design Awareness: Lima, Peru, April 17-19 2017, TBC

Training
Data Center Cooling Professional: Madrid, Spain, April 24-26 2017, TBC

Events
> Webscale | San Francisco
June 19-20 2017
The global summit for webscale infrastructure builders

Training
Data Center Design Awareness: New York, USA, April 24-26 2017, TBC

Events
> Argentina | Buenos Aires
April 25 2017
The congress on digital infrastructure and cloud

> Colombia | Bogotá
June 14 2017

Training
Data Center Design Awareness: Brasilia, Brazil, April 10-12 2017, TBC

"Every year we can see that more senior people are attending this event. The way the program is structured and the fact that it covers so many different topics at the same time makes the event even greater."
Sinem Cantürk | KPMG

DCD Calendar

Simply the best data center conference in the Middle East. Dick van Bladel | IBM

Events > Indonesia | Jakarta April 6 2017 IT transformation and hybrid architecture

> Focus On | Hyderabad April 27 2017 The future of India’s digital infrastructure

> Malaysia | Johor Bahru May 18 2017 Data center and cloud for the new digital economy

Training Data Center Design Awareness: Jakarta, Indonesia April 17-19 2017, TBC

Energy Efficiency Best Practice: Jakarta, Indonesia April 20-21 2017, TBC

Data Center Design Awareness: Bangkok, Thailand April 24-25 2017, TBC

Critical Operations Professional: Bangkok, Thailand April 26-28 2017, TBC

It was a great event in 2016 in a venue that creates much more intimacy and even better customer interactions. The content was fantastic with some great speakers from blue chip organizations. All in all, a great learning and thought leadership event. Matthew Baynes | Schneider Electric

Viewpoint

Get these humans out of my data center!

Humans were never meant to work in a data center environment. They are soft, fragile and susceptible to electric shock. They forget best practices, make mistakes and invite security risks. They are driven by emotions like anger, greed and a desire for revenge. And yet we still design data centers around people, not around technology.

If we could eliminate humans from our facilities, infrastructure would become more reliable and more efficient. We could raise operating temperatures from the recommended range of around 20°C to 25°C (68-77°F) all the way up to 45°C (113°F), resulting in considerable savings on cooling costs. Several vendors have previously confirmed that their hardware can take this kind of harsh treatment. We wouldn't have to care about air quality or humidity, or even keep the lights on.

Most importantly, we would eliminate human error - one of the main reasons for unplanned downtime. According to research by the Ponemon Institute, 22 percent of all data center outages in 2016 were caused by people, not hardware.

There are plenty of organizations that are taking important steps towards automating their infrastructure. For example, Microsoft Research continues its work on Project Natick, sinking autonomous data centers to the bottom of the ocean. AOL claims to have operated a 'lights out' data center called ATC since 2011, but details about the facility are few and far between.

Both IBM and EMC previously equipped iRobot's Roomba - the autonomous vacuum cleaner - with temperature sensors in order to build heat maps of their server farms, representing one of the early uses of actual robots in a data center.

Sony's Everspan Library System is another example of data center robotics: this modular appliance holds thousands of high capacity Archival Discs that are managed by an automatic arm, like a futuristic jukebox. Meanwhile, American startup Wave2Wave has developed Rome, a series of switches that enable physical fiber connections to be made automatically.

Software automation driven by tools like Chef and Puppet, and trends like SDN, are also doing their part to help minimize the need to interact with actual hardware.

And there's no reason why we couldn't completely eliminate people from the equation: after all, a data center is nothing but a warehouse full of servers, and warehouse automation is something that is being actively investigated by several retail giants, with Amazon emerging as one of the leaders in this field. We will always need people to look after the infrastructure, but soon, a time will come when we no longer need to share the same space.

"Look at you, hacker: a pathetic creature of meat and bone, panting and sweating as you run through my corridors. How can you challenge a perfect, immortal machine?"
SHODAN, System Shock
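The appeal of tools like Chef and Puppet is the desired-state model: an operator declares what a server should look like, and the tool repeatedly converges reality toward that declaration. The sketch below is a toy illustration of that idea; the service names and states are invented, and this is not either tool's real API.

```python
# Toy sketch of desired-state configuration management: declare the
# target state, then converge the actual state toward it. Running the
# convergence again when nothing has drifted does nothing (idempotence).
# Service names and states are invented for illustration.

desired = {"ntp": "running", "telnet": "absent", "sshd": "running"}

def converge(actual, desired):
    """Apply the changes needed to match the declaration; return them."""
    actions = []
    for service, target in desired.items():
        current = actual.get(service, "absent")
        if current != target:
            actions.append((service, current, target))
            actual[service] = target  # apply the change
    return actions

server = {"ntp": "stopped", "telnet": "running"}
first_run = converge(server, desired)   # three drifted services fixed
second_run = converge(server, desired)  # already converged: no actions
```

Because the declaration, not a human at a console, is the source of truth, the same convergence loop can run unattended on thousands of machines - which is exactly why such tools shrink the need for hands-on hardware time.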

Max Smolaks News Editor


> Software-Defined | Supplement


INSIDE

Automatic for the data center operators
The software-defined revolution has one main aim: to make services that are automatic

Beyond software-defined, to business defined
Let's move up the stack, and allow business logic to interface with automatic services

Peace, love and software-defined networking
SDN is changing the role of switches, along with the companies that make them

A Special Supplement to DCD April/May 2017

Coming Soon! The Path to the Cloud and the Role of the Network. For more information visit datacenterdynamics.com/webinars

Contents
Features
4 Automatic for the people
10 Let's aim for business defined!
12 Peace, love and SDN
Opinion
14 SDDC has to get critical

A question of definition

Has the software-defined data center (SDDC) arrived yet? I'm afraid that depends on what you mean by an SDDC. Like so many other concepts in this industry, the idea of a software-defined data center can mean several different things - and some of those things aren't really here as of yet.

Deep down, it's all about automation. IT services have been made simpler in the cloud, but SDDC promises to automate their delivery, so they can be provisioned and maintained without human intervention.

To this end, we need software-defined storage, software-defined networking services, and software-defined compute. All these should be available as pools of resources, to be deployed with a few clicks. We're making progress towards that (p4). Platforms that offer these pools are emerging from companies such as HPE, as well as from the collaborative efforts going into the OpenStack project. The open source options may be less mature and more complex, but the proprietary routes may, as always, involve possible lock-in.

Networks have been the test case in many ways. Virtualization had taken hold in the server space, and storage was acquiring levels of abstraction that effectively turned it into a pool of capacity. Networks seemed to be stuck with proprietary systems and unnecessary hardware, till software-defined networking (SDN) separated the functions from the underlying hardware, created the "software-defined" moniker, and now leads the way (p12).

Further up the stack, we come to the levels where users encounter the SDDC (p10). Here, we won't argue about the merits of various packages and protocols. The question will be whether a given SDDC approach is practical. Does it allow a service provider to meet customer needs better, perhaps by offering new services? This, far more than the technical aspects of the solution, will be the arena where SDDC makes its mark, although we suspect users may be unaware of the impact of the technology: the concept and the abbreviation of SDDC is far too nerdy to ever be a consumer brand - even for the kind of consumers we meet in this market.

Physical infrastructure can be overlooked in all this, but a data center is not software-defined until all its resources can be managed and controlled by software. DCD's Bruce Taylor (p14) says that the software-defined data center is little more than a sham if it only manages the IT stack, north of the rack. It needs to extend its power south into the mechanical and electrical infrastructure if it is to deliver its promises.

Get that part right, and the whole facility is defined by software and delivered as an easily consumed service. As Bruce suggests, this is when software finally eats the data center.

Peter Judge
DCD Global Editor


Automatic for the data center operators

Everything is going software-defined, says Dan Robinson. And the main thing this means is automation

Dan Robinson, Freelance


Over the past several years, "software-defined" has become another of those terms that gets used so much that its meaning becomes blurred. However, software-defined infrastructure is a key part of the way data centers are currently being reshaped in order to meet the changing requirements of modern IT and applications, which are now more distributed and dynamic in nature.

One of the stumbling blocks that enterprises and data center operators face is that IT infrastructure is increasingly complex to deploy and configure. Applications often depend upon an array of other services and resources to be in place before they can function properly.

Solving this problem calls for a certain degree of automation, according to Andy Buss, consulting manager for data center infrastructure and client devices at IDC Europe.

"Any business that is on a journey to digital transformation needs automation, as automation is the ability to put actions into IT policy," he said, adding that "moving from dev to ops is about being able to deploy [infrastructure] as easily as possible."

The implication of this is that the infrastructure must be capable of being reconfigured or repurposed using software, rather than having engineers go into the data center itself and physically reconfigure the infrastructure.

This kind of approach is already used extensively by many of the biggest Internet companies, such as Google and Amazon Web Services (AWS), as it is a fundamental requirement of operating a public cloud platform, where services may be continuously starting, scaling up, and eventually releasing resources again when they terminate.

In other words, "software-defined" refers to infrastructure and resources that are decoupled from the hardware they are running on, and are able to be dynamically configured and reconfigured under software control, typically using application programming interfaces (APIs).

$25.6bn revenue from SDDC in 2016 (MarketsandMarkets)

Among the first components of IT infrastructure to get the software-defined treatment were servers, through virtualization. A virtual machine running in a cloud is effectively a software-defined server, since it has been decoupled from the underlying physical hardware, and can be moved from one physical server to another in the event of a hardware failure, or for other reasons such as load balancing.

The past few years have seen the rise of containers, which are less demanding of resources than virtual machines. Platforms such as Docker also have more of a focus on distributing and managing applications as a collection of containerized functions.

As trends like cloud computing have become an increasingly important part of the data center, the move to make more and more of the infrastructure software-defined and therefore more agile has grown.

However, "software-defined" can mean different things when the term is used by different vendors, and there is not always a widely agreed standard definition. For example, software-defined storage can include storage virtualization, such as EMC's ViPR platform, which enables a customer to build their storage infrastructure using a mix of arrays, including those from multiple vendors, and manage it all from a single console.

But software-defined storage more commonly refers to software such as the open source GlusterFS that runs on a cluster of commodity server nodes and uses these to create a scalable pool of storage. This model is seen in hyperconverged infrastructure (HCI) systems, in high performance computing (HPC) clusters, and in very large Internet companies like Google and Facebook, because it is easy to provision and scale.

However, software-defined storage is only as good as the hardware it is running on, and may not be as reliable as purpose-built enterprise storage arrays. The assumption is that redundancy and other capabilities like data replication will be taken care of elsewhere in the software stack.

Software-defined networking similarly covers a number of different approaches designed to make the network infrastructure more dynamic and easy to configure as required. One approach is to separate the control plane, or management part of the network, from the forwarding plane, typically the switch hardware that actually routes packets around the network. The idea here is to centralize control over network traffic, using tools such as OpenFlow, a protocol designed to provide a standard mechanism for this, which is supported by many switch vendors, such as Dell, HPE, and Cisco.

A different approach is to virtualize the network, allowing the creation of logical networks that use the physical network to move data around, but which do not necessarily have the same IP address scheme, and may each differ in their quality of service and security policies. This latter approach is typified by VMware's NSX and the Neutron module in the OpenStack platform, and is vital for supporting multi-tenancy in data centers, especially those hosting public cloud services.

Virtualizing the network not only enables new network connections to be created without having to physically move patch cables around, but in the case of VMware's NSX, it also enables greater oversight of network traffic, since the switching and routing capability is integrated into the hypervisor and distributed throughout the infrastructure.

These strands - software-defined compute, storage and networking - do not exist separately, but are very often interdependent. Pulling them all together is the ultimate aim, in order to deliver what is termed by Intel and others as the software-defined data center (SDDC).

With SDDC, the entire data center infrastructure should be configurable under software control, making it easier to automate so that IT staff can devote less time to simply keeping the lights on, and this means that the fourth piece of the puzzle is management and orchestration.

"It's pointless having software-defined anything without automation and orchestration," said Buss. He cited tools such as Microsoft's System Center Operations Manager (SCOM) and VMware's vCloud Director as examples, but added that these have yet to reach the same level of maturity as those used by the big cloud providers.

Also less mature but gaining support among service providers is OpenStack, which is open and non-proprietary and thus independent of any single vendor, and which presents a set of APIs that can be used to plug in other software and services.

Overall, the SDDC approach has passed an inflection point and is growing in importance, according to Buss. "The public cloud demonstrates it is feasible, and as we move to a multi-cloud world, there will be a need for compatibility between clouds, which is driving a lot of thought in the industry now."

Getting the Synergy right

HPE's Synergy infrastructure platform, which the firm began officially shipping in January, is one example of what infrastructure designed from the ground up to be software-defined looks like, although HPE does not refer to it as such, preferring to call it "composable infrastructure." The big idea behind Synergy is that it is "defined by code," according to the firm, meaning that it is designed to be almost entirely driven by templates that specify the resources required for specific applications and services, and how these should be configured.

"It's a stateless computer, designed to be a reconfigurable pool of resources," according to Buss. It can be regarded as a set of storage, networking, compute and memory resources that can be pulled together to create a virtual computer for whatever task is at hand. The key part of Synergy is that HPE's updated OneView management system can provision and manage the bare metal, so it can be used to stand up a database cluster running on dedicated hardware as easily as provision virtual servers and applications running in containers.

The caveat is that Synergy is effectively a combination of special hardware and software that is available only from HPE. While customers can choose from a range of compute and storage nodes to suit their requirements, they cannot mix and match with third party kit within a Synergy deployment.

Open source SDDC

OpenStack is well known as an open source software project, widely used for cloud deployments. However, its strength lies in the fact that it is actually a management framework that is designed to be as open and extendable as possible, making it a general-purpose "integration engine" for operating IT infrastructure.

This can be seen from the number of telecoms operators that are turning to OpenStack as the linchpin for a modernization of their infrastructure, allowing them to replace costly specialized network hardware with software running on commodity servers that does the same job, an approach known as network functions virtualization (NFV).

One of the reasons for this is OpenStack's flexibility; it is now organized around core services including compute, storage and networking modules, while others are optional and can be used to provide services as required by the user, including telemetry, orchestration and even a database service.

Nova, OpenStack's compute module, is not tied to a specific hypervisor. It uses KVM by default, but also supports those from VMware and Microsoft, as well as container technologies such as LXC. Likewise, the Neutron networking and Cinder storage modules plug into a variety of other products and platforms, as in VMware Integrated OpenStack, where they are used atop VMware's NSX and VSAN, respectively.

If OpenStack has a weakness, it is that some aspects, such as automation and orchestration, are not as mature as in some proprietary cloud and data center platforms. However, thanks to its open APIs, developers and third party tools can fill in the gaps. These APIs mean that users can reconfigure their infrastructure under programmatic control, a fundamental aspect of a SDDC.

OpenStack has a wide number of backers in the technology industry, including Intel, which has been using the software itself internally since 2011 to manage its own infrastructure, as well as contributing code to the project.
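The template-and-pool idea that runs through this article - a template declares the resources a workload needs, and software carves them out of a shared pool of capacity - can be reduced to a toy model. The names and numbers below are invented for illustration; this is a sketch of the concept, not any vendor's real API.

```python
# Toy model of template-driven provisioning from a resource pool:
# a template declares what a workload needs, and capacity is allocated
# (or refused) entirely under software control. All figures are invented.

pool = {"cpu_cores": 64, "memory_gb": 512, "storage_tb": 100}

db_cluster_template = {"cpu_cores": 16, "memory_gb": 128, "storage_tb": 20}

def provision(template, pool):
    """Carve a template's resources out of the pool, or refuse."""
    if any(pool[k] < need for k, need in template.items()):
        return None  # insufficient capacity: nothing is allocated
    for k, need in template.items():
        pool[k] -= need
    return dict(template)  # a handle to the composed "virtual computer"

node = provision(db_cluster_template, pool)
# the pool now holds 48 cores, 384 GB and 80 TB of unallocated capacity
```

The point of the model is the API: because allocation is a function call rather than an engineer moving hardware, the same request can come from an orchestration tool, a template library, or a self-service portal.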

Containers open the cloud App containers take virtualization even further, says Max Smolaks. Think of it as yet another level of software-defined abstraction

While virtual machines have done wonders for hardware utilization, a new approach to workloads is emerging that could enable data centers to use their resources even more efficiently.

Kubernetes and Docker are the most popular examples of systems that automate the deployment of applications inside software containers. They package apps and all of their dependencies into individual environments that all rely on the same Linux kernel, which means they require less overhead than VMs.

App containers are completely independent of the underlying infrastructure and can be easily moved across servers, data centers and cloud platforms. At the same time, seamless transition between development, test and production environments makes this technology indispensable for DevOps. And finally, due to their policy-driven nature and isolation from the host, applications packaged in containers are fundamentally more secure.

Containerized apps represent an important step towards microservice architecture - where various services are packaged separately and chained together using orchestration tools. This approach is especially useful for massive software projects running in cloud environments, since it allows individual services within applications to be scaled up or down depending on demand. It also enables developer teams to work on software without having to be aware of what the other teams are doing.

Kubernetes was originally developed at Google, while Docker was created by Solomon Hykes at the eponymous Docker Inc (formerly dotCloud). Today, both are open source projects with a massive following, which means they will continue to evolve rapidly in response to the demands of the early adopters. Containers look like the future.
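The per-service scaling described above can be sketched in a few lines: each microservice gets its own replica count, driven by its own load, independently of the rest of the application. The service names, loads and 100-requests-per-replica target below are invented for illustration, not taken from any real orchestrator.

```python
# Toy sketch of demand-driven scaling in a container orchestrator:
# each service is sized independently toward a target load per replica.
# All figures are invented for illustration.

def scale(requests_per_replica, target=100):
    """Return the replica count needed to hit the target load."""
    total = sum(requests_per_replica)
    return max(1, -(-total // target))  # ceiling division, at least 1

checkout_load = [180, 190, 170]   # three overloaded replicas
catalog_load = [20, 15]           # two mostly idle replicas

checkout_replicas = scale(checkout_load)  # grows to 6 replicas
catalog_replicas = scale(catalog_load)    # shrinks to 1 replica
```

Because the busy checkout service scales up while the idle catalog service scales down, the cluster's capacity follows demand per service rather than being sized for the whole application's worst case.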


Software-defined? Let's aim for business defined!

The real benefit of the software-defined data center comes when you connect it to a layer of business logic, reports Chris MacKinnon

Chris MacKinnon
North America correspondent

Has the data center reached the pinnacle of automation? There is no doubt that facilities are more "robotic" and self-serving than ever before, but there could be more to come. Software is continuing its quest to become "the great definer" at the heart of the software-defined data center (SDDC), and the data center itself is taking on a more chameleon-like shape.

To Matt Altieri, marketing director at Device42, the SDDC means software-defined everything, where the idea of virtualization is now extended to all parts of the IT stack, resulting in delivery of infrastructure as a service (IaaS).

"Software-defined data centers are not only simpler and more manageable," Altieri told us, "but they also more easily align with companies' business needs. The speed of deploying systems is dramatically increased."

He says SDDCs have the flexibility to take on many configurations, capabilities, and forms, and can support companies ranging from the size and complexity of the FANG companies (Facebook, Amazon, Netflix, and Google) to companies with very much simpler operations.

A Special Supplement to DCD April/May 2017

The increase in simplicity realized from "converged infrastructure," as Altieri put it, paved the way for delivery of IaaS: Amazon Web Services changed the world of IT by pioneering a new way to deliver IT, and the other public cloud providers such as Microsoft, Google, and IBM soon followed. Altieri said these providers allow you to use application program interfaces (APIs) to create the infrastructure you need to run your business on demand. An SDDC differs from a private cloud, which only offers virtual-machine self-service, beneath which it could use traditional provisioning and management. SDDC concepts describe a data center that can span private, public or hybrid clouds.

Torsten Volk, managing research director at Enterprise Management Associates, and no stranger to the SDDC arena, believes that only in an SDDC can customers achieve policy-driven and fully-automated application provisioning and management. Volk said: "Today, when the request comes in to host a specific enterprise application, IT teams (for storage, networking equipment and servers) have to crack open multiple hardware vendor specific command-line interfaces (CLIs) and control panels to serve up the storage, network and compute resource pools needed by the virtual machine administrator to create the virtual application environment." But Volk said this is error prone, requires vendor-specific skills, and can take days or weeks. He said these issues are the reason for the rapid growth of Microsoft Azure and Amazon Web Services, where line-of-business (LoB) developers already have fully-programmatic access to everything they need.

The "business logic layer" has always been Volk's vision of artificial intelligence/machine learning-driven (AI/ML) intelligence that can manage hybrid IT infrastructure in a manner that accounts for the business importance of each application, as well as for the organization's permanently changing strategic priorities. In short, the business logic layer turns the SDDC into the business-defined data center (BDDC).

Volk elaborated: "The crux is that we can only make the step to the business-defined data center if we make IT operations management sensitive to the business context, instead of rigidly enforcing best practices for application management. For example, in a business-defined data center, the business logic layer makes the painful decisions in terms of which tradeoffs have to be made for IT to be optimally aligned with the business." This could mean accepting the slow performance of a specific application as a price for optimizing the user experience of another, much more business-critical application. Volk said these decisions cannot be effectively made in today's siloed IT, as individual operators do not have access to the business context.

For Michael Letschin, field CTO at Nexenta, SDDCs are currently changing the entire data center landscape. In the past, facilities were made up using big block systems from legacy players, purpose-built to support a narrow, pre-defined set of workloads. "This limited the end user's choice for technology and usage," Letschin said. "The newer software-defined storage model is shifting this paradigm and is giving organizations the ability to run an agile, scalable and cost effective SDDC."

Additionally, the selection of vendors for the data center has expanded and challenged the concept of "never getting fired for buying IBM," Letschin said. In today's SDDC, management and administrative staff are encouraged to look for innovative software-defined solutions on the market to address their data center pain points. He continued: "The benefits of transitioning to an SDDC also extend beyond the core technology to deliver significant space, power and cooling reductions." As a result of the clear business benefits that software-defined storage (SDS) provides, an increasing number of enterprises are shifting to the SDDC model.

26.6% – predicted annual growth in SDDC till 2020 (MarketsandMarkets)

According to Letschin, if the last five to ten years have been any indication, it will be almost impossible to predict the growth in the software-defined data center, but it will include integration with public cloud solutions to give even more "just in time" solution capabilities. He said: "We will see the integration of multi-data center capabilities. On the compute side, the rise of containers will bring the application back to the forefront and the idea of software-defined compute running on all full virtual machines will become obsolete. This increase in application-based compute will lead to more automation, self-service and self-supporting infrastructure, with AI taking a bigger role in actively managing the SDDC."

Letschin said storage will most likely undergo some of the biggest changes and, at the same time, remain the most consistent. Also, data integrity will continue to be key for compliance and security purposes. However, the use of commodity hardware will become more prevalent in the future, with the ability to cram more capacity into less space while providing higher performance. "One thing that will start to become obsolete in the data center," Letschin said, "will be 'performance-only' solutions, and general-purpose storage will likely make a comeback because of the flexibility inherent to SDS."

Overall, Letschin says the data center will continue to operate as the nervous system of the business, but will be much more agile than the data center we recognize today.

What do the next ten years hold?
The concepts behind the software-defined data center have been dismissed by some as hype. Some critics believe that a minority of companies with completely homogeneous IT systems already in place (FANG companies, specifically) can truly transition to software-defined data centers. The FANG companies have implemented this vision by investing years of engineering time and billions of dollars. But in our view, it is possible for any company to start on the road to ITaaS and reap its business-specific benefits.

One company's approach to ITaaS will likely be different than that of other companies, because it will be built to support unique business needs. For example, Facebook's data center was created specifically to meet the demands of its product offerings; Cassandra sprang out of Facebook's needs.

Therefore, rather than look at what the FANG companies have done and try to duplicate their capabilities, start by understanding your own data center and your business requirements. The task is to figure out what you have and then make that more adaptable. You must understand how to best support the applications needed by your business so they can scale and adapt to your needs. If your data center is starting from a point of too much complexity, you'll need to simplify and pare down.
Matt Altieri, Device42
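The "painful decisions" Volk describes can be pictured as a small policy function. This toy Python sketch (the application names, criticality scores and storage tiers are all invented for illustration) assigns scarce fast storage by business criticality, accepting slower performance for less critical apps – exactly the kind of tradeoff a business logic layer would automate:

```python
# A toy "business logic layer": allocate limited fast storage to the most
# business-critical applications first, rather than per-silo and
# first-come-first-served. All names and numbers are hypothetical.

def place_workloads(apps, fast_capacity_tb):
    """Give the fast tier to the most critical apps that fit;
    everything else falls back to the slower capacity tier."""
    placements = {}
    remaining = fast_capacity_tb
    for app in sorted(apps, key=lambda a: a["criticality"], reverse=True):
        if app["storage_tb"] <= remaining:
            placements[app["name"]] = "fast-tier"
            remaining -= app["storage_tb"]
        else:
            placements[app["name"]] = "capacity-tier"  # accepted tradeoff
    return placements

apps = [
    {"name": "order-processing", "criticality": 9, "storage_tb": 4},
    {"name": "analytics", "criticality": 5, "storage_tb": 8},
    {"name": "intranet-wiki", "criticality": 2, "storage_tb": 1},
]
result = place_workloads(apps, fast_capacity_tb=5)
assert result["order-processing"] == "fast-tier"
assert result["analytics"] == "capacity-tier"   # slower, by policy
```

A real business logic layer would draw criticality from business context rather than a hard-coded score, but the shape of the decision – optimizing globally instead of within each silo – is the same.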


Peace, love and SDN

Software-defined networking is changing the role of switches, and the companies that make them, says Max Smolaks

Max Smolaks
News Editor

Virtualization has been a blessing for data centers – thanks to the humble hypervisor, we can create, move and rearrange computers on a whim, without thinking about the physical infrastructure. The simplicity and efficiency of VMs has prompted network engineers to envision a programmable, flexible network based on open protocols and REST APIs that could be managed from a single interface, without worrying about each router and switch.

The idea came to be known as software-defined networking (SDN), a term that originally emerged more than a decade ago. SDN also promised faster network deployments, lower costs and a high degree of automation. There was just one problem – the lack of software tools to make SDN a reality.

This was the hurdle faced by all networking equipment vendors, but those who saw that SDN would eventually become the norm realized that success in this field would require a wide ecosystem of partners, even if they were also your competitors. As with so many other areas of IT, the answer is in open source.
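The "single interface" idea boils down to sending declarative requests to a controller instead of configuring each box by hand. The Python sketch below builds one such request; the endpoint and field names are illustrative only, since every controller (OpenContrail, NSX and others) defines its own REST schema:

```python
import json

def flow_rule(device: str, src: str, dst: str, priority: int = 100) -> str:
    """Describe a forwarding rule as JSON for a controller's REST API.

    The field names here are hypothetical: the pattern, not the schema,
    is what matters - declare intent centrally, and let the controller
    program every affected switch.
    """
    rule = {
        "device-id": device,
        "priority": priority,
        "match": {"ipv4-src": src, "ipv4-dst": dst},
        "action": "forward",
    }
    return json.dumps(rule)

# POSTing this payload to a (hypothetical) endpoint such as
# http://controller:8181/api/flows would update the whole fabric in one
# call, instead of one CLI session per router and switch.
payload = flow_rule("leaf-1", "10.0.0.0/24", "10.0.1.0/24")
assert json.loads(payload)["action"] == "forward"
```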


If developments are shared amongst a wide community, then progress can be quicker, as companies are not duplicating efforts within proprietary worlds. There is also a level playing field to compete on – but the best way to compete is to be on the teams building that playing field.

This was the scene in 2012, when network firms started buying SDN startups. Juniper Networks took the strategy seriously. It bought Contrail Systems – a secretive startup – on the cheap and quickly published its code under an open source license.

Today, the OpenContrail open source project has drawn in employees from Nokia, Mirantis, Symantec, Canonical, IBM, AT&T and NTT Innovation Institute, and the supported version has created a new, thriving business within Juniper itself. One of the company's flagship products is Contrail Cloud – a mix of OpenStack, OpenContrail, Ceph and Puppet, bundled together with a host of minor enhancements.

When Juniper bought Contrail for $176 million, it had never shipped a single product. Less than a year later it released the code under the Apache license. This deal makes more sense once you realize that Contrail was founded by former Juniper employees who worked on the popular MX series routers and switches. "People from Juniper left the company to go there, so there was always a relationship," Scott Sneddon, senior director for SDN and Cloud at Juniper, told DCD. "Even Kireeti Kompella, one of our early, really strong leaders in the development of MPLS [multiprotocol label switching] who had a lot to do with Juniper's success in its first fifteen years, he went to become a CTO at Contrail."

Those were the days when everyone was looking to buy into SDN. A famous example is VMware, which won a bidding war against Cisco for a higher-profile startup, Nicira, whose founders included Nick McKeown and Martin Casado, the researchers at Stanford who created the SDN concept. Nicira's SDN implementation is now in VMware's NSX. Nicira was somewhat more expensive than Contrail: VMware ended up paying $1.26 billion. "Two months later, we sent a rover to Mars for just double that," Sneddon joked. By comparison, Juniper's acquisition of Contrail seems like a bargain.

Fast forward four years, and OpenContrail is an important part of both Contrail Cloud and Unite Cloud – the latter, launched in January, is Juniper's data center framework that aims to simplify creation and maintenance of hybrid and multi-cloud environments. It includes access to the Contrail JumpStart service, which allows customers to experiment with open source tools.

OpenContrail and Contrail Networking – Juniper's own version – are identical in feature set. Furthermore, Contrail Cloud releases are aligned with OpenStack community releases to maintain API compatibility and keep strong community support. The paid versions just add enterprise-level support and installation.

"Even if the customer doesn't adopt an SDN solution right away, we still have a way to have that conversation and help them evolve," Sneddon said. "I've been working on SDN solutions for about five years now, and I can honestly say that in more than half the meetings I go into, SDN is probably not what the customer needs. But even without an SDN solution on top of it, we have a really good automation framework: we've done a lot of work to develop a bunch of Python, SaltStack, Puppet and Chef interfaces for our routers and switches.

"What we're doing is we're giving the network engineers tools to modernize how they manage their networks. So I go and I talk to a customer about SDN, because that's my job: I lead with Contrail and we talk about the evolution and cloud platforms and systems. But if I'm talking to a hardcore network engineer who just builds data centers and doesn't have visibility into the applications that are running – that oftentimes goes over their head.

"So I'll start talking about how you operate it – because really, what we are trying to do with SDN is just automate the network provisioning process, and optimize how the network works for cloud platforms."

Contrail is meant to complement OpenStack, which means it's designed for cloud computing at scale. The centerpiece of the platform is the SDN controller, which defines how the network is laid out and what the topology is. It is accompanied by virtual routers that have to be installed on every server and linked to the controller. There's also an analytics component which monitors the state of the network, and a number of APIs for control functions.

"The physical network that's in place just becomes a transport layer that passes the packets over a routed network," Sneddon explained. "And we know how to manage routed networks very efficiently. And then we take the really complex service layer, the things that have to change when I start a new application, or spin up fifty VMs that span across a massive data center, and we build overlay tunnels to support those – and really, an overlay tunnel is just a VPN."

Even with the relative success of Contrail, Juniper is not content to rest on its laurels: last year, the company acquired AppFormix, a cloud management and optimization startup. It develops a server-level tool, suggesting that the software-defined approach could turn network companies into something beyond their old role as hardware merchants.

"You'll hear these cloud guys talk about the network, where the only awareness of the network they have is when it's broken. They don't care if you're using BGP or MPLS or VLANs, as long as it's there and it works. As a side effect, the purchasing decisions – which switch or which router they buy – sometimes become less important," Sneddon admitted. "So for Juniper to be relevant in the future, we know that we have to play at a higher level. A lot of enterprise IT buying decisions are coming from cloud teams, not network teams. They have the dollars. So we feel like we have to develop a strong value proposition for cloud engineers and architects."

Opinion

Software-defined has to get critical

You can't have a software-defined data center until it handles the mechanical and electrical parts of the facility, says Bruce Taylor

Bruce Taylor
EVP, DCD

The true autonomous, lights-out data center, where the facility becomes a commoditized utility, may not be that far off, as the role of hardware shrivels and dumbs down. But has the ability to deliver the true software-defined data center (SDDC) been overstated?

A few years back, cloud was a pipe dream. Cloud required network, storage and compute to be combined in a cohesive, integrated, software-managed infrastructure. That demanded a highly abstracted (virtualized), automated, policy-based system that combined workload management, provisioning, failover, disaster recovery and security. And that package simply didn't exist.

The virtualization and cloud pioneers knew they could pool IT resources, but in the early days they still hadn't really given a thought to creating physical data center infrastructure, including the guts of the critical environment: power and cooling (thermal management).

Now the SDDC is in labor after a very long pregnancy, promising to deliver a software- and data-led unified toolbox that presents the data center as an abstracted private cloud, available to multiple customers. But the critical environment is still not there.

SDDC is needed, because digital transformation is here. In 2016 the world will hit 1,000 exabytes (one zettabyte) of data traffic on the Internet, says Cisco's Visual Networking Index, with Internet data globally projected to grow at a CAGR of 26 percent to 2.3ZB by 2020. We believe zettabytes need "zettastructure" – open source, software-defined, data-driven, hyperscale and autonomous infrastructure: true SDDC.

A MarketsandMarkets study estimates that the global market for SDDC will bring in $25.6bn in revenue in 2016, growing at a CAGR of 26.6 percent to $83.2bn by 2021. So SDDC is a real market, growing really fast. But these figures pull together the components of the IT stack only: software-defined infrastructure (SDI), composed of software-defined networking, storage and computing. The study includes services, such as consulting, integration and deployment, but it only counts the IT infrastructure stack, north of the rack. It leaves out the south, or physical MEP infrastructure side.

In our opinion, mechanical and electrical infrastructure (thermal and power management) systems must also become software-defined, where the software is data-driven, predictively analytical, policy-based and tightly integrated into IT-stack performance management. The true jump to SDDC occurs only when software automation and data-driven analytical intelligence are brought to bear on the physical critical-environment infrastructure.

Critical environment functions have been handled under the catch-all category of data center infrastructure management (DCIM) and, more recently, a higher order of function known as data center service optimization (DCSO), which seeks to integrate DCIM with IT service management (ITSM). However it is done, we need to see an end to the old silos.

For years in the industrial world, IT and operational technology (OT) have been treated as separate disciplines. Now fresh thinking and new technologies are giving IT the ability to automate and use OT data. There are those who don't think we need to integrate and software-define the full stack. DCD disagrees. During the past decade we have learned the folly of treating the logical and physical sides of the critical-environment world as different countries (if not planets). When design and engineering professionals on both sides of the border speak two different languages, this creates threats to uptime, availability, resiliency and efficient IT performance.

In the early days of virtualization, the pace of change was hard for facilities engineers to keep up with, as server, storage and networking technology advanced with every server refresh. Power and cooling infrastructure were static for the life of the facility – at least a decade. That is no longer true.

For now, the true SDDC may be limited to those organizations with deep pockets and monolithic applications – the vertically integrated hyperscalers and cloud services providers that can push out the boundaries of data center-as-a-service. But anyone requiring DevOps-style Internet-facing agility at the application and workload level will increasingly want these characteristics from its in-house or outsourced data center-as-a-service provider.

To meet the demands placed on them, data centers must become open source, full-stack integrated, software-defined and autonomous, right down to the lowest level of their infrastructure. We must move towards a world where cloud services and capacity will not require human touch, except where humans are unwilling to let control go to software.

"Software is eating the world," Marc Andreessen famously opined in the ancient history of 2011. That is now coming true in ways he could not have predicted.

> Webinars | 2017

NEW – Save the date!

The Path to the Cloud and the Role of the Network
Thursday 11 May, 15:00 BST

Speakers confirmed:

Andy Ingram
Worldwide Data Center Solutions Lead, Juniper Networks

Stephen Worn
CTO, DCD

Everything is moving to the cloud. Or is it? In fact, many applications are far from cloud ready, and some never will be. But almost every new application will be cloud-enabled. How do you evolve your data center into a private cloud and drive Hybrid IT by connecting to public clouds, and still cater for the needs of your legacy software?

This webinar will examine:
• Why many applications don't necessarily move easily into the cloud.
• The difference between older "mode 1" applications that lack flexibility and modern "mode 2" applications built specifically for the cloud.
• What is needed in order to have an architecturally coherent network environment that can support these multiple generations of applications.
• The importance of topology in the cloud-ready data center network, and the various choices available.
• How to use an open architecture ecosystem to drive automation by providing software control of the network and integrating it into the virtualization capabilities of the data center.

Attend this webinar and take away practical ideas to enhance your cloud and data center planning projects.

Register: http://bit.ly/2o4NCCK

For more information please contact [email protected]