July/August 2016 • datacenterdynamics.com

The Business of Data Centers

[Cover illustration: a cartoon data center machine whose parts carry jokey labels such as Virtualization Engine, Containerized Micro Data Center, Operating System Depository, Recycling Disk Masher, Repurposing Fragment Flume and The Internet of Things]

Contents • July/August 2016 • Vol 04 // Issue 15

4 Meet the team
5 Global editor's comment
7 News
14 Asia Pacific: Australian Tier IV – Reliability is an important distinction for a colocation provider
16 Latin America: Strength in Colombia – The ZFB Group will add two new data centers in 2017
18 Cover Story: Micro scalers take on the cloud – As the internet gets bigger, some data centers are getting smaller
22 How the feds helped build the cloud – The US government's "cloud first" initiative has boosted and secured the cloud industry
25 Energy crisis averted?
28 Innovators north of the rack
30 The road to software-defined
32 DCD community
34 Max's software-defined apocalypse

Our tracks guide your path – DCD magazine, website and events: Colo + Cloud • Design + Build • Core > Edge • Software-Defined • Power + Cooling • Security + Risk • Servers + Storage • Open-Source

The road to software-defined

The software-defined data center movement ignores the critical environment, says Bruce Taylor, and that's definitely not SDDC

Bruce Taylor
EVP, North America

$83.2bn – SDDC market in 2021 (MarketsandMarkets)

Software is becoming ever more sophisticated, adding performance analytics to policy rules-based management capabilities, and extending that into the realm of AI. The true SDDC is needed, because digital transformation is here. In 2016 the world will hit 1,000 exabytes (one zettabyte) of data traffic on the internet, says Cisco's Visual Networking Index, with internet data globally projected to grow at a CAGR of 26 percent to 2.3ZB by 2020.

The obvious drivers for this include the growth in mobile/wireless devices and applications; rich (including interactive) streaming media; the IoT, big data and analytics; robotics; drones and autonomous vehicles; 3D printing; VR/AR; cognitive computing and much more. We believe zettabytes need "Zettastructure" – open source, software-defined, data-driven, hyperscale and autonomous infrastructure: true SDDC.

The role of hardware is both shriveling and dumbing down, with intelligence being increasingly software-led. But has the ability of software to deliver the "true" software-defined data center (SDDC) been overstated?

A few years back, cloud was a pipe dream. Cloud required network, storage and compute to be combined in a cohesive, integrated, software-managed infrastructure. That demanded a highly abstracted (virtualized), automated, policy-based system that combined workload management, agile infrastructure provisioning, failover, disaster recovery and security. And that package simply didn't exist.

The virtualization and cloud pioneers knew they could pool IT resources, but in the early days they still hadn't really given a thought to creating a physical data center infrastructure, including the guts of the critical environment: power and cooling (thermal management). In the early days of virtualization, the pace of change was hard for facilities engineers to keep up with, as server, storage and networking technology advanced with every server refresh. Power and cooling were static for the life of the facility, at least a decade. That is no longer true.

Now, the SDDC is in labor after a very long pregnancy, promising to deliver a software- and data-led unified toolbox that delivers the data center as an abstracted private cloud, available to multiple customers, with hybrid cloud automation. But the critical environment is still not there.

A MarketsandMarkets study estimates that the global market for SDDC will bring in $25.6 billion revenue in 2016, growing at a CAGR of 26.6 percent to $83.2 billion by 2021. So SDDC is a real market, growing really fast. But that only tells part of the story.

These figures pull together the components of the IT stack only: software-defined networking (SDN) and, we assume, its companion, network function virtualization (NFV); software-defined storage (SDS); and software-defined computing (SDC). These pillars – SDN/NFV, SDS and SDC – combined as software-defined infrastructure (SDI) make up the IT stack. The study also includes services, meaning consulting, integration and deployment.

But the study has a big omission: it only counts the IT infrastructure stack, north of the rack, the logical side – and not the south, or physical MEP infrastructure side. In our opinion, mechanical and electrical infrastructure (thermal and power management) systems must also become software-defined, where the software is data-driven, predictively analytical, policy-based, and tightly integrated into IT-stack performance management. The true jump to SDDC occurs only when various elements of software automation and data-driven analytical intelligence are brought to bear on the physical infrastructure.

Critical environment functions have been handled under the catch-all category of data center infrastructure management (DCIM) and, more recently, a higher order of function known as data center service optimization (DCSO), which seeks to integrate DCIM with IT service management (ITSM). However it is done, we need to see an end to the old silos.

For years, in the industrial world, IT and operational technology (OT – the world of control engineering) have been treated as separate technical disciplines with disparate cultures. Now fresh thinking and the arrival of new technologies are giving IT the ability to automate and use OT data.

There are those who don't think we need to fully integrate and software-define the full stack. DCD disagrees. During the past decade we have learnt the folly of treating the logical and physical sides of the critical-environment world as different countries (if not planets). When design and engineering professionals on both sides of the border speak two different languages, this creates threats to uptime, availability, resiliency and efficient IT performance. Those risks still exist in far too many enterprises.

For now, the true SDDC may be those organizations with deep pockets and monolithic applications, the vertically-integrated hyperscalers and cloud services providers who can push out the boundaries of data center-as-a-service. But anyone requiring DevOps-style internet-facing agility at the application and workload level will increasingly want these characteristics from its in-house or outsourced data center-as-a-service provider. To meet the demands placed on them, data centers must become open-source, full-stack integrated, software-defined, and autonomous, right down to the lowest level of their infrastructure.

None of the component critical-environment architectures is immune to all this. And technology advances won't stop. Silicon photonics is now ready for rapid market adoption. Blockchain, the tech behind Bitcoin and other cryptocurrencies, could find its way into all kinds of industries and applications such as FinTech, electronic voting, smart micro-contracts and provenance verification for high-value art and gems. DCD predicts blockchain will be the common methodology for managing cloud-based workload capacity exchange contracts. Meanwhile, other technologies are being pumped up by venture capitalists, regardless of their actual viability. DNA-strand-based data storage doesn't yet exist outside of the R&D lab, but that "market" is already valued at $1 billion. Quantum computing is another non-existent market which already has a five-year-out valuation of $5 billion.

In future, to accommodate rapid growth in demand and shifts in the underlying platform technologies, we must move towards a world where cloud services and capacity will not require human touch, except where humans are unwilling to let control go to software. "Software is eating the world," Marc Andreessen famously opined in the ancient history of 2011. That is now coming true in ways he could not have predicted.

The autonomous, lights-out data center, where the facility becomes a commoditized utility, may not be that far off, and the global growth in data towards the zettabyte era will accelerate its arrival.
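A minimal sketch of the compound-growth arithmetic behind the MarketsandMarkets forecast quoted above. The figures come from the article; the Python itself is illustrative only:

```python
# Sanity-check of the SDDC forecast quoted above: $25.6bn in 2016,
# compounding at a 26.6 percent CAGR over the five years to 2021.
revenue_2016_bn = 25.6
cagr = 0.266
years = 2021 - 2016

revenue_2021_bn = revenue_2016_bn * (1 + cagr) ** years
print(f"Implied 2021 market: ${revenue_2021_bn:.1f}bn")
# Prints roughly $83.3bn - in line with the article's $83.2bn once rounding
# of the quoted growth rate is allowed for.
```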


Meet the team

Peter Judge, Global Editor (@PeterJudgeDCD) – Green Guru (Critical Environment). Also open source, networks, telecoms, international news.
Max Smolaks, News Editor (@MaxSmolaksDCD) – Captain Storage (IT & Networks). Also international incidents, data sovereignty.
Michael Hurley, Reporter (@HurleyMichael) – Watches developments in the public sector, as well as power distribution and outages.
Sebastian Moss, Reporter (@SebMoss) – Gets excited about open source software, security and high performance computing.
David Chernicoff, US Correspondent (@DavidChernicoff) – Former CIO, test lab leader and developer. Our man in Philadelphia gets his hands dirty.
Virginia Toledo, Editor LATAM (@DCDNoticias) – Editor of the LATAM edition of DatacenterDynamics. Breaking the molds. Based in Madrid, Spain.
Celia Villarrubia, Assistant Editor LATAM (@DCDNoticias) – Assistant editor of the LATAM edition of DatacenterDynamics. News and pithy opinions in the international edition.
Michael Kassner, US Correspondent (@MichaelKassner) – Our man in Minnesota. Fifteen years' enterprise IT writing on technology, science and business.
Paul Mah, SEA Correspondent (@PaulMah) – IT writer, also teaches tech in Singapore. Deep interest in how technology can make a difference.
Tatiane Aquim, Brazil Correspondent (@DCDFocusPT) – Our Portuguese-speaking correspondent with an in-depth knowledge of Brazilian public sector IT.

UNITED KINGDOM: 102–108 Clifton Street, London EC2A 4HW, +44 (0) 207 377 1907
USA: 28 West 44th Street, 16th floor, New York, NY 10036, +1 (212) 404 2378
SPAIN: C/Bravo Murillo 178 – 2ª Planta, 28020 Madrid, España, +34 911331762
SHANGHAI: Crystal Century Tower, 5/F, Suite 5B, 567 Weihai Road, Shanghai, 200041, +86 21 6170 3777
SINGAPORE: 7 Temasek Blvd, #09-02A, Suntec Tower One, Singapore 038987, +65 3157 1395

ADVERTISING
APAC: Vincent Liew
EMEA: Yash Puwar, Vanessa Smith
LATAM: Daniel Clavero, Santiago Franco
USA: Kurtis Friesen

DESIGN
Head of Design: Chris Perrins
Designers: Fay Marney, Holly Tillier

ASSOCIATE PUBLISHER: Ewelina Freeman
CIRCULATION Manager: Laura Akinsanmi

FIND US ONLINE
datacenterdynamics.com • datacenterdynamics.es • datacenterdynamics.com.br • .com/DCDnews
Join DatacenterDynamics Global Discussion group at linkedin.com

SUBSCRIPTIONS: datacenterdynamics.com/magazine
TO EMAIL ONE OF OUR TEAM: [email protected]

© 2016 Data Centre Dynamics Limited. All rights reserved. No part of this publication may be reproduced or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, or be stored in any retrieval system of any nature, without prior written permission of Data Centre Dynamics Limited. Applications for written permission should be directed to Jon McGowan, [email protected]. Any views or opinions expressed do not necessarily represent the views or opinions of Data Centre Dynamics Limited or its affiliates. Disclaimer of liability: Whilst every effort has been made to ensure the quality and accuracy of the information contained in this publication at the time of going to press, Data Centre Dynamics Limited and its affiliates assume no responsibility as to the accuracy or completeness of and, to the extent permitted by law, shall not be liable for any errors or omissions or any loss, damage or expense incurred by reliance on information or any statement contained in this publication. Advertisers are solely responsible for the content of the advertising material which they submit to us and for ensuring that the material complies with applicable laws. Data Centre Dynamics Limited and its affiliates are not responsible for any error, omission or material. Inclusion of any advertisement is not intended to endorse any views expressed, nor products or services offered, nor the organisations sponsoring the advertisement.

PEFC Certified – This product is from sustainably managed forests and controlled sources. PEFC/16-33-254 www.pefc.org

Small wonders

Why does the planet's biggest engineering project require tiny data centers? Last month we marvelled at the giant webscale data centers that power the cloud. This time around (p18) we look at the micro data centers that are needed to make sure it can reach all the places it has to.

The owners of giant data centers can consolidate workloads and gain from economies of scale, but others don't get that opportunity. They have to build and deploy small units of networked resources, in remote sites they can't control and can only visit rarely. The edge is breeding a new kind of micro data center. We expect to see these units adopt standardized, pre-configured hardware, take whatever kinds of power and cooling are available, and colonize the edge regions like armies of ants.

North of the rack, meanwhile, there's a different geography. For one thing, it's virtual. "North of the rack" means the software layers that increasingly control and manage the pools of virtualized computing, network and storage resources, as well as the underlying power and cooling that sits, metaphorically, south of the rack. North of the rack is the sphere that is defining and delivering the software-defined data center (SDDC). The action here is in making self-service provisioning for resources we can understand, and the leaders are delivering this through a barrage of open source projects. We honour some of the leading innovators in the area on p28, and follow that (p30) with an explanation of just what SDDC is – and why it ignores the underlying mechanical and electrical infrastructure at its peril.

In the United States, big data centers have their power demands under control (p25), contrary to the oft-repeated panic over energy-hogging sites. We also hear how the US cloud industry is benefiting from a government campaign (p22) that is defining ground rules to make everyone's cloud more secure.

1% – Annual growth in energy demand of large US data centers, 2010–2014 (LBNL / DOE). See p25.

With political meltdown all around us, Max’s back page article (p34) reassures us that our data center might one day be the only safe place left. Will you one day have to retreat there, prepared to eventually repopulate what is left of our planet?

• Peter Judge – Global Editor @PeterJudgeDCD

News Roundup

Microsoft to buy LinkedIn

Microsoft is set to acquire professional networking site LinkedIn for $26.2bn, 49.5 percent above its market value. Microsoft has promised the social network will retain its own brand and independence, but there's no comment on the future of LinkedIn's internal data center technology.

Apple pumps sewage

Apple will install a water-treatment plant to cool its Prineville, Oregon, data center with recycled water. The plant will create clean water from sewage and save five million gallons of tap water per year.

CyrusOne builds 30MW North Virginia data center in six months

CyrusOne has gone from ground-breaking to completion of a 30MW data center in 180 days, beating records for similar-sized facilities. The Sterling II data center was built on the company's 129,000 square feet Northern Virginia campus using 'massively modular' engineering methods.

"CyrusOne's goal has been to improve our supply chain efficiency to the point where we can deliver a completed data center in the same timeframe that our customers can order and receive the computing equipment that will reside there," said Gary Wojtaszek, president and CEO at CyrusOne. "We never want our customers' business objectives to be slowed down by the speed of their data center deployment."

And it doesn't stop there: the company has purchased 40 more acres of North Virginia land at the 154-acre mixed-use Kincora development, with a view to building its third data center in the region. The American IT infrastructure provider, a public corporation, said the acquisition will bring in more than $1bn in investment to the county and help the company deliver service to its East Coast Fortune 1,000 clients.

The Kincora project was established to develop retail and commerce space in Loudoun County. It includes upgrades to transport infrastructure and construction of more than four million square feet of new office space, as well as restaurants, shops and a hotel. The site north of Washington Dulles International Airport was abandoned for several years because of the dire state of traffic conditions at the confluence of the Route 7 and Route 28 roads. The project was only resurrected when the Commonwealth of Virginia government, through the Commonwealth Transportation Board, authorized an $80m loan, Loudoun Times reported.

CyrusOne runs 33 data centers worldwide – most of which are located in the US, with the exception of sites in London and Singapore. http://bit.ly/29ot8xS

Equinix takes China partner

Equinix is teaming up with Datang Telecom Group in China. The state-owned telecom equipment manufacturer will become the service delivery partner for Equinix in the country, offering products such as Equinix Cloud Exchange to businesses in the region.

VOX BOX / DCD VIDEO

Can service assurance software replace IT professionals?
Atchison Frazer, CMO, Xangati: "Counterintuitively, no. We are empowering the cloud infrastructure admins with cross-silo intelligence. Normally, if an end user complains about poor performance, you might point the finger at the application provider, but often it's not related to that. It's related to something like storage IOPS latency. We are making the data center admin's job more strategic." http://bit.ly/29bTkdG

What is happening in the Chinese market?
Scott Noteboom, CEO, Litbit: "China probably represents the fastest-growing data center market in the world. There's a culture of people doing new things who want to learn about how things have been done in the past, but at the same time have an open mind about doing things better, which to me is a perfect mixture. The Chinese way of doing things is a good match to open source culture." http://bit.ly/28Y06km


Amazon Web Services invites customers into Mumbai region

Following years of speculation, Amazon Web Services has finally come to India, with a new infrastructure region served from data centers in Mumbai. It's the sixth AWS region in Asia Pacific (APAC), and the 13th worldwide.

"Indian startups and enterprises have been using AWS for many years – with most Indian technology startups building their entire businesses on AWS, and numerous enterprises running mission-critical, core applications on AWS," said Andy Jassy, CEO of AWS.

The expansion is part of a $5bn investment that Amazon is making in India – which includes its largest software engineering and development center outside of the United States, located in Hyderabad. India's rapidly developing economy has resulted in high demand for colocation and cloud services; according to Gartner, the Indian data center market will be worth $2bn in 2016.

AWS says that more than 75,000 India-based customers are already using other AWS regions, including the likes of NDTV, Tata Motors, Bombay Stock Exchange, Infosys and Cognizant. From today, these companies will have an option to host their data closer to home. Initially, the new AWS region in Mumbai offers two Availability Zones, which means the company is running at least two geographically separate data center sites in the city with independent power, cooling and physical security. Some previous reports suggested that AWS is planning to build as many as five data centers in India. http://bit.ly/296XFN0

Russia's anti-terror law now requires web firms to store data for one year

Amendments to Russia's anti-terrorism laws will require significant storage facilities for the country's internet services and telecoms providers. The laws made it through the lower house of parliament with a vote of 325 to 1, and are expected to pass the upper chamber easily, for eventual signature by President Vladimir Putin.

Russia has previously made efforts to force companies to store data within the nation's borders, but the new law will additionally require companies to keep data they would normally delete. The rules state that "the organizers of information distribution on the internet" will be required to store metadata for one year, with regulators picking which companies they define as internet organizations. Meanwhile, telecoms companies such as Megafon, Beeline and MTS will have to store records of all calls and text messages for six months, and all metadata for three years.

The new rules have been criticized, not only for their privacy implications but also for the practical difficulties they create. Whistleblower Edward Snowden, who moved to Russia after he revealed the data collection practices of the NSA, tweeted "Russia's new Big Brother law is an unworkable, unjustifiable violation of rights that should never be signed." He added that the volumes required would make the law impractical, as well as dangerous.

Russian mobile network operator MTS said that six months' worth of data storage will cost the company 2.2 trillion rubles (USD$33.8bn). Spokesman Dmitry Solodovnikov told Kommersant (via the Moscow Times): "Our taxable profit for 2015 totaled 22.5bn rubles ($364m) and income tax totaled 4.5bn rubles ($69m). Taking the expenses [on data storage] into account, we won't be able to pay taxes on profit for about 100 years, and the state budget will not receive 450bn rubles ($6.9bn)." Search giant Yandex also came out against the ruling, saying that the amendments will lead to the "excessive limitation of the rights of the companies and users," as well as increasing costs. http://bit.ly/29bbqvm

Coolan finds server power going to waste

Data center servers often have power supplies that are too big for the job, resulting in wasted expense and inefficiency, according to live data aggregated by Coolan. The startup crowdsources live performance data from data center equipment, and a major study just revealed that a single cloud provider could save thousands of dollars by specifying power supply units more accurately.

Coolan's software analyzed the power used by 1,600 nodes and found that most of the servers' power supplies were running at only around 30 percent utilization, where efficiency is much lower. "This study was the first time that we used Coolan's data collection in a production environment, and it demonstrated a customer's inefficiencies," said CEO Amir Michael. "We found that power supplies put in for servers are typically rated for a much higher output than the server needs."

Most common power supplies have a power-efficiency curve, where they are at their most efficient at the highest levels of output, where efficiency is around 95 percent. In the data Coolan gathered, it turns out that most of the power supplies were operating at around 30 percent of their maximum output, where the efficiency is five percent less. The extra power that is used is turned into heat, and this has to be removed by cooling systems, so the actual amount of wasted energy is even higher.

"Our client was leaving more than $30,000 on the table every year," said Michael. "The amount of cost-savings could easily approach hundreds of thousands of dollars per year." http://bit.ly/292taNA
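A rough sketch of the arithmetic behind Coolan's finding. The efficiency figures come from the story above; the per-server load, fleet size and electricity price are illustrative assumptions, not Coolan's data:

```python
# Estimate the extra energy drawn when power supplies run at ~30 percent load
# (about five points less efficient than at peak, per the article).
# Assumed, illustrative inputs - not Coolan's actual data:
servers = 1600               # node count matching the scale of the study
load_w = 200.0               # assumed DC load per server
price_per_kwh = 0.10         # assumed electricity price, USD

eff_peak = 0.95              # efficiency near full PSU load (from the article)
eff_low = 0.90               # roughly five percent lower at ~30 percent load

def wall_power(dc_load_w: float, efficiency: float) -> float:
    """AC power drawn from the wall to deliver a given DC load."""
    return dc_load_w / efficiency

waste_per_server_w = wall_power(load_w, eff_low) - wall_power(load_w, eff_peak)
annual_kwh = waste_per_server_w * servers * 24 * 365 / 1000
print(f"Extra energy: {annual_kwh:,.0f} kWh/yr, about ${annual_kwh * price_per_kwh:,.0f}/yr")
# The wasted watts end up as heat, so cooling adds further cost on top of this.
```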


Cloud shift accelerates demand for agility

Businesses are outsourcing to the cloud at a pace faster than anyone expected. Half of senior enterprise IT executives expect to move the majority of their IT workloads to the public cloud, or to colocation providers, according to the Uptime Institute, the body that administers the Tier I-IV data center reliability certification. Out of these, 23 percent said they expect the change to happen by next year.

"The shift is occurring, and our findings show an industry in a state of flux. The demand for agility, advances in hardware and cost transparency has driven workloads to the public cloud. Our counsel to internal IT departments is to become more effective at articulating and showcasing their value to the business," said Uptime's Matt Stansberry.

For the sixth edition of its Data Center Industry Survey, Uptime spoke to more than 1,000 data center operators and IT practitioners worldwide, and found that legacy enterprise IT departments were shrinking due to budget pressures. Approximately half of enterprise IT departments reported flat or shrinking budgets, and 55 percent said their enterprise-managed server footprints were also flat or shrinking.

At the same time, 50 percent of respondents said they were satisfied or very satisfied with their primary colocation provider, despite the fact that 40 percent had to pay more than they initially expected. Just seven percent said they were dissatisfied or very dissatisfied with the service. Past surveys underestimated the speed of outsourcing because they omitted cloud resources commissioned behind the IT staff's back, sometimes called "shadow IT."

Uptime has advised internal IT departments to emulate service providers by building services that are easy to use, and to take up a greater role in directing corporate governance and evaluating security, costs and performance of IT for the business – wherever it might reside. http://bit.ly/28ThuMo

Singapore explores green options for tropical center

Singapore will test a data center designed specifically for tropical climates, as part of a bid to drive innovation and explore new green technologies. Initiated by the Infocomm Development Authority of Singapore (IDA), the Tropical Data Center (TDC) project will be set up in partnership with hardware makers and industry experts, and conducted in a test environment at a Keppel Data Centre facility. So far, vendors including Dell, Fujitsu, Hewlett Packard Enterprise, Huawei and Intel have signed up, as well as ERS, The Green Grid and Nanyang Technology University (NTU).

The project aims to demonstrate that data centers can function optimally at temperatures of up to 38°C, and ambient humidity of more than 90 percent. According to the IDA, data centers are typically cooled to between 20°C and 25°C, and kept to within 50 to 60 percent relative ambient humidity. Easing these limits could cut energy costs by up to 40 percent and reduce carbon emissions.

Running data centers at higher temperatures is nothing new, but such environments are typically built away from the tropics, or entail the use of air-side economizers in cooler climates. Details are still being worked out, but possible tests include controlling humidity but not temperature, or removing controls from both. The test will be set up in the third quarter of 2016 and will run simulated workloads for up to a year.

Data centers accounted for seven percent of Singapore's total energy consumption in 2012, and are projected to reach 12 percent of the country's total energy consumption by 2030. http://bit.ly/29c1Bhz

19% of firms expect to have all-flash storage in two years; 90 percent already have flash storage in their data centers

Cut cooling costs by raising data center temperature by as much as 9°F

DCD Webinars

DatacenterDynamics Webinars are a useful, interactive medium for companies to deliver audiences targeted messages that reflect sales and marketing objectives. To that end, DCD delivers bespoke content to webinar attendees with the support of an expert editorial team to help clients realize measurable payback on webinar campaigns.

Powering Big Data with Big Solar
Tuesday, July 12 2016 (11am PDT / 2pm EDT / 7pm BST) – REGISTER NOW
Join industry expert speakers – First Solar's Bill Thomas, Adam Kramer from Switch and DatacenterDynamics CTO Stephen Worn – to find out how cost-effective utility-scale solar options can support data centers in securing renewable supply. http://bit.ly/29sZ5CJ

FIND OUT MORE ABOUT OUR PREVIOUS DCD WEBINARS

//ON DEMAND: Smart choices for your digital infrastructure
Viewers learned how companies can create a reliable, flexible and cost-effective digital infrastructure. Expert insight came from DatacenterDynamics' senior global analyst Nick Parfitt, CTO Stephen Worn and Rob Cardigan from Nexans. Watch on demand here: http://bit.ly/1ZIfg2e

//ON DEMAND: Designing flexibility into your data center power infrastructure
Viewers learned from our expert speaker, Jawahar Swaminathan, best practices on creating long-term power flexibility and increased energy efficiency. Watch on demand here: http://bit.ly/1U7CNV8

//ON DEMAND: Is hyperconvergence a viable alternative to the public cloud?
Our panel speakers – Eric Slack from Evaluator Group and Rich Kucharski from Simplivity – discussed ways to improve efficiencies and reduce operational complexity through hyperconvergence. Watch on demand here: http://bit.ly/1syLUb4

//ON DEMAND: Your "single pane of glass" solution has arrived
Viewers learned how converging OT/IT data can give a granular insight for digital infrastructures. Matthew Brown from Hewlett Packard Enterprise, Gerry Lagro from OSIsoft and Brian Polaski from RoviSys provided industry insight. Watch on demand here: http://bit.ly/1qGGVE8

//ON DEMAND: Best practice in modular design
Starline's Mark Swift joined our DCD webinar moderator Stephen Worn to discuss power management and distribution, as well as the pros and cons attached to both. Watch on demand here: http://bit.ly/1TwEOuD

//ON DEMAND: SDN/NFV: The revolution is here. Are you ready?
This webinar explores the use of software-defined networking and the belief that it will deliver increased flexibility, reduced cost and greater productivity. We hear from IDC research director Brad Casemore and Jeremy Rossbach from CA Technologies. Watch on demand here: http://bit.ly/1qGGTfo

News Roundup

Optical fiber link gives Nepal independence from India

China and Nepal have been connected by an optical fiber link, ending Nepal's reliance on India for all of its networking needs. The cable goes through Geelong (Keyrong)-Rasuwgadhi, after a previous attempt through Tatopani-Zhangmu (Khasa) failed due to the April 2015 Nepal earthquake. The new route will increase speeds thanks to the shorter distance to the regional data center hub of Hong Kong, and should create a competitive environment for services. Nepal Telecom has also agreed with telecom operator China Unicom for further cross-border optical fiber connectivity.

A landlocked nation, Nepal depends on good relations with its two giant neighbors. The Indian border crosses low plains with numerous trade routes, and the Chinese border is in the Himalayan mountain range, so 90 percent of Nepal's trade passes through India. However, Prime Minister KP Sharma Oli has begun to shift Nepal towards a Sinocentric world view. http://bit.ly/2953mhr

Cisco promises all the analytics you could ever want

Cisco has launched Tetration, a real-time data center analytics platform designed to give visibility into "every packet, every flow, every speed." The product uses hardware and software sensors to gather information and analyze it by applying machine learning.

Cisco says users will be able to "search across billions of flows in less than a second," and have what amounts to a "time machine for the data center," said CEO Chuck Robbins, allowing a "rewind" capability that lets users replay past events, plan for the future, and even freeze time to see what happened at a specific point.

The software sensors support Linux and Windows hosts, and hardware sensors are embedded in the ASICs of Cisco Nexus 9200-X and Nexus 9300-EX network switches. One Tetration appliance can monitor up to one million unique flows per second. The servers and switches are pre-wired and the software is pre-installed, but support is available. http://bit.ly/29kX3ED

Peak Hosting goes bankrupt after losing Game of War

Oregon-based data center company Peak Hosting has filed for bankruptcy and laid off 135 workers after losing freemium mobile game developer Machine Zone (MZ) as a client. The company behind Game of War and Mobile Strike cancelled its contract after an outage took Game of War offline for two hours in October. The former partners have exchanged bitter words and are now suing each other: Peak claims that MZ still owes it $100m, while MZ has filed its own claim for $23m.

With around 50 employees left, Peak says it is still a viable company after the loss of its main client, which represented 80 percent of its business. "It is a very sound and strong company now that it has been downsized," said Peak's newly hired chief restructuring officer Mark Calvert. However, the company noted that it will need funding for the litigation. Its bankruptcy filing lists assets of $100m-$500m and liabilities of $50m-$100m.

Peak blames the outage that sparked the disagreement on a software bug in Cisco hardware. Peak also said that it spent $35m on equipment to deal with MZ's load, and that the game developer's new data center uses proprietary Peak technology.

MZ has quickly become a major player in the free-to-play mobile games space, with Game of War: Fire Age bringing in an estimated $550 per paying user – more than that of any other mobile title, and well above the estimated freemium average of $87. The developer has spent more on television advertisements than any other gaming company, including a Super Bowl spot. http://bit.ly/1XtvVYN



Aegis plans to build with VR

Virtual reality could prove a handy tool in the planning and building of data centers, according to British colocation provider Aegis Data, allowing architects and data center operators to visualize the building before it is constructed. This year has seen the release of the Oculus Rift and HTC Vive headsets, and PlayStation VR is coming this October. At Google I/O, the search giant announced Daydream, a set of specifications to VR-optimize Android phones. “When acquiring a site or planning to upgrade an existing one, it is important for data center operators to visualize the space required,” said Greg McCulloch, CEO of Aegis Data. Aegis also foresees VR being useful in sales pitches, allowing companies to show customers a site “without leaving the confines of the office.” VR can also allow customers to see the location and security of their data, while exploring a site’s physical security measures. Meanwhile, VR use is expected to increase demand for data centers. The huge processing requirements of virtual reality could lead to more and more GPUs finding their way into data centers, according to Robert Scoble, while virtual reality has already become a dominant driver of new traffic growth in the UK.

http://bit.ly/298MLLS

AT&T abandons plans Crisis forces Rio to pull for ‘secret’ data center supercomputer’s plug

AT&T has abandoned plans to expand its Short Hill The Santos Dumont supercomputing cluster, one Mountain facility, after local protesters claimed the of the largest in Latin America, has been switched project was a data center in disguise. off since May, as the state government of Following complaints, the telecoms giant has Rio de Janeiro goes through a state of suspended its plans for the 160,000 square feet “financial calamity.” facility that would have required two million The cluster, based at the National gallons of water annually, and which AT&T Laboratory for Scientific Computing described as a “utility substation.” (LNCC) in Petropolis, opened in January AT&T’s principal technical architect Scott but has been disconnected for the Rushin informed Loudoun County’s planning past two months because the Rio state and zoning director: “We do not come to this government has concluded it cannot pay decision easily. The facility is a vital part of our an electricity bill that would amount to global telecommunications network… And, about $150,000 (R$500,000) per month. contrary to speculation, the site is not a data Compared with hosting the Olympic Games in center, and our planned upgrade would not a city that has warned it is facing a “total collapse have converted it into one.” in public security, health, education, transport and AT&T claimed the site would assist local environmental management,” an idle supercomputer broadband, but a report by the Loudoun County is a small worry. But local media reported that the Communications Commission concluded that the cost of the machine was dropped from an adjusted site would have had no benefit for the residents, budget by the state Ministry of Finance, and Augusto after conducting an evaluation that AT&T Gadelha, director of LNCC, said the system could declined to assist with. not be paid for as its power demand would swallow The commission said that “no evidence was up 80 percent of the center’s remaining budget. found of AT&T planning or requesting a cable video The Santos Dumont cluster has a capacity of or wired internet access distribution network, 1.1 petaflops, delivered by three systems: the CPU, or wire center from the facility to residents or AU$1bn GPU and hybrid. The GPU alone can provide businesses generally in the county.” 456 teraflops and ranks 476th in the Top 500 It stated: “The Communications Commission Value of supercomputers in the world. therefore concluded that there was no evidence the Lockheed Martin At the moment, the system is said to be switched facility would serve county residents or businesses consolidation deal off indefinitely, and questions have been raised over generally for either wired, wireless/cellular, cable or with the Australian whether the water-cooled system will deteriorate if broadband internet access.” left out of service for a long time. Supervisor Geary Higgins (R-Catoctin) Department of One of the tasks the supercomputer was commented: “This is, and has been, a government Defense supporting was a genetic mapping of the Zika virus, site up there since 1963.” which is currently a huge public health concern in Brazil in the lead-up to the Olympic Games. http://bit.ly/292wDvu http://bit.ly/293w9nt


Asia Pacific

Bringing Tier IV to Australia

Reliability is an important distinction for a colocation provider. Micron21 tells Paul Mah about the road to certification

Paul Mah
SEA Correspondent
@PaulMah

Uptime Tier facts
• The Uptime Institute has awarded more than 800 certificates
• There are certified facilities in 74 countries
• Tier I ensures basic facilities, Tier II mandates redundant components, Tier III needs concurrently maintainable systems, Tier IV is fully redundant
• Uptime's M&O stamp evaluates procedures at facilities that are not Tier certified

Tier III doesn't always mean Tier III. All too often, data centers claim to have the Uptime Institute's reliability certificate without actually being accredited. Last year, the Uptime Institute responded, changing its Tier-rating system to make this kind of abuse harder.

But that's not a criticism that could apply to Micron21, a vocal supporter of accreditation, which is engaged in an upgrade that will give it Australia's first Tier IV facility. Micron21's data center in Melbourne is Tier III and Tier IV Design Documents (TCDD) certified, and the plan is to follow through with a Tier IV Constructed Facility (TCCF) certification.

James Braunegg, managing director of Micron21, is not your typical data center operator. He started the business some six years ago, within a family-owned printing firm, using spare network connectivity and facility space to offer hosting as a separate service. Hosting grew, and in 2014 the printing business was sold, freeing up additional space. In July 2015 the decision was taken to grow the data center footprint from its original 10 racks of capacity. The expanded data center now offers 100 racks with 2MW of incoming power, and all the redundant power and cooling hardware mandated by the Tier IV standards.

Micron21 has an unusually dense deployment in its colocation facility, serving a client base of 1,000 customers from those 10 racks. By contrast, its network is positively outsized: Micron21 is directly peered to around 1,600 networks globally, which is astounding for the company's size. Micron21 runs its own distributed denial of service (DDoS) protection service with 700Gbps of capacity, which is used by ISPs, content providers and hosting companies. "We were actually selling services, not space. I was more of a managed service provider. However, I controlled and owned the building, the power, the cooling and the network," says Braunegg. Micron21 has customers ranging from small and mid-sized businesses to Australian government departments needing fault-tolerance.

Braunegg expanded and upgraded his existing data center with zero downtime for existing customers. That's an impressive achievement, but why not build a new data center instead? The process of upgrading is often said to be akin to rebuilding the engine of a car while it's still in motion. "For me to go buy another building, I want to buy a large building that supports a thousand racks. [What would have been] an AU$8m investment for me will turn into AU$80m," explains Braunegg. "What I do have is revenue from my existing clients that is allowing me to tune our facility into the best in Australia. I'm providing myself 10 times the capacity – the existing clients alone justify the investments."

Being Tier IV certified will be a big benefit, says Braunegg: "It is a massive strategic advantage over everybody. To say that you are the only Tier IV data center in Australia is a massive boost to credibility."

Braunegg advises businesses to challenge claims of Tier compliance by data center operators who say they are "built to" specific Tier standards, but are not certified. "Show me the certification, show me what you are saying is actually true," he says. "Why not get certified? Unless each and every component has been designed, constructed and integrated correctly, then you are not [a Tier-certified facility]."

The journey to Tier IV began in earnest last July, and Braunegg estimates that it should be completed within the next six-to-eight weeks. So what did Micron21 learn? "Tier IV data centers globally are still quite rare," says Braunegg. "There are assumptions that people make, but few people really know what is required." One of the biggest misconceptions has to do with mains power: the fallacy that Tier IV data centers must be connected to two separate power grids. "Uptime Institute assumes you don't have mains power. The ability to run your own data center is your own ability to generate your own power, redundantly," he says. "[Uptime] don't test your data center with mains [power]; they test the mains failing."

Building a Tier IV data center is clearly not for everyone, even if Micron21's high customer density makes full redundancy logical. "It is expensive to build a Tier IV data center if the data center has one customer per rack. I have many customers per rack; hundreds of businesses potentially. That makes redundancy important," he says. "It is an amazing journey – an extreme learning curve – but the benefit is that our customers know what we have built is world class, and an Australian first."
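A back-of-the-envelope reading of the capacity figures quoted above, assuming the incoming feed is spread evenly across the racks (which the article does not state):

```python
# Average power available per rack, from the figures in the article:
# 100 racks sharing 2MW of incoming power. Even split assumed; in practice
# some of that feed goes to cooling and other facility overhead.
racks = 100
incoming_kw = 2000.0

print(f"Average available power: {incoming_kw / racks:.0f} kW per rack")  # 20 kW
```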

Latin America

Strength in Colombia

ZFB Group will add two new data centers in 2017 to the two it already has in the capital. Celia Villarrubia reports

Celia Villarrubia
Assistant Editor LATAM
@DCDNoticias

Build a strong network of Tier III data centers in Colombia to meet the country's growing demands for outsourcing, colocation, business continuity and disaster recovery – that's the mission Zona Franca de Bogotá (ZFB Group) set itself, and that's why ZFB is opening two new data centers in 2017, to join the two it already has in Bogotá. ZFB has allocated around $40m for the new facilities – one in Zona Franca del Pacifico (the Colombian Pacific free trade zone) in Cali and the other in the Zona Franca de Tocancipá (the Tocancipá free trade zone), 39km from Bogotá.


The projects are in the design phase, and it is expected that construction will begin in the second half of 2016, with the centers up and running, providing service to customers in the second half of next year.

"In Colombia, customers are increasingly aware of the importance of getting information in real time and keeping it permanently, using data centers that are run by professionals with appropriate procedures and certifications," says Juan Pablo Rivera, president of ZFB Group.

There is big demand for business continuity services from companies in the free trade zones, and ZFB has decided it needs two new data centers to meet demand – providing business continuity through the Tocancipá free trade zone and disaster recovery from the Pacific free trade region.

When the new data centers are fully operational, ZFB Group will have 4,080 square metres of white space, three times the space it currently has for data center services. The existing data centers are located in the ZF Towers and the Bogotá technology park, and each one has 680 square metres of white space. The latest facility has been running for three years and has just opened its second 340 square metre room, which has improvements such as free cooling.

The two new data centers in Cali and Tocancipá will each be twice the size of the Bogotá facilities, reaching 1,360 square metres of technical room each, available in incremental steps. "Next year we hope to offer 340 square metres of white space in each of the new data centers; we plan to grow them gradually, until they are fully occupied in about two-and-a-half years," says Rivera.

Built in parallel, the two new ZFB Group data centers will be similar in design and capacity. Specifically, the electrical capacity will be 1.5kVA per square metre, while 15 percent of the area will be aimed at a higher density – around 6kVA per square metre.

With power from three electrical substations, the Pacific free trade zone site will have plenty of redundancy in the provision of electrical service. "Right now we are doing electrical adjustment for the data center to have a 34.5kV line. We will start with a line that will have 6MW of power available and, to the extent that the data center requires, we will increase the power," says Rivera.

The climate of Cali does not allow free cooling in the free trade zone, but ZFB Group is considering cooling the site using water resources that abound in the area. Outside air cooling works fine in the Tocancipá free trade zone. That site also has a 34.5kV line for the data center to use. With these measures, the company expects PUE levels below 1.6 for both of the new installations.

As far as availability is concerned, ZFB Group aims to certify both data centers as Tier III, just as it has done with its previous data centers. The facilities at Bogotá Park Towers already have Tier III certification for design, and the group plans to achieve the Tier III constructed facility certification. The same process will be applied to Cali and Tocancipá.

In the future, ZFB Group plans to deliver reliable certified infrastructure with redundancy and security, and is looking beyond its current locations. Depending on how the market develops, the company is considering other cities outside Bogotá and also beyond Colombia, Rivera says. To get a presence in other Latin American countries such as Peru and Chile, the group is considering the possibility of alliances with other infrastructure suppliers. The ultimate result would be a regional network of Tier III data centers, added to the four data centers it will have around Bogotá next year.

ZFB's network in 2017
• 2,720 square metre new white space
• 4,080 square metre total white space
• PUE of less than 1.6
• $40m investment in two years
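For readers unfamiliar with the metric, PUE (power usage effectiveness) is the ratio of total facility power to IT power, so "below 1.6" means facility overhead of less than 60 percent of the IT load. A minimal sketch with illustrative load figures, not ZFB's:

```python
# PUE = total facility power / IT equipment power.
# Illustrative numbers only - not ZFB's actual loads.
it_load_kw = 1000.0           # assumed IT load
overhead_kw = 550.0           # assumed cooling, distribution losses, lighting, etc.

pue = (it_load_kw + overhead_kw) / it_load_kw
print(f"PUE = {pue:.2f}")     # 1.55, inside the sub-1.6 target quoted above
```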


Cover Feature

[Illustration: the cover's cartoon micro data center machine, its parts labeled Virtualization Engine, Coolant Moisture Spout, Bubble Router, Encouraging Efficiency Booster, Supersonic Shock Shoot and Emergency Ventilation Funnel]

The internet is getting bigger, but some data centers are getting smaller, reports Michael Hurley

Michael Hurley
Reporter
@HurleyMichael

As the world around us shrinks through the globalization brought on in large part by the internet, the computer networks sprawling out to support it are growing in increasingly unpredictable ways. The internet has become huge, and applications are ever more data-hungry. The data center industry's answer to that challenge is counter-intuitive: miniaturization.

Micro data centers – small, containerized, prefabricated systems that hold all the components of a traditional data center in a compact unit – are increasingly being distributed to meet computing demands at the edge of networks, in second- and third-tier cities.

Growth in the micro data center market, fuelled by the Internet of Things (IoT), will drive it from $1.7bn in 2015 to $6.3bn in 2020, according to a MarketsandMarkets report published in October 2015 titled Micro-Mobile Data Center Market by Applications. North America is expected to be the largest market in 2020, with developing countries such as India, China and Brazil growing fast.

As well as the IT stack, these units include on-board cooling, telecommunication and storage systems, security, fire suppression and UPS. They can be custom-built – from a single rack to multiple racks – depending on the customer's requirements, and have the benefit of minimizing a data center's physical footprint and the energy consumed, when compared against the traditional model.



Although solutions built in shipping containers often fit the micro data center definition, in recent years such is the extent of the shrinkage that they may reasonably be said to have been edged out of qualification. The Mobyl Data Center, for example, manufactured by Arnouse Digital Devices Corp, fits credit card-sized servers into a suitcase-like container originally designed for the US Department of Defense. Made available for commercial applications at the request of Wells Fargo Bank, the Mobyl Data Center is being used in special configurations as communications servers and, in one instance, to run an auditing system for criminal investigations.

"I don't think this is an overnight move," says Tanuja Randery, president for UK and Ireland at Schneider Electric. "People have not stopped investing in centralizing data centers – you still need places where data is following a hub-and-spoke approach. But what we have been seeing happening over the past five-to-seven years has been a shift towards people recognizing that a distributed environment is more efficient and effective."

As a cooling and power provider, Schneider has long served giant facilities, but recently acquired APC, whose modular division gave it the technology it needed to go after the micro data center space. Randery says the proliferation of devices that populate the IoT, particularly sensors and meters in the industrial environment, has been a key driver of growth in the market: "More data is now being created at the edge on our smartphones, by all the sensors, meters and industrial processes. The industrial IoT is a bigger phenomenon."

These devices feed data into the system and increase data processing requirements on the edge of the network. They drive the need for data to be stored on the edge, both for easy access and also to prevent it travelling long distances, she says.

The BBC's chief scientist, Brandon Butterworth, recently reviewed the broadcaster's IT infrastructure and established that a distributed network of data centers would provide a more resilient service that could be properly tailored to users' demands.

Schneider Electric's modular data center deployment to the Sagrada Familia church in Barcelona won a DatacenterDynamics award in November 2015 for meeting the site's computing infrastructure requirements inside 16 weeks. Alongside speed of deployment, other benefits of micro data centers include lower operating costs over time and the ability to tailor them to the customer's requirements.

Small systems are also useful in retail environments, where stores want to measure footfall by installing Wi-Fi networks and tracking shoppers' movement patterns. These data requirements aren't latency sensitive, but a micro data center proves more economical. Petrochemical firms need to process large amounts of geological data to zero in on new sources of gas and oil. Often, this processing needs to be done rapidly on-site – an ideal task for which a lightweight, portable data center is suited. In other cases, latency may be very important. For instance, army operations in war zones might deploy a ruggedized micro system.

In the financial technology sector, the time it takes to deliver data is similarly important, where traders need their data center close to the trading environment. In the old days, a bank might have to decide whether to invest $500m in a new data center, based on a forecast of demand little better than a 'finger in the air'. Now smaller data centers can be installed that are as good as the large sites in terms of availability, flexibility, cooling and sustainability, but are much more economical, says Randery, "and can be tailored so you aren't required to buy something you don't need."

Market analyst Clive Longbottom, director and founder of research company Quocirca, says potential uses for micro data centers include when a business, in setting up a new office far from a central data center, needs to install a small system within the office to support its operational requirements. Companies might also want to deliver compute capacity in remote locations that lack either technology or staff with technical expertise, or both, where the infrastructure can be operated from a distance. These locations might include the Australian Outback, central India or various chunks of South America, Longbottom says, where these small autonomous systems can be deployed rapidly and without incurring the cost of hiring and installing staff possessing the necessary skills. A business might simply need to expand capacity because it has run out of space in its primary facility. In this eventuality it would be cheaper to invest $500,000 in a small system than $50m in a new data center, Longbottom says.

US video streaming service Netflix used micro technology to its advantage when it observed that its US Midwest services were being underused, taking into consideration regional population size and demographics. "Latency between the Midwest and the main data centers was too much. The time it took to download a Netflix film was too long; trying to watch something live was impossible," Longbottom says. "They installed this micro data center, used data caching and found all of a sudden that usage went up. They could say it is now worth our while putting in millions of dollars to build this center in this environment and move that micro center to test demand in the Deep South."

In the future he envisages a 'meshed system' developing to address IoT latency issues, where a micro data center located close to the network edge operates as a caching area, but with a central data center still controlling traffic from afar (a minimal sketch of this caching pattern follows this feature).

"We are seeing something the size of Paris being built on an annual basis," says Randery. "If you think about the kind of explosion that is going to happen, we need ways for us to be able to deal with that data and improve the customer experience."

Hyperconverged infrastructure
Hyperconverged infrastructure systems can be a particularly good base for the construction of micro data centers, as they provide all the essential building blocks likely to be required in their different use cases. Vendors such as Nutanix, Scale Computing and Simplivity led the way, with mainstream vendors such as HP, Cisco and Dell legitimizing the trend by offering their own kit. Vendors offer pre-configured stacks of hardware. These can have a small footprint and use less power than equivalent stacks constructed from different vendors' kit. However, the major benefit for micro data centers is the fact that these systems can be delivered quickly and installed without specialist technical knowledge. They are also designed to operate remotely and autonomously, with control from a single screen, all of which is important for systems that are being deployed in large numbers across wide geographies. These modular systems can scale out with the addition of supplementary modules, so performance can be improved when and where extra capacity is required. Some pre-manufactured systems can prove rigid, however, in that if additional storage is required, additional compute must be added at the same time. Build-your-own platforms, such as VMware's VSAN, often allow granular tweaks to either storage or compute.
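The 'meshed system' described above is, at its core, an edge cache sitting in front of a central origin. Below is a minimal, illustrative sketch of that pattern in Python; the cache capacity, keys and origin fetch function are hypothetical placeholders, not any vendor's implementation.

```python
# Illustrative edge-cache sketch: a micro data center answers requests from a
# local cache and falls back to the central data center ("origin") on a miss.
from collections import OrderedDict

class EdgeCache:
    def __init__(self, capacity=1024):
        self.capacity = capacity
        self._items = OrderedDict()   # key -> cached payload, kept in LRU order

    def get(self, key, fetch_from_origin):
        if key in self._items:
            self._items.move_to_end(key)      # cache hit: served at the edge
            return self._items[key]
        value = fetch_from_origin(key)        # miss: the central site still owns the content
        self._items[key] = value
        if len(self._items) > self.capacity:
            self._items.popitem(last=False)   # evict the least recently used item
        return value

# Example with a stand-in origin; a real deployment would call the central facility.
cache = EdgeCache(capacity=2)
origin = lambda k: f"payload-for-{k}".encode()
print(cache.get("film-123", origin))   # first request travels to the origin
print(cache.get("film-123", origin))   # repeat request is served locally
```

A least-recently-used eviction policy is used here only because it is a common default; in practice the central data center would also push invalidation and replication policies out to each edge site.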


How the feds helped build the cloud

The US government’s “cloud first” initiative has boosted and secured the cloud industry, says David Chernicoff

and Budget’s Office of the CIO (OFCIO). cloud services companies implement these, Agencies will be required to use the following it will go a long way towards improving their David Chernicoff guidelines, in the order listed below, as part overall security. It will also help in the market US Correspondent of their consolidation efforts: as, without question, security is the number 1. Transitioning to provisioned services, one concern expressed by all customers of @DavidChernicoff including configurable and flexible cloud service providers. technology such as software-as-a-service (SaaS), platform-as-a-service (PaaS) and Given the level of effort that well-known he Federal Data Center infrastructure-as-a-service (IaaS), to the cloud providers such as Amazon, Microsoft, Consolidation Initiative furthest extent practicable, consistent and Google need to go through for FedRAMP (FDCCI) handed a mandate with the cloud first policy. accreditation, they are likely to take the to the federal government 2. Migrating to interagency shared services simplest route and apply technical security to reduce the costs and or colocation data centers. capabilities through all versions of their increase the efficiency 3. Migrating to better optimized data centers product lines, while adding government- Tof government-operated data centers. within the agency’s data center inventory. specific requirements to those that are How can private industry profit from offered to federal agencies. the government’s experience? This centralized clearing house for All of the most familiar cloud service Originally launched in 2010, the FDCCI consolidation efforts provides clarity, and providers have been FedRAMP-authorized for was signed into law in 2014 as the Federal the cloud first effort demonstrates demand. their core cloud services. But applications also Information Technology Acquisition The monetary value of the contracts see the benefit of the security compliance Reform Act (FITARA). In 2016 it was further being offered by the government for cloud audits necessary for authorization. For refined by the Data Center Optimization technologies could make or break even example, Microsoft Office 365 Government Initiative (DCOI). This sets out the “why a good-sized company, depending upon has passed through the security assessments and how” of consolidation efforts, and its ability to handle, or even be eligible, needed to allow it to be deployed at any emphasizes a “cloud first” approach to for the work. The commercial vendor agency. This means that not only can the government IT, which was first suggested community has responded in providing familiar Office applications be used, but also in the original FDCCI. appropriate cloud services. for Business, OneDrive for Business, By the end of 2016, as part of the DCOI, Meanwhile, the government has set and SharePoint applications are available, all there will be a freeze on new data centers or out security requirements under the Federal meeting the rigorous security standards. significant data center expansions without Risk and Authorization Management Any commercial customer can now have direct approval of the Office of Management Program (FedRAMP). As commercial the assurance that these applications have

22 datacenterdynamics.com • July/August 2016, Issue 15 Colo + Cloud been tested to this level of scrutiny. Customers “get once, use many” type requirement. security requirements for FedRAMP looking for identity management and single Once certified, specific services including IaaS, certification mean that an individual sign-on capabilities might be surprised to SaaS or PaaS can be included in any other customer need not pay for the development learn that Microsoft’s Active Directory for government agency’s plans, without the need of equally strict security themselves during Azure is also available. Other well-known to be re-certified for a specific deployment. deployment, whether a commercial or commercial products, Even though the government customer. The certification such as Cisco’s WebEx, total number of vendors having already been achieved means that have also achieved currently listed on the security was baked in. appropriate accreditation. With its FedRAMP website is well Even vendors that under 100, the impact of Companies have found that there is don’t come to mind stamp of these requirements on enough interest from commercial customers when looking for cloud the data center industry to make it worth asking for permission to services have been able approval, the cannot be understated. brand versions of their products as FedRAMP- to take advantage of government Security has been one approved. This has led to the US government providing services to the of the biggest stumbling to release a 14-page document outlining how government. Dell was has granted a blocks in general business to brand properly (http://bit.ly/29d7Hia). one of the first to begin acceptance of cloud It’s clear that the government initiative offering dedicated IaaS level of cachet services. While various has had a significant effect on the data services and now delivers to cloud standards for different center business as a whole. By issuing Dell Cloud for the US security models have rigorous security standards it gives both government (DCG). providers long been in place, the technologists and operators a solid target for Dell offers multi- FedRAMP guidelines, their own efforts, meaning that customers, tenant IaaS to US along with FISMA and both governmental and commercial, government agencies FITARA and, to a lesser can now expect a much more secure at all levels – from local to federal, higher- extent, the DISA SRG, have received a lot of environment by default. education customers, federal contractors, and attention from commercial customers. It has also changed the way that data other government organizations that require Data center operators who hadn’t center operators do business, impacting on the low and moderate levels of accreditation. previously considered the impact of the way they approach offering cloud services government efforts for consolidation to their customers. And, more importantly, The process of meeting FedRAMP and security have begun to see that the with its stamp of approval, the government requirements is a costly one for a company, FedRAMP standards set an effective has, possibly unintentionally, placed a so it makes sense to get as much return on minimum level of security. And there certain level of cachet on cloud service that investment as possible. For pursuing is an additional financial benefit that providers, which has raised the profile of government contracts, compliance is a might attract customers. The stringent the entire industry.


Energy crisis averted

New research reveals that data centers are not the energy hogs we feared, Peter Judge discovers

Peter Judge Global Editor

@PeterJudgeDCD

Have you heard how data center energy use is rampaging out of control? We've heard that tale too. And it's not true.

A US report has found that even though demand for data center services is going through the roof, the energy they use is increasing very modestly – and it could be set to go down in the near future. Meanwhile in Europe, data center markets including Sweden, and the UK colocation sector, have produced solid data to show their energy use is under control too.

How can this be? It's true that demand for data center services is exploding, but it seems that increases in efficiency have kept pace, so growth in data center power demands has slowed. "Demand for computations and the amount of productivity performed by data centers continues to rise at substantial rates," says the US report, produced by the Lawrence Berkeley National Laboratory (LBNL), but it adds: "Technology advancements have made IT equipment more efficient by being able to perform more work on a given device, while other design and management efforts have made the industry more energy efficient."

Historically, it seems that data center energy use grew very fast at the start of this century, climbing by 90 percent between 2000 and 2005. In 2007, a federal report raised concerns that this would be unsustainable in the long term. In 2011, a report from Jon Koomey showed that energy use was still increasing, but found only a 26 percent rise between 2005 and 2010. Koomey said that data centers were becoming more efficient, but there was also a recession depressing demand, so many expected energy use to bounce back and surge higher.

The new report finds that data centers in the US consumed 70 billion kWh in 2014 – a figure only four percent higher than the usage in 2010. So energy use is growing by one percent per year, even during a period when data centers boomed as online services expanded quickly.

[Chart: Projected total US data center energy usage, 2000-2020, in billion kWh per year, under scenarios: 2010 Energy Efficiency, Current Trends, Improved Management, Hyperscale Shift, Improved Management + Hyperscale Shift, Best Practices, Best Practices + Hyperscale Shift. Source: Lawrence Berkeley National Laboratory]

As an illustration of how well these efficiency trends are delivering, the report's main graph includes a dotted line showing how much energy would be used if the country's facilities stayed at 2010 levels of efficiency. It shoots into the stratosphere, showing they would burn some 600 billion kWh more over the decade if they weren't adopting new ideas. But it's possible to go even beyond this figure, the report says, by more rigorous application of best practices and the use of management technology.

The report also proposes that, if the industry moves to the cloud aggressively – something it describes as the "hyperscale shift" – even more consolidation could be achieved, and US data center energy use could actually go down by 33 billion kWh per year by 2020: a saving of 45 percent compared with the position predicted by current efficiency trends.

This sort of result is instructive in showing the way to reduce data center energy use. It's also important in providing actual data that can head off any well-meaning but hasty moves that might be proposed in response to fears of data center energy use. For instance, regulators in many countries may be planning to regulate or modify how data centers operate in a bid to trim their energy use and enable them to curb emissions.

One such initiative in the UK actually provided solid data to show that things are moving in the right direction. The UK has a strong colocation sector – currently the largest in Europe – and the British government became concerned about its energy use. Wanting to ensure that it was playing its part to meet climate change targets, it looked for ways to cut data center energy use, but saw a danger that facilities might move abroad if hit by a punitive tax on energy consumption.

Instead, the government set up a Climate Change Agreement (CCA), whereby colocation data centers could win a reduction in energy tax if they filed detailed reports of energy use and showed a collective improvement in power usage effectiveness (PUE). "We have robust data on the UK colocation market because everyone who is anyone in that market participates in the CCA that requires detailed data on energy use," Emma Fryer of TechUK told a meeting in Brussels in June. The UK's colocation sector consumed 2.15TWh of electrical energy during 2015, with maybe another 0.25TWh burned by providers too small to register in the CCA.

The same is true of other countries, and it appears to apply to other parts of the IT universe. In Sweden, data centers are delivering massive increases in computing for minimal energy increases, Jens Malmodin of Ericsson told the same meeting, which was hosted by the Europewide industry body Digital Europe.

It's often suggested that end-user consumption of online entertainment is growing out of control, or that the network elements of the cloud are increasing to overtake the centralized data center functions. In fact, Malmodin argued that network equipment energy use is under control, and the power demands of media consumption are actually falling as people switch their consumption to smaller screens such as tablets and phones, and more energy-efficient devices such as flat panels.

Emerging markets may sometimes be expanding with older versions of technology, and this could contribute some increased energy usage, but in many cases they suffer from a more unreliable power network, giving them more incentive to save energy – and they can do this by jumping straight to more energy-efficient operations.

Overall, there's plenty of evidence that we don't need to panic about data center energy use.

Arman Shehabi will discuss the US data center energy use report at DCD Webscale in San Jose, California, on July 19-20.

Doing the math
How did the LBNL report calculate US data center energy usage? Here's a simplified run-through:
• Researchers used figures from IDC on how many servers, network ports and storage units are installed, corrected to account for the "unbranded" equipment used in webscale data centers, as well as equipment that had been retired
• They sorted the equipment types into 11 categories of data centers
• They calculated power use based on average utilization in each category and multiplied that by the PUE typical of that data center category
A simplified sketch of this arithmetic follows.
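Since PUE is total facility energy divided by IT energy, multiplying estimated IT energy by PUE approximates whole-facility consumption. The sketch below walks through that bottom-up arithmetic in Python. The category names, equipment counts, power draws and PUE values are invented for illustration; they are not the IDC shipment data or the LBNL report's own figures, and a fuller model would treat storage and network gear the same way.

```python
# Illustrative bottom-up estimate in the style of the "Doing the math" sidebar.
# All numbers are placeholders, not figures from IDC or the LBNL report.
HOURS_PER_YEAR = 8760

# category: (installed servers, average server power in watts at typical utilization, facility PUE)
categories = {
    "closet":          (2_000_000, 150, 2.0),
    "server_room":     (1_500_000, 200, 1.9),
    "enterprise_hall": (3_000_000, 250, 1.7),
    "hyperscale":      (4_000_000, 300, 1.2),
}

def annual_energy_billion_kwh(installed, avg_watts, pue):
    """IT energy scaled by PUE to cover cooling and power losses, in billion kWh per year."""
    it_kwh = installed * (avg_watts / 1000) * HOURS_PER_YEAR   # kW x hours = kWh
    return it_kwh * pue / 1e9

for name, vals in categories.items():
    print(f"{name:>16}: {annual_energy_billion_kwh(*vals):5.1f} billion kWh/y")

total = sum(annual_energy_billion_kwh(*vals) for vals in categories.values())
print(f"{'total':>16}: {total:5.1f} billion kWh/y")
```

Summing the per-category results gives a national total in the same way the report aggregates its 11 data center categories.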


10 INNOVATORS NORTH OF THE RACK

As software takes control of the upper layers of the data center, we profile the leaders of the revolution

AMIR CHAUDHRY – Unikernels
Amir Chaudhry is one of the founding fathers of unikernels – single-address space machine images that minimize the footprint of cloud services, ensuring fast startup times, lower computing bills and more responsive infrastructure. Chaudhry founded Unikernel Systems and now works at Docker, which acquired his company and set him to work on all things unikernel-related.

MARTIN CASADO – OpenFlow
In 2005 at Stanford, Martin Casado wrote a post-graduate thesis entitled The Virtual Network System, which kicked off the software-defined networking (SDN) revolution. The fundamental idea is to separate the control features of networking from the data, allowing functions to become portable. Casado went on to create one of the defining protocols of SDN, OpenFlow. In 2007 he co-founded the SDN-focused company Nicira Networks, which created proprietary versions of OpenFlow, Open vSwitch and OpenStack, before being sold to VMware in 2012 for $1.26bn.

DOUG CUTTING – Hadoop
An open source icon, Doug Cutting created search indexer Lucene and co-created web crawler Nutch before making the wildly successful Apache Hadoop. Conceived as a way to process large data sets using commodity hardware, Hadoop distributes both processing and data. It has become the world's leading platform for Big Data analytics and spawned a number of other tools. After working on Hadoop for years at Yahoo!, Cutting became chief architect at Cloudera, a company that provides Hadoop-based software, support and services, which now has more than 2,100 partners.

SCOTT GUTHRIE – Microsoft Azure
Co-inventor of ASP.NET, Scott Guthrie took over Microsoft Azure last year, investing heavily in maintaining the cloud system's strong market position in a competitive field. Azure now spans 32 'regions' worldwide, with each region composed of multiple data centers. Guthrie has worked hard to attract developers to his platforms and earlier this year wrote code live on stage to appeal to AzureCraft attendees.


BENJAMIN HINDMAN – Mesos
Benjamin Hindman co-created the open source cluster manager Apache Mesos while at university in 2009. It has gone on to be used by more than 50 organizations, including Twitter and Apple, and is seen as a main contender for handling and delivering software-defined data centers. Hindman, who worked at Twitter, went on to co-found Mesosphere, where he works on helping enterprise customers make the most of the Mesos manager he created.

ALEX POLVI – CoreOS
When Alex Polvi was 25 he sold his cloud-based server infrastructure monitoring company, Cloudkick, to Rackspace for an estimated $30m. Two years later he was back with something even bigger – an open source lightweight operating system called CoreOS. Developed by a company bearing the same name, CoreOS has quickly achieved success. With Polvi as CEO, it recently announced a partnership with Intel to containerize OpenStack, as well as releasing the prototype version of its new open source distributed storage system, Torus.

SAGE WEIL – Ceph
Founder and chief architect of distributed storage platform Ceph, Sage Weil has always had a huge influence on the open source scene. Weil worked for two years as founder and CTO at scale-out open source storage systems provider Inktank, which was then bought by Red Hat for $175m.

SOLOMON HYKES – Docker
Growing up as a coder and running servers as a teenager in France, Solomon Hykes co-founded platform-as-a-service firm dotCloud in 2010. While dotCloud found moderate success, its greatest claim to fame is being the birthplace of an internal open source project that Hykes started, which became Docker – a system for managing applications deployed inside software containers. In 2013, Docker was spun out into its own company, with Hykes as the CTO. Fortune quickly followed as Docker was adopted by many of the biggest companies. It has now gone mainstream and is set to be installed on every HPE server the company produces.

MIKE RHODIN – IBM Watson
Since its Jeopardy! win in 2011, Watson has remained one of the most exciting AI projects around. Led by Mike Rhodin, Watson has successfully transitioned from a research initiative into two businesses – Watson and Watson Health. In 2014, IBM announced it would invest $1bn in Watson, and in 2016 Watson Health completed its fourth acquisition with the $2.6bn purchase of Truven Health Analytics. Now head of Watson Business Development, Rhodin says he is "developing the next Watson Industry business units as we transform IBM into a cognitive solutions and cloud platform company."

MONTY WIDENIUS – MySQL, MariaDB
Michael 'Monty' Widenius is the main author of MySQL, the open source relational database management system he released in 1995. Sun Microsystems bought MySQL for $1bn in 2008, providing one of the first great legitimizations for open source in business. In 2010, Sun in its turn was bought by Oracle, and Widenius grew disenchanted with the direction of the product. He built his own fork of MySQL, named MariaDB, which he hopes will ultimately replace his first creation.

The road to software-defined

The software-defined data center movement ignores the critical environment, says Bruce Taylor, and that’s definitely not SDDC

Bruce Taylor EVP, North America

Software is becoming ever more sophisticated, adding performance analytics to policy rules-based management capabilities, and extending that into the realm of AI. The true autonomous, lights-out data center, where the facility becomes a commoditized utility, may not be that far off, and the global growth in data towards the zettabyte era will accelerate its arrival.

The role of hardware is both shriveling and dumbing down, with intelligence being increasingly software-led. But has the ability of software to deliver the true software-defined data center (SDDC) been overstated?

A few years back, cloud was a pipe dream. Cloud required network, storage and compute to be combined in a cohesive, integrated, software-managed infrastructure. That demanded a highly abstracted (virtualized), automated, policy-based system that combined workload management, agile infrastructure provisioning, failover, disaster recovery and security. And that package simply didn't exist.

The virtualization and cloud pioneers knew they could pool IT resources, but in the early days they still hadn't really given a thought to creating physical data center infrastructure, including the guts of the critical environment: power and cooling (thermal management).

Now the SDDC is in labor after a very long pregnancy, promising to deliver a software- and data-led unified toolbox that presents the data center as an abstracted private cloud, available to multiple customers, with hybrid cloud automation. But the critical environment is still not there.

SDDC is needed, because digital transformation is here. In 2016 the world will hit 1,000 exabytes (one zettabyte) of data traffic on the internet, says Cisco's Visual Networking Index, with internet data globally projected to grow at a CAGR of 26 percent to 2.3ZB by 2020.

The obvious drivers for this include the growth in mobile/wireless devices and applications; rich (including interactive) streaming media; the Internet of Things (IoT); big data and analytics; robotics; drones and autonomous vehicles; 3D printing; VR/AR; cognitive computing and much more. We believe zettabytes need "zettastructure" – open source, software-defined, data-driven, hyperscale and autonomous infrastructure: true SDDC.

A MarketsandMarkets study estimates that the global market for SDDC will bring in $25.6bn in revenue in 2016, growing at a CAGR of 26.6 percent to $83.2bn by 2021. So SDDC is a real market, growing really fast. But that only tells part of the story.

These figures pull together the components of the IT stack only: software-defined networking (SDN) and, we assume, its companion, network function virtualization (NFV); software-defined storage (SDS); and software-defined computing (SDC).


These pillars – SDN/NFV, SDS and SDC, combined as software-defined infrastructure (SDI) – make up the IT stack. The study also includes services, meaning consulting, integration and deployment. But the study has a big omission: it only counts the IT infrastructure stack, north of the rack, the logical side, and not the south, or physical MEP infrastructure side.

In our opinion, mechanical and electrical infrastructure (thermal and power management) systems must also become software-defined, where the software is data-driven, predictively analytical, policy-based and tightly integrated into IT-stack performance management (a minimal sketch of such a policy loop appears at the end of this article). The true jump to SDDC occurs only when various elements of software automation and data-driven analytical intelligence are brought to bear on the physical critical-environment infrastructure management.

Critical environment functions have been handled under the catch-all category of data center infrastructure management (DCIM) and, more recently, a higher order of function known as data center service optimization (DCSO), which seeks to integrate DCIM with IT service management (ITSM). However it is done, we need to see an end to the old silos.

For years in the industrial world, IT and operational technology (OT – the world of control engineering) have been treated as separate technical disciplines with disparate cultures. Now fresh thinking and the arrival of new technologies are giving IT the ability to automate and use OT data.

During the past decade we have learned the folly of treating the logical and physical sides of the critical-environment world as different countries (if not planets). When design and engineering professionals on both sides of the border speak two different languages, this creates threats to uptime, availability, resiliency and efficient IT performance. Those risks still exist in far too many enterprises.

In the early days of virtualization, the pace of change was hard for facilities engineers to keep up with, as server, storage and networking technology advanced with every server refresh. Power and cooling were static for the life of the facility – at least a decade. That is no longer true.

For now, the true SDDC may be limited to those organizations with deep pockets and monolithic applications – the vertically integrated hyperscalers and cloud services providers that can push out the boundaries of data center-as-a-service. But anyone requiring DevOps-style internet-facing agility at the application and workload level will increasingly want these characteristics from its in-house or outsourced data center-as-a-service provider. To meet the demands placed on them, data centers must become open-source, full-stack integrated, software-defined and autonomous, right down to the lowest level of their infrastructure.

None of the component infrastructure architectures is immune to all this. And technology advances won't stop. Silicon photonics is now ready for rapid market adoption. Blockchain, the tech behind Bitcoin and other cryptocurrencies, could find its way into all kinds of industries and applications such as FinTech, electronic voting, smart micro-contracts and provenance verification for high-value art and gems. DCD predicts blockchain will be the common methodology for managing cloud-based IT and workload capacity-exchange contracts.

Meanwhile, other technologies are being pumped up by venture capitalists, regardless of their actual viability. DNA-strand-based data storage doesn't yet exist outside of the R&D lab, but that "market" is already valued at $1bn. Quantum computing is another non-existent market that already has a five-year-out valuation of $5bn.

There are those who don't think we need to integrate and software-define the full stack. DCD disagrees.

In the future, to accommodate rapid growth in demand and shifts in the underlying platform technologies, we must move towards a world where cloud services and capacity will not require human touch, except where humans are unwilling to let control go to software. "Software is eating the world," Marc Andreessen famously opined in the ancient history of 2011. That is now coming true in ways he could not have predicted.
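To make "data-driven, predictively analytical, policy-based" slightly more concrete, here is a deliberately simple sketch of a policy loop that adjusts a cooling setpoint from telemetry. The telemetry fields, thresholds and setpoint limits are hypothetical, and the logic is far cruder than anything a real DCIM or DCSO product would ship; it only illustrates the shape of the idea.

```python
# Minimal, hypothetical policy loop tying IT-side telemetry to a facility setpoint.
from dataclasses import dataclass

@dataclass
class Telemetry:
    rack_inlet_temp_c: float    # hottest measured rack inlet temperature
    it_load_kw: float           # IT load reported by the software-defined stack
    cooling_setpoint_c: float   # current supply-air setpoint

def apply_policy(t: Telemetry) -> float:
    """Return a new cooling setpoint based on simple declarative rules."""
    new_setpoint = t.cooling_setpoint_c
    if t.rack_inlet_temp_c > 27.0:                       # above the ASHRAE recommended ceiling
        new_setpoint -= 1.0                              # cool harder
    elif t.rack_inlet_temp_c < 22.0 and t.it_load_kw < 400.0:
        new_setpoint += 0.5                              # relax cooling when there is thermal headroom
    return round(min(max(new_setpoint, 16.0), 26.0), 1)  # clamp to a safe band

sample = Telemetry(rack_inlet_temp_c=28.3, it_load_kw=520.0, cooling_setpoint_c=21.0)
print("new setpoint:", apply_policy(sample))             # -> 20.0
```

In practice such rules would be generated and tuned by predictive models and coordinated with workload placement in the IT stack, rather than hard-coded thresholds.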

DCD Community Highlights: from DCD Converged Shanghai, June 16

Exhibition floor

Haifeng Qu, Alibaba: "Thanks to DCD for its outstanding contribution to China's data center industry over the past seven years. We believe that DCD will bring more innovative ideas and drivers to the data center industry with its expansion into the field of large-scale infrastructure – from data center to cloud, and from IT infrastructure to digital business infrastructure in China."

Event networking
Panel sessions

Training
• Data Center Design Awareness: September 12-14, London – www.dc-professional.com/product/data-center-design-awareness
• Data Center Cooling Professional: September 19-21, London – www.dc-professional.com/product/data-center-cooling-professional
• Open Compute Project (OCP) Awareness Course: September 22-23, London – www.dc-professional.com/product/open-compute-project-ocp-awareness-course
• Critical Operations Professional: October 10-12, London – www.dc-professional.com/product/critical-operations-professional
Mention promo code 'Magazine' for a 5% discount on these training dates.

Research
• Manifest Destiny: Huawei Girds for an International Enterprise Data Center Push – www.datacenterdynamics.com/research/manifest-destiny-huawei-girds-for-an-international-enterprise-data-center-push/96273.article
• Google Data Center and Cloud Strategy Case Study – www.datacenterdynamics.com/research/google-data-center-and-cloud-strategy/96194.article
• Worldwide Colocation Data Center Investment – www.datacenterdynamics.com/research/worldwide-colocation-data-center-investment/96030.article
• Developing Solutions to the Data Center Skills Shortage – www.datacenterdynamics.com/research/developing-solutions-to-the-data-center-skills-shortage/95753.article
To order, visit www.datacenterdynamics.com/research

DCD Community Upcoming Events: DCD Converged

London | November 1-2, 2016

Singapore | September 14-15, 2016

Dallas | September 27, 2016

Mexico | September 27-28, 2016
Mumbai | October 20, 2016

CELEBRATING 10 YEARS

All dates are confirmed for this year's DCD Awards! All entries open now: www.datacenterdynamics.com/awards

Latin America | September 27 | Club Ragga, Mexico City
Asia Pacific | November 9 | Hong Kong Convention & Exhibition Center
Brazil | November 8 | São Paulo
EMEA | November 7 | Hilton on Park Lane, London (new venue)

Viewpoint

Preparing for the worst

The UK has just voted to leave the European Union, the British pound has lost more than 10 percent of its value, the Prime Minister has resigned and the political system over here is teetering on the brink of collapse. There's nothing but bad news on TV, and the future is uncertain. Meanwhile, on the other side of the Atlantic, it looks increasingly likely that Donald Trump will become the next American President. We have religious fanatics rampaging across the Middle East, Russians realizing their imperial ambitions, and research showing that some cancers might be infectious. So what's coming next – aliens, zombies, total nuclear annihilation? In times like these, it's important to revisit your doomsday survival plan, and here we finally have a reason to be optimistic: you will struggle to find a better place to wait out the apocalypse than a data center.

Just think about it: a typical server farm – let's talk retail colocation here – includes diesel generators, banks of batteries and thousands of liters of fuel – more than enough to keep a small group of engineers supplied with electricity for years to come. A reliable water source will help stave off dysentery, while the cooling equipment will filter out the radioactive ashes of civilization and keep the residents warm when the nuclear winter comes.

Data centers have countless physical security features, including fences, mantraps, biometric locks and wall-to-wall CCTV coverage. Although these were added to the building primarily to impress potential customers, they are more than capable of stopping hordes of looters, mutants or the living dead. Massive bonus points if your data center is located in an actual military bunker – you shall inherit the earth.

In this brave new world, office spaces previously reserved for hotdesking in the event of a disaster can be filled with bunk beds constructed from racks and cabinets. We'll use Ethernet cables instead of ropes, and forge rudimentary weapons out of heatsinks. Staff kitchens will still serve their primary purpose, and squirrels – those dreadful enemies of data centers, estimated to cause up to 17 percent of outages – will finally get on the menu.

There's just one issue we won't be able to fix: repopulating the planet. According to the Women's Engineering Society, just nine percent of the engineering workforce in the UK is female. As an industry, we should all do our best to recruit more girls – the future of the human race could depend on it.

• Max Smolaks - News Editor @MaxSmolaxDCD
