ANALYST REPORT

National Security Agency | Doing More with Less: Cooling Computers with Immersion Pays Off

Republished by GRC — 2018

Author: David Prucnal, PE

Original Publication: The Next Wave | Vol. 20 | No. 2 | 2013


David Prucnal, PE

A consequence of doing useful work with computers is the production of heat. Every watt of energy that goes into a computer is converted to a watt of heat that needs to be removed, or else the computer will melt, burst into flames, or meet some other undesirable end. Most computer systems in data centers are cooled with air, while some high-performance systems use contained liquid cooling systems where cooling fluid is typically piped into a cold plate or some other heat exchanger. Immersion cooling works by directly immersing IT equipment into a bath of cooling fluid. The National Security Agency’s Laboratory for Physical Sciences (LPS) acquired and installed an oil-immersion cooling system in 2012 and has evaluated its pros and cons. Cooling computer equipment by using oil immersion can substantially reduce cooling costs; in fact, this method has the potential to cut in half the construction costs of future data centers.

Network servers are submerged into a tank of mineral oil. (Photo used with permission from Green Revolution Cooling: grcooling.com.)


THE FUNDAMENTAL PROBLEM

Before getting into the details of immersion cooling, let’s talk about the production of heat by computers and the challenge of effectively moving that heat from a data center to the atmosphere or somewhere else where the heat can be reused.

In order for computers to do useful work, they require energy. The efficiency of the work that they do can be measured as the ratio of the number of operations that they perform to the amount of energy that they consume. There are quite a few metrics used to measure computer energy efficiency, but the most basic is operations per watt (OPS/W). Optimizing this metric has been the topic of many PhD theses and will continue to be the subject of future dissertations. Over the years, there has been progress against this metric, but that progress has slowed because much of the low-hanging fruit has been harvested and some of the key drivers, Moore’s Law and Dennard scaling, have approached the limits of their benefit. Improvements to the OPS/W metric can still be made, but they usually come at the expense of performance.

The problem is not unlike miles per gallon for cars. The internal combustion engine is well understood and has been optimized to the nth degree. For a given engine, car weight, and frontal area, the gas mileage is essentially fixed. The only way to improve the miles per gallon is to reduce the performance or exploit external benefits. In other words, drive slower, accelerate less, drift down hills, find a tailwind, etc. Even after doing all of these things, the improvement in gas mileage is only marginal. So it is, too, with computers. Processor clock frequencies and voltages can be reduced, sleep modes can be used, memory accesses and communications can be juggled to amortize their energy costs, but even with all of this, the improvement in OPS/W is limited.

A natural consequence of doing useful work with computers is the production of heat. Every watt of energy that goes into a computer is converted into a watt of heat that needs to be removed from the computer, or else it will melt, burst into flames, or meet some other undesirable end. Another metric, which until recently was less researched than OPS/W, is kilowatts per ton (kW/ton), which has nothing to do with the weight of the computer system that is using up the energy. Here, ton refers to an amount of air conditioning; hence, kW/ton has to do with the amount of energy used to expel the heat that the computer generates by consuming energy (see figure 1).

FIGURE 1. There are two halves to the computer power efficiency problem: efficiency of the actual computation (green sector) and efficiency of the cooling infrastructure (blue sector).

In fact, many traditional data centers consume as much energy expelling heat as they do performing useful computation. This is reflected in a common data center metric called power usage effectiveness (PUE), which in its simplest form is the ratio of the power coming into a data center to the power used to run the computers inside. A data center with a PUE of 2.0 uses as much power to support cooling, lighting, and miscellaneous loads as it does powering the computers. Of these other loads, cooling is by far the dominant component. So, another way to improve data center efficiency is to improve cooling efficiency. The best case scenario would be to achieve a PUE of 1.0. One way to achieve this would be to build a data center in a location where the environmental conditions allow for free cooling. Some commercial companies have taken this approach and built data centers in northern latitudes with walls that can be opened to let in outside air to cool the computers when the outside temperature and humidity are within allowable limits. However, for those of us who are tied to the mid-Atlantic region where summers are typically hot and humid, year-round free cooling is not a viable option. How can data centers in this type of environment improve their kW/ton and PUE?
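
To see how the two metrics relate, here is a minimal sketch in Python that computes PUE and kW/ton for a hypothetical facility; every number in it is an illustrative assumption, not a measurement from the LPS system.

    # Illustrative sketch: PUE and kW/ton for a hypothetical data center.
    # All numbers below are assumptions for demonstration only.

    WATTS_PER_TON = 3517          # 1 ton of refrigeration removes ~3,517 W of heat

    it_load_kw = 1000.0           # power used by the computers (technical load)
    cooling_kw = 600.0            # power used by chillers, CRAC fans, server fans
    other_kw = 100.0              # lighting and miscellaneous loads

    pue = (it_load_kw + cooling_kw + other_kw) / it_load_kw
    tons_of_heat = it_load_kw * 1000 / WATTS_PER_TON   # heat to be removed, in tons
    kw_per_ton = cooling_kw / tons_of_heat

    print(f"PUE    = {pue:.2f}")         # ~1.70 for these assumptions
    print(f"kW/ton = {kw_per_ton:.2f}")  # ~2.1 kW/ton for these assumptions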


HOW COMPUTERS ARE COOLED

There are many different ways that computers are kept cool in data centers today; however, the most common method is to circulate cool air through the chassis of the computer. Anyone who has ever turned on a computer knows that the computer makes noise when it is powered on. Central processing units (CPUs), memory, and any other solid-state components are completely silent, so what makes the noise? Spinning disk drives can make a little noise, but by far, the dominant noisemakers are the fans that are used to keep air moving across the solid-state devices that are all busily doing work and converting electrical power input into computation and heat. Even the power supply in a computer has a fan because the simple act of converting the incoming alternating current (ac) power to usable direct current (dc) power and stepping that power down to a voltage that is usable by the computer creates heat. All of the fans in a computer require power to run, and because they are not perfectly efficient, they too create a little heat when they run. The power used to run these fans is usually counted as computer load, so it ends up in the denominator of the PUE calculation, even though it does nothing toward actual computation.

But how do all of these fans actually cool the computer? Think of cooling as heat transfer. In other words, when an object is cooled, heat is transferred away from that object. What do people do when they burn their finger? They blow on it, and if they are near a sink, they run cold water on it. In both cases they are actually transferring heat away from their burnt finger. By blowing, they are using air to push heat away from their finger, and by running water, they are immersing their hot finger in a cool fluid that is absorbing and carrying the heat away. Anyone who has burned a finger knows that cold water brings much more relief than hot breath. But why? The answer depends on principles like thermal conductivity and heat capacity of fluids. It also helps to understand how heat moves.

HEAT ON THE GO: RADIATION, CONDUCTION, CONVECTION, AND ADVECTION

Imagine a campfire on a cool evening. The heat from the fire can be used to keep warm and to roast marshmallows, but how does the heat move from the fire? There are three modes of heat transfer at work around a campfire: radiation, conduction, and convection (see figure 2). As you sit around the fire, the heat that moves out laterally is primarily radiant heat. Now, assume you have a metal poker for stirring the coals and moving logs on the fire. If you hold the poker in the fire too long it will start to get hot in your hand. This is because the metal is conducting heat from the fire to your hand. To a much lesser extent the air around the fire is also conducting heat from the fire to you. If you place your hands over the fire, you will feel very warm air rising up from the fire. This heat transfer, which results from the heated air rising, is convection. Now, if an external source, such as a breeze, blows across the fire in your direction, in addition to getting smoke in your eyes, you will feel heat in the air blowing towards you. This is advection.

FIGURE 2. There are three modes of heat transfer at work around a campfire: radiation, conduction, and convection.

In a computer, a CPU creates heat that is typically conducted through a heat spreader and then into the surrounding air. Convection causes the air to rise from the heat spreader, where it is then blown, or advected, away by the computer’s cooling fan. Now that we know how heat moves, why is it that it feels so much better to dunk a burnt finger in water than to blow on it? This is where thermal conductivity and heat capacity of the cooling fluid come into play. First, a few definitions:

- Thermal conductivity is the ability of a material to conduct heat; it is measured in watts per meter degree Celsius, or W/(m·°C).

- Heat capacity is the amount of heat required to change a substance’s temperature by a given amount or the amount of heat that a substance can absorb for a given temperature increase; it is measured in joules per degree Celsius (J/°C).

- Specific heat capacity is the heat capacity per unit mass or volume; it is typically given per unit mass and simply called specific heat (Cp); it is measured in joules per gram degree Celsius, or J/(g·°C).
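
To make the last two definitions concrete, the short sketch below converts specific heat to volumetric heat capacity (Cv = Cp x density); the room-temperature densities are assumptions chosen to be consistent with the values in table 1.

    # Volumetric heat capacity Cv = specific heat Cp x density.
    # Densities are approximate room-temperature values (assumptions).

    substances = {
        # name:        (Cp in J/(g*degC), density in g/cm^3)
        "air":         (1.0,  0.0013),
        "water":       (4.20, 1.00),
        "mineral oil": (1.67, 0.80),
    }

    for name, (cp, rho) in substances.items():
        cv = cp * rho  # J/(cm^3*degC): heat to warm 1 cm^3 of the substance by 1 degC
        print(f"{name:12s} Cp = {cp:5.2f} J/(g*C)   Cv = {cv:8.4f} J/(cm^3*C)")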



The answer to why it feels so much better to dunk a burnt finger into water than to blow on it can be found in table 1. First, water is a much better conductor of heat than air, by a factor of 24. Think of it as having 24 times more bandwidth for moving heat. Second, water can hold far more heat than air. In fact, 3,200 times more. So, water provides 24 times more heat transfer bandwidth and 3,200 times more heat storage than air. No wonder the finger feels so much better in the water.

One more thing to consider about heat transfer; heat naturally flows from hot to cold, and the rate of heat transfer is proportional to the temperature difference. This is why the colder the water, the better that burnt finger is going to feel.

COOLING COMPUTERS

By now it should be apparent that the fans in a computer are there to advect (i.e., move) a cooling fluid (e.g., air) across the heat producing parts (e.g., CPUs, memories, and peripheral component interconnect cards) so that the cooling fluid can absorb heat through conduction and carry it away. This can be described by the following mass flow heat transfer equation:

Q = m · cp · ΔT

In this equation, Q is the rate of heat transfer in watts, m is the mass flow rate of the cooling fluid in grams per second, cp is the specific heat of the cooling fluid, and ΔT is the change in temperature of the cooling fluid. What it says is that the cooling depends on the amount of coolant flowing over the heat source, the ability of the coolant to hold heat, and the temperature rise in the coolant as it flows across the heat source.

How much air does it take to keep a computer cool? There is a rule of thumb used in the data center design world that 400 cubic feet per minute (CFM) of air is required to provide 1 ton of refrigeration. One ton of refrigeration is defined as 12,000 British thermal units per hour (Btu/h). Given that 1 kilowatt-hour is equivalent to 3,412 British thermal units, it can be seen that a ton of refrigeration will cool a load of 3,517 W, or approximately 3.5 kW. The mass flow heat transfer equation can be used to confirm the rule of thumb. Air is supplied from a computer room air conditioning (CRAC) unit in a typical data center at about 18°C (64°F). Now, 400 CFM of air at 18°C is equivalent to 228 grams per second, and the specific heat of air is equivalent to 1 J/(g·°C). Solving the mass flow heat transfer equation above with this information yields a change in temperature of 15°C. What all this confirms (in Fahrenheit) is that when 64°F cooling air is supplied at a rate of 400 CFM per 3.5 kW of computer load, the exhaust air from the computers is 91°F. Anyone who has stood in the “hot aisle” directly behind a rack of servers will know that this rule of thumb is confirmed.

It is not unusual for a server rack to consume over 10 kW. Using the rule of thumb above, a 10.5 kW server rack requires 1,200 cubic feet of cooling air—enough air to fill a 150 square foot office space with an 8 foot ceiling—per minute. That’s a whole lot of air! Simply moving all of that air requires a significant amount of energy. In fact, for racks of typical one-unit (1U) servers, the energy required to move cooling air from the CRAC units and through the servers is on the order of 15% of the total energy consumed by the computers. Remember—this is just the energy to move the cooling air, it does not include the energy required to make the cold air. If there was a way to cool computers without moving exorbitant quantities of air, it could reduce energy consumption by up to 15%.
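
The same arithmetic can be written out as a small sketch; the air density used for the CFM-to-grams conversion is an assumed value for 18°C air.

    # Verify the 400 CFM per ton rule of thumb using Q = m * cp * dT.

    CFM_TO_M3_PER_S = 0.000471947   # 1 cubic foot per minute in m^3/s
    AIR_DENSITY = 1.21              # grams per liter for air at ~18 degC (assumption)
    CP_AIR = 1.0                    # specific heat of air, J/(g*degC)
    WATTS_PER_TON = 3517            # 1 ton of refrigeration = 12,000 Btu/h ~= 3,517 W

    cfm = 400.0
    mass_flow = cfm * CFM_TO_M3_PER_S * 1000 * AIR_DENSITY   # grams of air per second
    delta_t_c = WATTS_PER_TON / (mass_flow * CP_AIR)          # temperature rise, degC

    supply_f = 64.0
    exhaust_f = supply_f + delta_t_c * 9 / 5

    print(f"mass flow   : {mass_flow:.0f} g/s")     # ~228 g/s
    print(f"temp rise   : {delta_t_c:.1f} degC")    # ~15 degC
    print(f"exhaust air : {exhaust_f:.0f} degF")    # ~92 degF (the article quotes 91 degF)

    # Airflow needed for a 10.5 kW rack under the same rule of thumb:
    rack_kw = 10.5
    rack_cfm = rack_kw * 1000 / WATTS_PER_TON * 400
    print(f"10.5 kW rack: {rack_cfm:.0f} CFM")      # ~1,200 CFM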

TABLE 1. Thermal conductivity and heat capacity of common substances

Substance       Thermal Conductivity,     Specific Heat (Cp),    Volumetric Heat Capacity (Cv),
                W/(m·°C) at 25°C          J/(g·°C)               J/(cm³·°C)
Air             0.024                     1.0                    0.001297
Water           0.58                      4.20                   4.20
Mineral Oil     0.138                     1.67                   1.34
Aluminum        205                       0.91                   2.42
Copper          401                       0.39                   3.45

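
The two ratios quoted in the text for water versus air fall straight out of the table 1 values; a quick check:

    # Check the water-vs-air claims against the table 1 values.
    k_air, k_water = 0.024, 0.58        # thermal conductivity, W/(m*degC)
    cv_air, cv_water = 0.001297, 4.20   # volumetric heat capacity, J/(cm^3*degC)

    print(f"conductivity ratio  : {k_water / k_air:.0f}x")    # ~24x more heat-transfer bandwidth
    print(f"heat capacity ratio : {cv_water / cv_air:.0f}x")  # ~3,200x more heat storage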


This may not seem like much, but consider that a 15% improvement in OPS/W is almost unheard of, and for a moderate 10 megawatt (MW) data center, a 15% reduction in energy consumption translates into a savings of $1.5 million per year.

PUMPING OIL VERSUS BLOWING AIR

Unfortunately, we cannot dunk a computer in water like a burnt finger since electricity and water do not play well together. Mineral oil, on the other hand, has been used by electric utilities to cool electrical power distribution equipment, such as transformers and circuit breakers, for over 100 years. Mineral oil only has about 40% of the heat holding capacity and about one quarter the thermal conductivity of water, but it has one huge advantage over water—it is an electrical insulator. This means that electrical devices can operate while submerged in oil without shorting out.

While mineral oil does not have the heat capacity of water, it still holds over 1,000 times more heat than air. This means that the server rack discussed earlier that needed 1,200 CFM of air to keep from burning up could be kept cool with just about 1 CFM of oil. The energy required to pump 1 CFM of oil is dramatically less than the energy required to blow 1,200 CFM of air. In a perfectly designed data center, where the amount of air blown or oil pumped is matched exactly to the heat load, the energy required to blow air is five times that required to pump oil for the same amount of heat removed. In reality, the amount of air moved through a data center is far more than that required to satisfy the load. This is due to the fact that not all of the air blown into a data center passes through a computer before it returns to the CRAC unit. Since the air is not ducted directly to the computers’ air intakes, it is free to find its own path back to the CRAC unit, which is frequently over, around, or otherwise not through a server rack. As we will soon see, it is much easier to direct the path of oil and to pump just the right amount of oil to satisfy a given computer heat load. Thus, the energy required to circulate oil can be more than 10 times less than the energy required to circulate air.
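
As a rough check of that comparison, the sketch below scales the 1,200 CFM airflow figure by the ratio of volumetric heat capacities from table 1, assuming the same temperature rise in both fluids.

    # Compare the volumetric flow needed to carry away the heat of a ~10.5 kW rack
    # with air versus mineral oil, assuming the same temperature rise in each fluid.
    # Volumetric heat capacities are taken from table 1.

    cv_air = 0.001297   # J/(cm^3*degC)
    cv_oil = 1.34       # J/(cm^3*degC)

    air_cfm = 1200.0    # airflow for a 10.5 kW rack from the 400 CFM/ton rule of thumb

    # For a fixed heat load and temperature rise, required flow scales as 1/Cv.
    oil_cfm = air_cfm * (cv_air / cv_oil)

    print(f"oil holds {cv_oil / cv_air:.0f}x more heat per unit volume than air")  # ~1,000x
    print(f"oil flow needed: {oil_cfm:.1f} CFM")                                   # ~1.2 CFM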
The CTEs for the silicon, metal, solder, plastic, ducted directly to the computers’ air intakes, it is free and fiberglass used in a circuit board are all different, to find its own path back to the CRAC unit, which which means that these materials expand and contract is frequently over, around, or otherwise not through at different rates in response to temperature changes. a server rack. As we will soon see, it is much easier In an environment where the temperature is changing to direct the path of oil and to pump just the right frequently due to load changes, this difference in CTEs amount of oil to satisfy a given computer heat load. can eventually lead to mechanical failures on thecircuit Thus, the energy required to circulate oil can be more board. Oil immersion reduces this problem bycreating than 10 times less than the energy required to circulate a temperature-stable environment. air. The second side benefit is server cleanliness. Aircooled IMMERSION COOLING SYSTEM servers are essentially data center air cleaners. While data centers are relatively clean environments, there is Now that we have established that mineral oil would still some dust and dirt present. Remember, a typical be a far more efficient fluid to use for removing heat server rack is drawing in a large office space full of from computers, let’s look at how a system could be air every minute. Any dust or dirt in that air tends to built to take advantage of this fact. accumulate in the chassis of the servers. Pumping oil versus blowing air tends to accumulate in the chassis Imagine a rack of servers. Now imagine that the rack of the servers.

FIGURE 4. Network servers are submerged into a tank of mineral oil and hooked up to a pump that circulates the oil. (Photo used with permission from Green Revolution Cooling: grcooling.com.)

FIGURE 3. The immersion cooling system at the Laboratory for Physical Sciences, like the one pictured above, uses mineral oil to cool IT equipment. (Photo used with permission from Green Revolution Cooling: grcooling.com.)

The final side benefit of immersion cooling is silence. Immersion cooling systems make virtually no noise. This is not an insignificant benefit, as many modern air-cooled data centers operate near or above the Occupational Safety and Health Administration’s allowable limits for hearing protection.

In addition to efficient use of cooling fluid and the side benefits mentioned above, there is another advantage to immersion cooling—server density. As mentioned earlier, a typical air-cooled server rack consumes about 10 kW. In some carefully engineered HPC racks, 15–20 kW of load can be cooled with air. In comparison, the standard off-the-shelf immersion cooling system shown in figure 3 is rated to hold 30 kW of server load with no special engineering or operating considerations.

DOING MORE WITH LESS

Let’s take a look at how immersion cooling can enable more computation using less energy and infrastructure.

AIR COOLING INFRASTRUCTURE

Cooling air is typically supplied in a computer room with CRAC units. CRAC units sit on the computer room raised floor and blow cold air into the under-floor plenum. This cold air then enters the computer room through perforated floor tiles that are placed in front of racks of computers. Warm exhaust air from the computers then travels back to the top of the CRAC units where it is drawn in, cooled, and blown back under the floor. In order to cool the air, CRAC units typically use a chilled-water coil, which means that the computer room needs a source of chilled water. The chilled water (usually 45–55°F) is supplied by the data center chiller plant. Finally, the computer room heat is exhausted to the atmosphere outside usually via evaporative cooling towers.

Oil-immersion systems also need to expel heat, and one way is through the use of an oil-to-water heat exchanger; this means that oil-immersion systems, like CRAC units, need a source of cooling water. The big difference however is that CRAC units need 45–55°F water; whereas, oil-immersion systems can operate with cooling water as warm as 85°F. Cooling towers alone, even in August in the mid-Atlantic area, can supply 85°F water without using power-hungry chillers. Because oil-immersion systems can function with warm cooling water, they can take advantage of various passive heat sinks, including radiators, geothermal wells, or nearby bodies of water.


The takeaway here is that there is a significant amount of expensive, energy-hungry infrastructure required to make and distribute cold air to keep computers in a data center cool. Much of this infrastructure is not required for immersion cooling.

Fan power

One of the primary benefits of immersion cooling is the removal of cooling fans from the data center. Not only are the energy savings that result from the removal of cooling fans significant, they are compounded by potentially removing the necessity for CRAC units and chillers.

Cooling fans in a typical 1U rack-mounted server consume roughly 10% of the power used by the server. Servers that are cooled in an oil-immersion system do not require cooling fans. This fact alone means that immersion cooling requires approximately 10% less energy than air cooling. Internal server fans, however, are not the only fans required for air-cooled computers. CRAC unit fans are also necessary in order to distribute cold air throughout the data center and present it to the inlet side of the server racks.

This CRAC unit fan power must be considered when determining the actual fan-power savings that can be realized by immersion cooling systems. Table 2 compares the power required to move 1 W of exhaust heat into a data center’s chilled water loop for fan-blown air cooling versus pump-driven oil-immersion cooling. The third column shows this power as a percentage of IT technical load. It shows that the power required to run all fans in an air-cooled system is equal to 13% of the technical load that is being cooled. This is contrasted with the power required to run pumps in an oil-immersion cooling system, which is equal to 2.5% of the technical load that is being cooled. The difference, 10.5%, represents the net fan-power savings achieved by switching from an air-cooled to immersion-cooled data center. This analysis assumes that in both the air-cooled and immersion-cooled cases, the cooling infrastructure is matched exactly to the load. The last column in table 2 uses a similar analysis but assumes that the cooling infrastructure capacity is provisioned at twice the load. It shows that overprovisioned fan power grows faster than overprovisioned pump power. This is further illustrated in figure 5.

Lower operating expenses

Table 3 compares the fan power versus pump power required to serve a 1 MW technical load, assuming the cooling infrastructure is sized to serve 150% of the load. It shows that the fan power to circulate cold air exceeds the pump power to circulate oil by 158 kW per megawatt of technical load. At one million dollars per megawatt-year, this equates to $158,000 a year in additional cooling energy operating expense. This represents the savings due solely to circulating cooling fluid. When the cost of making cold air is considered, however, the energy savings of immersion cooling becomes much more significant.

Table 4 summarizes the energy required for air cooling that is not needed for immersion cooling. The values in Table 4 are typical for reasonably efficient data centers. One ton of refrigeration will cool approximately 3,500 W of technical load; therefore, 1 MW of technical load requires a minimum of 285 tons of refrigeration. At 2.1 kW/ton, the air-cooled data center cooling infrastructure consumes about 600 kW to cool 1 MW worth of technical load. This equates to $600,000 per year per megawatt of technical load. Almost all of this cooling energy cost can be eliminated by immersion cooling since chillers, CRAC units, and server fans are not required.
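
The arithmetic behind those two dollar figures can be reproduced in a few lines; the $1 million per megawatt-year energy rate is the one quoted above, and the 2.1 kW/ton figure comes from table 4.

    # Operating-expense comparison for 1 MW of technical load, per tables 3 and 4.

    tech_load_mw = 1.0
    dollars_per_mw_year = 1_000_000          # energy cost rate quoted in the text

    # Circulation only, with cooling capacity sized at 150% of the load (table 3):
    fan_fraction, pump_fraction = 0.195, 0.0375
    delta_kw = (fan_fraction - pump_fraction) * tech_load_mw * 1000
    print(f"fan minus pump power : {delta_kw:.1f} kW")                              # ~158 kW
    print(f"circulation savings  : ${delta_kw / 1000 * dollars_per_mw_year:,.0f} per year")

    # Total air-cooling energy, using the 2.1 kW/ton total from table 4:
    kw_per_ton = 2.1                         # chillers + CRAC units + server fans
    tons_needed = tech_load_mw * 1_000_000 / 3517
    cooling_kw = kw_per_ton * tons_needed
    print(f"total cooling power  : {cooling_kw:.0f} kW")                            # ~600 kW
    print(f"total cooling cost   : ${cooling_kw / 1000 * dollars_per_mw_year:,.0f} per year")  # ~$600,000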

TABLE 2. Power usage for air-cooled versus immersion-cooled data centers

Method of Cooling                Power Required to Move 1 W of Exhaust    Percentage of Technical Load      Percentage of Technical Load
                                 Heat into Chilled Water Loop (W)         to Power Fan or Pump (at 100%)    to Power Fan or Pump (at 200%)
Fan-Powered Air                  0.13 W                                   13%                               26%
Pump-Powered Oil Immersion       0.025 W                                  2.5%                              5%
Net savings due to fan removal                                            10.5%                             21%




TABLE 4. Power usage of cooling equipment in air-cooled data centers

Cooling Equipment    Power Usage (kW/ton)
Chillers             0.7 kW/ton
CRAC Units           1.1 kW/ton
Server Fans          0.2 kW/ton
Total                2.1 kW/ton

FIGURE 5. The power required to run the fans in an air-cooled data center (purple line) accounts for about 13% of the center’s technical load (26% if run at twice the technical load); whereas, the power required to run the pumps in an immersion-cooled data center (green line) accounts for about 2.5% of the center’s technical load (5% if run at twice the technical load). As is illustrated, overprovisioned fan power grows faster than overprovisioned pump power.

Cooling infrastructure accounts for a major portion of data center construction costs. In high reliability/availability data centers, it is not uncommon for the cooling infrastructure to account for half of the overall construction cost. According to the American Power Conversion Data Center Capital Cost Calculator, cooling infrastructure accounts for at least 43% of data center construction cost.

Lower capital expenses

Immersion cooling requires far less infrastructure than air cooling; therefore, building data centers dedicated to immersion cooling is substantially less expensive.

For large data centers, where the technical load is in the neighborhood of 60 MW, construction costs can approach one billion dollars. This means that about 500 million dollars is being spent on cooling infrastructure per data center. Since immersion-cooled systems do not require chillers, CRAC units, raised flooring, and temperature and humidity controls, etc., they offer a substantial reduction in capital expenditures over air-cooled systems.

IMMERSION COOLING FAQS

Several recurring questions have emerged over the many tours and demonstrations of the LPS immersion cooling system. Here are answers to these frequently asked questions.

Q. What server modifications are required for immersion?

There are three modifications that are typically required including:

1. Removing the cooling fans. Since some power supplies will shut down upon loss of cooling, a small emulator is installed to trick the power supply into thinking the fans are still there.

2. Sealing the hard drives. This step is not required for solid-state drives or for newer sealed helium-filled drives.

3. Replacing the thermal interface paste between chips and heat spreaders with indium foil. Some server vendors are already looking at providing immersion-ready servers which will be shipped with these modifications already made.

TABLE 3. Power usage for air-cooled versus immersion-cooled data centers with 1 MW of technical load

Method of Cooling             Fan or Pump Power as a Percentage of      Total Power
                              Technical Load (at 150% capacity)         (at 150% capacity)
Fan-Powered Air               19.5%                                     1.195 MW
Pump-Powered Oil Immersion    3.75%                                     1.0375 MW
Delta                                                                   158 kW


Q. Are there hazards associated with the oil? (e.g., fire, health, spillage)

With regard to flammability, the mineral oil is a Class IIIB liquid with a flammability rating of 1 on a scale of 4. Accordingly, immersion cooling does not require any supplemental fire suppression systems beyond what is normally used in a data center. The health effects are negligible. The oil is essentially the same as baby oil.

Spills and leaks are considered a low probability; however, for large installations, some form of spill containment is recommended. Spill decks, berms, curbs, or some other form of perimeter containment is sufficient.

Q. How much does the system weigh?

A 42U tank fully loaded with servers and oil weighs about 3,300 pounds, of which the oil accounts for about 1,700 pounds. This weight is spread over a footprint of approximately 13 square feet for a floor loading of approximately 250 pounds per square foot. A comparably loaded air-cooled server rack weighs about 1,600 pounds with a footprint of 6 square feet, which also translates to a floor loading of about 250 pounds per square foot.

Q. How is the equipment serviced or repaired?

Basic services such as device and board-level replacements are not significantly different than for air-cooled equipment. Hot-swaps can be done in the oil. For services requiring internal access, the server can be lifted out of the tank and placed on drainage rails above the surface of the oil. After the oil drains, component replacement is carried out the same way as for air-cooled servers.

For rework at the circuit board level that requires removal of the oil, there are simple methods available to ultrasonically remove oil from circuit boards and components.

Q. Are there other types of immersion-cooling systems besides oil immersion?

Yes. What this article has covered is called single-phase immersion. That is, the oil remains in the liquid phase throughout the cooling cycle. There are some people looking into two-phase immersion-cooling systems. In a two-phase cooling process, the cooling liquid is boiled off. The resulting vapor is captured and condensed before being recirculated. The phase change from liquid to gas allows for higher heat removal but adds to the complexity of the system. Also, the liquid used in two-phase systems is extremely expensive compared to mineral oil. At this time, there are no two-phase immersion-cooling systems commercially available.

CONCLUSION

Computers consume energy and produce computation and heat. In many data centers, the energy required to remove the heat produced by the computers can be nearly the same as the energy consumed performing useful computation. Energy efficiency in the data center can therefore be improved either by making computation more energy efficient or by making heat removal more efficient.

Immersion cooling is one way to dramatically improve the energy efficiency of the heat removal process. The operating energy required for immersion cooling can be over 15% less than that of air cooling. Immersion cooling can eliminate the need for infrastructure that can account for half of the construction cost of a data center. In addition, immersion cooling can reduce server failures and is cleaner and quieter than air cooling.

Immersion cooling can enable more computation using less energy and infrastructure, and in these times of fiscal uncertainty, the path to success is all about finding ways to do more with less.

ABOUT THE AUTHOR

David Prucnal has been active as a Professional Engineer in the field of power engineering for over 25 years. Prior to joining NSA, he was involved with designing, building, and optimizing high-reliability data centers. He joined the Agency as a power systems engineer 10 years ago and was one of the first to recognize the power, space, and cooling problem in high-performance computing (HPC). He moved from facilities engineering to research to pursue solutions to the HPC power problem from the demand side versus the infrastructure supply side. Prucnal leads the energy efficiency thrust within the Agency’s Advanced Computing Research team at the Laboratory for Physical Sciences. His current work includes power-aware data center operation and immersion cooling. He also oversees projects investigating single/few electron transistors, three-dimensional chip packaging, low-power electrical and optical interconnects, and power efficiency through enhanced data locality.

To Learn More About GRC and Our Liquid Immersion Cooling for Data Centers

Call 512.692.8003 or Visit grcooling.com