
One of the most advanced astrophysical research facilities in the world is located in Kaleden, BC. The Canadian Hydrogen Intensity Mapping Experiment (CHIME) radio telescope, which studies the expansion of the universe, maps the largest volume of space ever surveyed. It looks like several snowboard half-pipes and covers an area the size of five hockey rinks. Running at seven quadrillion computer operations per second, it handles a volume of data equivalent to the flow through all of the world's mobile networks.

CHIME opened in 2017 after five years of development. When work began in 2012 there was no computer system available that could process a terabyte of information per second, but progress in video-game hardware provided an answer: 1,024 high-speed processors working simultaneously. The next challenge was keeping the servers cool, a problem data centres everywhere share as demand explodes.

As we all keep streaming Netflix, sharing on Facebook and buying from eBay or Amazon, and as cloud computing, big data, artificial intelligence and exascale supercomputers for scientific research keep growing, global data traffic could reach 15.3 zettabytes per year by 2020, up from 4.7 zettabytes just four years ago, in 2015. Consumer use accounts for about 30%; organizations use the rest. Most of that traffic flows through data centres, which in the US consume an estimated 70 billion kilowatt-hours of electricity per year. About half of that power goes to cooling the servers.
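A quick back-of-the-envelope calculation shows what those figures imply; the electricity price below is an assumed illustrative value, not something cited in the article.

# Rough arithmetic based on the figures above. The electricity price is an
# assumed illustrative value, not a figure from the article.
us_dc_energy_kwh = 70e9     # estimated annual US data-centre electricity use (kWh)
cooling_share = 0.5         # roughly half of that power goes to cooling
price_per_kwh = 0.10        # assumed average electricity price, USD/kWh

cooling_kwh = us_dc_energy_kwh * cooling_share
print(f"Cooling energy: {cooling_kwh / 1e9:.0f} billion kWh per year")
print(f"Cooling cost at ${price_per_kwh:.2f}/kWh: "
      f"${cooling_kwh * price_per_kwh / 1e9:.1f} billion per year")
# Prints 35 billion kWh and about $3.5 billion per year.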

Liquid cooling for data centres has been on the horizon at annual IT conferences for a decade or two but has made slow progress, likely because the idea of mixing liquids with electronics seems counterintuitive. The approach is now beginning to gain traction in North America and elsewhere, and for good reason: liquid is far more efficient than air at carrying heat away.

At CHIME the high-performance computing nodes sit directly adjacent to the telescope in sealed shipping containers designed to prevent electromagnetic interference from the servers leaking out. Cooling liquid is brought in through tubes and a series of manifolds and disconnects, directly to the hottest components: the central processing units (CPUs) and graphics processing units (GPUs). The tubes deliver the liquid to a thin copper plate that sits in contact with the hot spots and acts as a heat exchanger. Heated liquid then circulates out of the facility for external cooling.

The system was designed by Geoff Lyon, CEO and Chief Technology Officer at CoolIT Systems Inc. He explains that a normal server rack holds 80 CPUs, but with an advanced liquid cooling solution this can be increased to 160-200. The cooling liquid can be as warm as 40 degrees Celsius because the CPUs run happily at 80 degrees. The CHIME system saves 70% on energy compared with air-conditioning an entire server room.
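To see why 40-degree liquid can still carry away a rack's heat, here is a minimal sketch of the basic heat-balance relation, Q = m_dot x cp x delta-T, used to size such loops. The rack load, coolant properties and return temperature are assumed illustrative values; only the 40 C supply temperature comes from the article.

# Minimal heat-balance sketch for a direct-to-chip liquid cooling loop.
# Only the 40 C supply temperature is from the article; the rest are assumptions.
rack_heat_kw = 30.0      # assumed heat load of a densely packed rack (kW)
cp_kj_per_kg_k = 3.8     # approximate specific heat of a water-glycol mix (kJ/kg*K)
supply_temp_c = 40.0     # coolant supply temperature cited in the article
return_temp_c = 50.0     # assumed return temperature leaving the cold plates

delta_t = return_temp_c - supply_temp_c
# Mass flow needed so the coolant absorbs the full rack load: Q = m_dot * cp * dT
m_dot_kg_s = rack_heat_kw / (cp_kj_per_kg_k * delta_t)
print(f"Required coolant flow: {m_dot_kg_s:.2f} kg/s "
      f"(roughly {m_dot_kg_s * 60:.0f} L/min at a water-like density)")

Because the CPUs tolerate 80 degrees, even a modest flow of warm coolant keeps them within their limits, which is what lets the facility skip compressor-based air conditioning.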

SUBMERSED IN OIL

A similar approach, sealing the liquid inside copper heat exchangers and tubes, is taken by other data centre innovators, such as Cloud & Heat Technologies GmbH in Dresden, Germany. A bolder liquid cooling solution submerses the servers themselves in a dielectric, non-conductive synthetic mineral oil blend. Brandon Moore of Green Revolution Cooling (GRC) in Austin, Texas, says the oil offers about 1,200 times the heat-absorbing capacity of air.
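That figure is roughly consistent with a comparison of volumetric heat capacities. The sketch below uses approximate textbook property values for mineral oil and air, not GRC's own numbers.

# Rough comparison of volumetric heat capacity (density x specific heat).
# Property values are approximate textbook figures, not GRC data.
oil_density = 870.0      # kg/m^3, typical synthetic mineral oil
oil_cp = 1.9             # kJ/(kg*K)
air_density = 1.2        # kg/m^3, air at room temperature
air_cp = 1.005           # kJ/(kg*K)

oil_vol_cap = oil_density * oil_cp   # kJ/(m^3*K)
air_vol_cap = air_density * air_cp
print(f"Oil: {oil_vol_cap:.0f} kJ/(m^3*K), air: {air_vol_cap:.1f} kJ/(m^3*K)")
print(f"Ratio: roughly {oil_vol_cap / air_vol_cap:.0f} to 1")
# Prints a ratio on the order of 1,400 to 1, the same order as the quoted 1,200.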


He says that at one research university in Texas there was a 90% reduction in energy use. The PUE (power usage effectiveness) was reported at 1.03 to 1.05, compared with about 1.7 for an average air-cooled data centre. The project also reduced the power load for computing by 20% and cut construction costs by 60%. Although heavy, the server tanks are relatively small, and there are no fans, water loops or heat exchangers.
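PUE is total facility energy divided by the energy delivered to the IT equipment, so those numbers translate directly into overhead power. The 1 MW IT load in the sketch below is an assumed illustrative figure, not one from the project.

# PUE = total facility energy / IT equipment energy.
# The 1 MW IT load is an assumed illustrative figure, not from the project.
it_load_kw = 1000.0

def overhead_kw(pue, it_kw):
    """Power spent on cooling, distribution and other overhead beyond the IT load."""
    return it_kw * (pue - 1.0)

for pue in (1.7, 1.05, 1.03):
    print(f"PUE {pue}: {overhead_kw(pue, it_load_kw):.0f} kW of overhead "
          f"on top of {it_load_kw:.0f} kW of IT load")
# At PUE 1.7 the overhead is 700 kW; at 1.03-1.05 it falls to 30-50 kW.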

By 2017 the company had installed these kinds of liquid cooling tanks in Japan, Australia, Spain, Austria, Costa Rica and the UK. In 2019 it is working on projects in Texas, Utah, Ohio, Spain, Romania and Trinidad & Tobago. “We’ve grown 500% in the past year,” says Moore.

LiquidCool Solutions of Minnesota also plunges electronics into tanks of oil and is working with Johnson Controls, Siemens, Bonneville Power and New York State. On its website it quotes Wiseguy Research, which predicts that the global data centre liquid immersion cooling market will grow from USD 109.51 million in 2015 to USD 959.62 million in 2020, a compound annual growth rate of 54.3%.
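That growth rate follows from the standard compound annual growth rate formula, CAGR = (end / start)^(1 / years) - 1, which is easy to check:

# Checking the quoted compound annual growth rate (CAGR) over 2015-2020.
start_musd, end_musd, years = 109.51, 959.62, 5

cagr = (end_musd / start_musd) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")
# Prints about 54.4%, in line with the quoted 54.3%.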

Data centre heat recovery and reuse, free cooling, ocean cooling and several other approaches are also in use, but liquid cooling appears to be the most efficient solution. It could be a promising new market for some HVAC professionals.