The unprecedented global heatwaves of summer 2022 brought the importance of, and challenges around, cooling data centres to the forefront, as global warming becomes the ‘new normal’ and the build-out of digital infrastructure accelerates.
Between June and August 2022, parts of Europe and Northern Africa experienced unprecedented heatwaves. The highest temperature was recorded in Portugal – 47°C (116.6°F). As well as threatening life, the soaring temperatures disrupted local infrastructure.
In France, there were problems with overhead cables that powered train lines and fires near the edge of train tracks. In Spain and Italy, the use of air conditioning was limited as nuclear plants had to reduce output due to the scarcity of water. And in the UK, Oracle’s and Google’s data centres experienced outages when the temperature in London hit 40.3°C (104.5°F).
At the time, Oracle reported: “As a result of unseasonal temperatures in the region, a subset of cooling infrastructure within the UK South (London) Data Centre has experienced an issue. As a result, some customers may be unable to access or use Oracle Cloud Infrastructure resources hosted in the region.”
Meanwhile, Google announced: “There has been a cooling-related failure in one of our buildings that hosts zone europe-west2-a for region europe-west2.” The company added that it was “working hard to get the cooling back online and create capacity in that zone”.
A new (hot) world
Members of both the scientific and environmental communities say we should expect summers like 2022’s to become the norm. So how likely is it that outages like those Oracle and Google suffered will happen again?
“Unfortunately, as global temperatures rise, data centre outages due to failing cooling systems are more likely to occur,” said Ozgur Duzgunoglu, head of engineering and design at Telehouse. “As the old commercial buildings in London were designed with temperatures of around 29°C to 35°C in mind, the significant temperature rise to 41°C during the recent heatwave has increased the risk of overheating and made it harder to keep the buildings cool.”
With global warming, and its impact on business continuity and uptime, here to stay, Duzgunoglu says the data centre industry needs to think about solutions. One possibility is embedding liquid cooling into the design of new facilities.
“The way the technology works helps keep temperatures down, as the liquid used for immersion cooling is a more efficient conductor of heat than air, helping to minimise and prevent blackouts,” he says. “If the industry can move away from the traditional way of doing things, outage incidents like Google’s and Oracle’s can be reduced.”
Liquid cooling aside, one of the most promising technologies and innovations in cooling is chassis-level precision immersion cooling.
David Craig, chief executive of Iceotope Technologies, says this system “operates within the water loop at a higher temperature, which enables it to cool more easily at the higher end of the design range”.
“Chassis-level precision immersion cooling techniques circulate small volumes of a harmless dielectric compound across the surface of the server,” Craig explains. “This removes nearly 100% of the heat generated by the electronic components. It also eliminates the requirement for server fans and air-cooling infrastructure.”
Craig’s firm has recently closed a £30 million funding round with private equity firm ABC Impact. So we asked him whether there is enough investment and research into data centre cooling technology.
“The simple answer is no, we are not seeing enough investment,” he says. “We are, however, seeing a huge amount of interest and more people are taking it very seriously. Right now, there is not a holistic enough understanding of the total benefits of liquid cooling.”
Those who fund and invest in data centres seem to be sticking to traditional models, which, combined with risk aversion, is slowing down the adoption of new technologies.
“Ultimately, the onus rests with us to evangelise, educate and demonstrate the benefits of liquid cooling and make the choice an easy one,” Craig adds.
Direct-to-chip cooling is another promising area. The technology uses flexible tubes to deliver dielectric fluid directly to processor chips; the fluid absorbs heat from the chip and turns to vapour.
“Direct-to-chip cooling will continue to grow in acceptance as CPUs and GPUs are getting hotter with each generation, and power per rack is not changing due to various limitations within data centres,” says Vik Malyala, president of EMEA at Supermicro. “It’s easier to deploy in existing data centres, provided they have chilled water access.”
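To put rough numbers on the physics involved: in a two-phase loop of the kind described above, the heat removed equals the coolant’s mass flow multiplied by its latent heat of vaporisation (Q = ṁ × h_fg). The sketch below is illustrative only; both the chip wattage and the latent heat figure are our assumptions, not vendor data.

# Back-of-the-envelope for a two-phase direct-to-chip loop: Q = m_dot * h_fg
# Both figures below are illustrative assumptions, not vendor data.
CHIP_HEAT_W = 700                 # hypothetical high-end CPU/GPU heat output, in watts
LATENT_HEAT_J_PER_KG = 100_000    # assumed latent heat of a dielectric coolant (~100 kJ/kg)

# Mass of fluid that must vaporise each second to absorb the chip's heat
flow_kg_per_s = CHIP_HEAT_W / LATENT_HEAT_J_PER_KG
print(f"Coolant flow required: {flow_kg_per_s * 1000:.0f} g/s per chip")  # ~7 g/s

Flows of a few grams per second per chip are what make the thin, flexible tubing practical.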
Colder centres
Individual solutions aside, the need for greater cooling will likely raise further questions about building new types of centres altogether. Microsoft, for example, concluded a two-year trial of Project Natick, its undersea data centre powered by offshore renewable energy, in 2020.
“Inevitably, we’ll see new types of data centres built around liquid cooling. However, it’s unlikely we will see wholly liquid cooled data centres in the near term,” says Craig. “Data centre operators have invested considerable sums of money into their data centre environments – they can’t and won’t abandon them in the immediate future.”
This sentiment was echoed by Thierry Chamayou, Schneider Electric’s vice-president for cloud and service providers segment EMEA, who said: “With power and rack densities increasing at a rapid pace, it’s likely that we’ll continue to see new innovations within traditional data centre deployments. But one cannot rule out that underwater facilities may also become a reality.”
Chamayou says the US-based Subsea Cloud is on target to submerge its first underwater data centre by the end of this year.
“The company states it can deploy 1MW of capacity for as much as 90% less cost than a land-based facility and with 40% less carbon dioxide emissions!” he says. “If that’s the case, then there’s no doubt that underwater data centres may well become more of a reality for businesses. But I believe that will only be possible for those operating near coastlines and lakes, or fjords in the case of the Nordics.”
The environmental impact of data centres cannot be ignored, but water- and air-cooling solutions each have shortcomings of their own; neither is perfect.
“With air conditioning increasing electricity usage and therefore greenhouse gas emissions, many operators have turned to using water to ensure that data centres are kept at an optimal temperature,” says Phil Bindley, managing director of cloud and security at Intercity Technology. “However, water is now becoming a scarce resource, with many areas of the country suffering from drought, which means using water to cool data centres isn’t the most sustainable of options.”
Sharing some statistics on the effects these solutions have on efficiency and energy consumption, Morten Steen Mjels, country product manager for servers and workstations at ASUS, says: “In 2021, the average data centre operated at a power usage effectiveness [PUE] of 1.57 and a data centre infrastructure efficiency [DCIE] of 63.5%.
“Data centres currently consume 1% to 2% of the world’s electricity, a share expected to rise to 8% by 2030. Therefore, any advances we can make to improve the efficiency of data centres are vital for the environment.”
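For readers who want to sanity-check those figures: PUE is total facility power divided by IT equipment power, and DCIE is simply its reciprocal expressed as a percentage. A minimal sketch using the PUE quoted above (the function name is ours, for illustration):

# PUE (power usage effectiveness) = total facility power / IT equipment power
# DCIE (data centre infrastructure efficiency) is its reciprocal, as a percentage
def dcie_from_pue(pue: float) -> float:
    return 100.0 / pue

print(f"DCIE at PUE 1.57: {dcie_from_pue(1.57):.1f}%")  # ~63.7%, in line with the 63.5% quoted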
Recycling heat
Redistributing the heat created by data centres and their cooling systems could reduce energy use in the areas around them.
“The benefits of liquid cooling are apparent in high-density facilities,” says Jakub Wolski, data centre strategy and business development leader at Trend, a subsidiary of Honeywell. “These data centres need an efficient means of cooling for powerful and densely packed hardware.
“Moreover, the heat energy extracted by liquid cooling can be repurposed to heat buildings or other upcycling possibilities. This lowers the utility energy demand and the overall energy and carbon footprint of the data centre and the community it serves.”
With the foundations laid for solutions to the data centre cooling conundrum, the question of whether operators are ready and able to meet these challenges remains. Perhaps a better way to tackle the crisis is simply to reduce the heat data centres generate in the first place.
“As data centres expand, much of the focus in recent years has been devoted to cooling technology. However, it’s important to remember that while better cooling provides symptomatic relief, it doesn’t necessarily address the root cause of the problem, which is heat generation,” says Marcin Bała, chief executive of Salumanus. “If you can reduce power consumption, you can directly reduce the amount of heat output, and subsequently the amount of cooling capacity required.”
One way to lower the power demands of data centres is to reduce the wattage of individual components. Salumanus, for example, has developed a third-generation optical transceiver module that uses 2.5–2.7W of power, down from 3.5W in previous models.
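That per-module saving compounds quickly at data centre scale, and every watt of IT load avoided also spares the cooling energy that would have removed it. A hedged illustration (the fleet size is our assumption; the wattages are the Salumanus figures above, and the PUE is the 2021 average quoted earlier):

# Per-module power draw, in watts, from the figures quoted above
OLD_MODULE_W = 3.5
NEW_MODULE_W = 2.6            # midpoint of the quoted 2.5-2.7W range

MODULES = 10_000              # hypothetical fleet size, for illustration only
PUE = 1.57                    # 2021 average PUE cited earlier in the article

it_saving_kw = MODULES * (OLD_MODULE_W - NEW_MODULE_W) / 1000
facility_saving_kw = it_saving_kw * PUE    # total facility power = PUE x IT power
print(f"IT load saved: {it_saving_kw:.1f} kW")                 # 9.0 kW
print(f"Facility-level saving: {facility_saving_kw:.1f} kW")   # ~14.1 kW with cooling overhead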
Bała adds that “data centre managers should look at ways of reducing the total number of devices on the network altogether”.
But Chamayou reminds us that there will never be a single solution to the problem of cooling data centres.
“We have to remember that with data centres, no one size will fit all,” he says. “And when selecting a cooling technology, we must consider the implications on the wider business and its customers.”