AI boom thrusts Europe between power-hungry data centers and environmental goals

[Photo: A large hallway with supercomputers inside a server room data center. Luza Studios | E+ | Getty Images]

The boom in artificial intelligence is ushering in an environmentally conscious shift in how data centers operate, as European developers face pressure to lower the water temperatures of their energy-hungry facilities to accommodate the higher-powered chips of firms such as tech giant Nvidia.

AI is estimated to drive a 160% growth in demand for data centers by 2030, research from Goldman Sachs shows — an increase that could come at a cost to Europe’s decarbonization goals, as the specialized chips used by AI firms are expected to hike the energy use of the data centers that deploy them.

High-powered chips — also known as graphics processing units, or GPUs — are essential for training and deploying large language models, a type of AI. These GPUs require high-density computing power and produce more heat, which ultimately demands colder water to cool the chips reliably.

AI hardware can draw 120 kilowatts of power in just one square meter of a data center, equivalent to the power consumption and heat dissipation of around 15 to 25 houses, according to Andrey Korolenko, chief product and infrastructure officer at Nebius, who referred specifically to the deployment of Nvidia’s Blackwell GB200 chip.
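The houses comparison follows from simple division. A minimal sketch, assuming a typical house draws roughly 5 to 8 kilowatts at peak (an illustrative figure, not one given in the article):

```python
# Sanity check of the density figure quoted above: 120 kW per square meter.
# The per-house draw of 5-8 kW is an illustrative assumption.
RACK_DENSITY_KW = 120.0  # power drawn per square meter, per Korolenko

for house_kw in (5.0, 8.0):
    houses = RACK_DENSITY_KW / house_kw
    print(f"At {house_kw} kW per house: ~{houses:.0f} houses per square meter")
```

Dividing 120 kW by 8 kW and 5 kW per house gives 15 and 24 houses respectively, matching the quoted range.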

“This is extremely dense, and from the cooling standpoint of view you need different solutions,” he said.

Michael Winterson, chair of the European Data Center Association (EUDCA), warned that lowering water temperatures will eventually “fundamentally drive us back to an unsustainable situation that we were in 25 years ago.”

“The problem we’ve got with the chipmakers is [that] AI is now a space race run by the American market where land rights, energy access and sustainability are relatively low on the pecking order, and where market domination is key,” Winterson told CNBC.

Major equipment suppliers in Europe say that U.S. chip designers are calling on them to lower their water temperatures to accommodate the hotter AI chips, according to Herbert Radlinger, managing director at NDC-GARBE.

“This is shocking news, because originally everybody from the engineering side expected to go for liquid cooling to run higher temperatures,” he told CNBC, referring to liquid cooling, a technology considered more efficient than traditional air cooling.

‘Evolution discussion’

Energy efficiency is high on the European Commission’s agenda, as it seeks to reach its goal of reducing energy consumption by 11.7% by 2030. The EU predicted in 2018 that energy consumption of data centers could rise 28% by 2030, but the advent of AI is expected to boost that number two or threefold in some countries.

Winterson said that lowering water temperatures is “fundamentally incompatible” with the EU’s recently launched Energy Efficiency Directive, which established a dedicated database for data centers of a certain size to publicly report their power consumption. The EUDCA has been lobbying Brussels to consider these sustainability concerns.

Energy management firm Schneider Electric engages often with the EU on the topic. Many of the recent discussions have focused on different ways to source “prime power” for AI data centers and on the potential for more collaboration with utilities, said Steven Carlini, chief advocate of AI and data centers and vice president at Schneider Electric.

European Commission energy officials have also held exchanges with Nvidia to discuss the energy consumption of data centers and the power efficiency of their chipsets.

CNBC has approached Nvidia and the Commission for comment.

“Cooling is the second-largest consumer of energy in the data center after the IT load,” Carlini told CNBC in emailed comments. “The energy use will rise but the PUE (Power Usage Effectiveness) may not rise with lower water temperatures despite the chillers having to work harder.”
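Carlini’s point rests on how PUE is defined: total facility power divided by IT power. A minimal sketch, using illustrative numbers not drawn from the article, of why absolute energy use can climb while the ratio holds steady:

```python
# Illustrative PUE arithmetic (numbers are assumptions, not from the article).
# PUE = total facility power / IT power, so if denser chips raise the IT load
# and cooling grows roughly in proportion, the ratio stays flat even though
# absolute consumption climbs.

def pue(it_load_kw: float, cooling_kw: float, other_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT power."""
    return (it_load_kw + cooling_kw + other_kw) / it_load_kw

# Conventional hall: 1 MW of IT load, 400 kW of cooling, 100 kW overhead.
before = pue(1000, 400, 100)

# Denser AI hall: IT load doubles; chillers work harder in absolute terms
# (800 kW vs. 400 kW) but scale with the load, so PUE is unchanged at 1.5.
after = pue(2000, 800, 200)
print(before, after)
```

In this sketch the facility’s total draw doubles from 1.5 MW to 3 MW, yet PUE stays at 1.5 in both cases, which is consistent with Carlini’s remark that energy use rises while the metric may not.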

Schneider Electric’s customers that are deploying Nvidia’s Blackwell GB200 super chip are asking for water temperatures of 20-24 degrees Celsius or between 68 and 75 degrees Fahrenheit, Carlini said.

He added that this compares with temperatures of around 32 degrees Celsius for standard liquid cooling, or the roughly 30 degrees Celsius that Meta has suggested for the water it supplies to the hardware.

Ferhan Gunen, vice president of data center operations for the U.K. at Equinix, told CNBC that there are a number of concerns about AI that Equinix has been discussing with its customers.

“They want to increase the density of their servers, which is, they want to have higher-power-using chips, or they want to have more servers,” she said, adding that the shift is not “clear cut.”

“It’s really an evolution discussion more than anything,” Gunen said.

Nvidia, which declined to comment on the cooling requirements of its chips, announced a new platform for its Blackwell GPUs earlier this year. It said that the architecture would enable organizations to run real-time generative AI on large language models at up to 25 times less cost and energy consumption compared to earlier technology.

Liquid cooling will require a “reconfiguration,” Gunen explained, adding that new data centers are already coming ready with this technology. “Yes, higher density will mean more power use, and will also mean more cooling requirement. But then the technology is changing, so you’re doing it differently. That’s why there is a balance in all of this,” she said.

Race for efficiency

Nebius, which has around $2 billion in cash on its balance sheet after splitting from Russia’s Yandex, has said it will be one of the first to bring Nvidia’s Blackwell platform to customers in 2025. The firm has also announced plans to invest more than $1 billion in AI infrastructure in Europe by the middle of next year.

Nebius’ Korolenko said liquid cooling is a “first step,” where cost of ownership will initially be worse before improving over time.

“There’s a big push to deliver, but at the same time, when you go to scale, you will want to have the ability to choose, to be economical and not sacrifice too much. Power efficiency is important for the running costs. It’s always a high priority,” Korolenko said.

Even before a boom in demand for AI applications hit the market, the data center industry in Europe was struggling to keep pace with the growing digital sector.

Sicco Boomsma, managing director of ING’s TMT team, said those involved in the market are “very sensitive to power” and that while Europe’s focus is on infrastructure, the U.S. has focused more on expanding assets in Europe where power is available.

“There’s a tremendous amount of data center operators also coming from the U.S. that are aligning in order to ensure that their data center infrastructure is in line with the various goals that the EU has as well, such as being carbon neutral, such as being efficient, on water utilization, maintaining biodiversity.”

“It is a sort of a race where they want to demonstrate that their knowledge is leading to super efficient infrastructure,” he said.
