Cloud Repatriation Fueling the Liquid Cooling Boom in Europe
The demands of artificial intelligence (AI), machine learning, and other compute-intensive workloads have fundamentally transformed how data centers are designed, built, and cooled. This shift has placed IT teams in the driver's seat when it comes to cooling decisions—marking a significant departure from three decades of established practices.
For decades, IT workloads were relatively consistent, allowing data center designers and operators to focus primarily on physical infrastructure like power supply and cooling. That model has dramatically shifted over the past year. Today, IT requirements are the starting point for any data center project. Designers must now ask clients for detailed cut sheets outlining the specific workloads and technologies to be deployed, as these directly determine the appropriate cooling solutions.
Cooling strategies have evolved beyond traditional air cooling, with various forms of liquid cooling emerging as essential technologies. To support diverse workloads, data centers are increasingly adopting hybrid approaches: a single data hall may combine immersion cooling, air-cooled racks, and direct-to-chip systems, each matched to the thermal requirements of specific applications.
While air cooling remains an important part of many data center strategies, liquid cooling is no longer a niche solution. Workloads such as AI and advanced machine learning—which can push power densities above 80 kW per rack—require highly efficient thermal management. Next-generation CPUs are expected to reach power levels of up to 2,000 watts per chip. In response, liquid cooling solutions, particularly direct-to-chip systems, have emerged as leading technologies capable of managing these extreme heat loads.
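To put those figures in perspective, here is a rough back-of-the-envelope sketch in Python. The per-server chip count and rack layout are illustrative assumptions, not figures for any specific product; only the roughly 2,000-watt chip and 80 kW rack numbers come from the discussion above.

```python
# Rough rack-level heat estimate from chip-level power.
# The chips-per-server and servers-per-rack values are illustrative
# assumptions; only the ~2,000 W per chip and 80+ kW per rack figures
# come from the text above.

watts_per_chip = 2_000      # next-generation chips approaching 2,000 W each
chips_per_server = 4        # hypothetical dense server configuration
servers_per_rack = 10       # hypothetical rack layout

rack_kw = watts_per_chip * chips_per_server * servers_per_rack / 1_000
print(f"Chip power alone: {rack_kw:.0f} kW per rack")  # prints 80 kW

# Memory, networking, fans, and power-conversion losses add to this total,
# which is why air cooling runs out of headroom well before such densities
# and direct-to-chip liquid cooling becomes the practical option.
```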
The Open Compute Project estimates that over 40% of new data center builds in the next three years will incorporate some form of liquid cooling—signaling a major evolution in design strategies. This shift is fueled by the unique ability of liquid cooling systems to manage the intense heat generated by AI hardware, including GPUs and next-generation CPUs.
Liquid cooling is no longer just a “nice to have.” In March 2024, AI chip market leader Nvidia announced that the next generation of its server family would require liquid cooling. The Nvidia GB200 superchip mandates direct-to-chip cooling because of its high thermal design power, making liquid cooling a critical requirement for high-density clusters. Nvidia is partnering with companies such as Vertiv, Supermicro, and nVent to develop tailored liquid cooling solutions.
This growth in liquid cooling will not be limited to the hyperscale cloud providers. Enterprise data centers are making a comeback in sectors such as pharmaceuticals and financial services. Larger companies are building new on-premises data centers, a trend dubbed “cloud repatriation,” to retain control over their AI investments and big data. This shift aims to safeguard intellectual property, ensure regulatory compliance, enhance data security, and maintain data sovereignty.
America’s new stance toward Europe, and the data sovereignty concerns it raises, is likely to accelerate this trend across the continent, driving increased demand for advanced cooling solutions. Companies such as Schneider Electric are expanding their liquid cooling portfolios to meet this growing need; Schneider Electric’s acquisition of Motivair Corp. strengthens its ability to deliver end-to-end cooling infrastructure, strategically positioning it to support European data centers in the transition to liquid and hybrid cooling systems.