Liquid cooling is moving from niche to necessity as artificial-intelligence workloads push data-centre rack densities to unprecedented levels, with some systems projected to reach 240 kilowatts per rack as early as 2026, according to Schneider Electric.
The shift underscores how rapidly AI is reshaping the physical backbone of the digital economy. Since OpenAI’s ChatGPT propelled AI into the mainstream in late 2022, the technology has spread across sectors from academia and healthcare to finance and manufacturing. The next phase, Schneider Electric says, will be defined less by training ever-larger models and more by AI inferencing, the real-time application of models at scale, which places intense demands on data-centre infrastructure.
“The AI disruptions of recent years are only the prologue to what 2026 promises: full integration of AI into data-centre operations and builds,” the company said in a forward-looking industry outlook. As workloads grow denser, traditional air cooling is reaching its limits, accelerating the adoption of direct-to-chip and immersion liquid cooling systems that can efficiently remove heat from high-performance processors.
“AI is no longer just a tool; it is a transformative force that demands the right infrastructure to unlock its full potential,” said Ajibola Akindele, country president for Schneider Electric West Africa. Organisations that invest in AI-ready data centres, featuring advanced cooling, modular designs and energy-efficient power systems, will gain a competitive edge, he said.
AI adoption is already widespread. According to McKinsey’s latest State of AI survey, 78% of organisations now use AI in at least one business function, up from 72% in early 2024 and 55% in 2023. While sales and marketing remain the leading use cases, deployment is expanding into manufacturing, healthcare, finance, and data centres themselves, where AI is used for predictive maintenance, cooling optimisation and energy management.
The rise of so-called “AI factories” is further driving infrastructure change. Unlike traditional data centres focused on storage and general computing, AI factories are designed to generate intelligence: training, fine-tuning and running models to produce insights or revenue. Inferencing workloads can range from under 20 kW per rack for compressed models to as much as 140 kW for advanced, agentic systems, Schneider Electric estimates.
By 2030, roughly a quarter of new data-centre builds are expected to exceed 100 kW per rack, with another half falling between 40 kW and 80 kW. Next-generation hardware platforms, such as Nvidia’s Rubin CPX GPUs and Vera CPUs, promise massive performance gains, but also intensify power and cooling requirements.
To meet these demands, operators are increasingly turning to digital twins, virtual replicas of physical systems, to design and optimise electrical and cooling infrastructure before deployment. Robotics is also gaining ground, automating tasks from server installation to liquid-cooling system management.
Not all investments will be greenfield. Smaller operators are pursuing brownfield retrofits, upgrading existing facilities with higher-capacity racks, power distribution units and bolt-on liquid cooling to support AI workloads without building entirely new sites.
Power strategy remains a critical constraint. While renewables currently supply about 27% of data-centre electricity, operators are diversifying into natural-gas turbines with carbon capture, battery storage and alternative fuels to ensure resilience as AI demand surges.
By 2026, Schneider Electric says, AI will move from disruptive technology to foundational infrastructure, and liquid cooling will be at the heart of that transition.