Sustainability-in-Tech : UK Data Centre Cuts AI Power Use By 40 Per Cent
A UK data centre has demonstrated that artificial intelligence infrastructure can reduce its electricity consumption by up to 40 per cent in response to grid signals without interrupting critical computing workloads.
A UK-First Trial Of Flexible AI Infrastructure
The demonstration took place at Nebius’s “AI Factory” data centre near London and was conducted in partnership with National Grid, Emerald AI, the Electric Power Research Institute (EPRI), and NVIDIA. The project was designed to test whether high-performance AI infrastructure could act as a flexible energy asset rather than a fixed electricity load.
Over five days in December 2025, a cluster of NVIDIA Blackwell Ultra GPUs was subjected to more than 200 simulated grid events. These signals instructed the facility to adjust its electricity consumption under different conditions, including scenarios where the system had little or no advance warning.
According to the project’s white paper, the cluster achieved full compliance with all requested power targets and ramp-rate requirements while maintaining normal operation of key workloads. National Grid Partners described the results as evidence that high-performance AI infrastructure can operate as “a power-flexible, grid-responsive asset without disrupting mission-critical workloads.”
How The System Reduced Power Demand
The trial involved a 130 kW compute cluster running realistic AI training workloads based on open models such as Llama, Qwen and GPT-OSS. The cluster was deliberately kept busy throughout the experiment in order to simulate real production conditions.
Rather than switching servers off, the system reduced electricity consumption by dynamically managing how GPU workloads were scheduled and executed. Lower-priority tasks could be paused, delayed or temporarily slowed, allowing the cluster’s power draw to fall when grid operators requested a reduction.
This approach relies on the nature of many AI workloads. Model training and fine-tuning often run for long periods and include natural pause points where a checkpoint, a saved snapshot of the training state, can be written, allowing processing to be safely interrupted and later resumed without losing progress.
By contrast, latency-sensitive tasks such as inference can continue running normally while background training workloads absorb most of the power adjustments.
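As a rough illustration of how checkpointing makes training interruptible, the pattern can be sketched in a few lines of Python. Everything here is hypothetical, including the file name and function names; it is not the trial's actual software, just the general save-and-resume idea:

```python
import json
import os

# Hypothetical sketch: a long-running training loop that saves a
# checkpoint after every step, so it can pause cleanly when a grid
# signal arrives and resume later without losing progress.

CHECKPOINT_FILE = "train_state.json"

def save_checkpoint(step, state):
    """Persist training progress to disk."""
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump({"step": step, "state": state}, f)

def load_checkpoint():
    """Resume from the last saved step, or start fresh."""
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            data = json.load(f)
        return data["step"], data["state"]
    return 0, {"loss": None}

def train(total_steps, should_pause):
    """Run training, stopping at a checkpoint when asked to pause."""
    step, state = load_checkpoint()
    while step < total_steps:
        state = {"loss": 1.0 / (step + 1)}  # stand-in for a real update
        step += 1
        save_checkpoint(step, state)
        if should_pause():  # e.g. a grid reduction signal arrived
            return step, False  # paused; progress is on disk
    return step, True  # finished
```

Calling `train(5, lambda: True)` pauses after the first step; a later call with `lambda: False` picks up from the saved checkpoint and runs to completion.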
The orchestration software coordinating this behaviour was provided by US-based AI infrastructure company Emerald AI. Its platform interprets grid signals and automatically adjusts computing workloads so that the data centre can respond quickly to changes in electricity demand.
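The general shape of such an orchestration policy, pausing the least critical jobs first until the cluster fits under a requested power cap, can be sketched as follows. This is an illustration of the concept only; the job names, power figures, and priority scheme are invented and do not describe Emerald AI's platform:

```python
from dataclasses import dataclass

# Hypothetical sketch of priority-based power shedding: when the grid
# requests a lower target, pause the lowest-priority jobs first until
# the cluster's estimated draw fits under the cap.

@dataclass
class Job:
    name: str
    power_kw: float   # estimated draw while running
    priority: int     # higher = more critical (e.g. live inference)
    running: bool = True

def shed_to_target(jobs, target_kw):
    """Pause lowest-priority jobs until total draw <= target_kw."""
    draw = sum(j.power_kw for j in jobs if j.running)
    for job in sorted(jobs, key=lambda j: j.priority):
        if draw <= target_kw:
            break
        if job.running:
            job.running = False  # pause at the job's next checkpoint
            draw -= job.power_kw
    return draw

jobs = [
    Job("inference-api", 30, priority=10),  # latency-sensitive, keep
    Job("fine-tune-a", 50, priority=3),
    Job("pretrain-b", 50, priority=1),
]
# Grid event: cap the 130 kW cluster at 80 kW.
new_draw = shed_to_target(jobs, target_kw=80)
```

In this toy run, only the lowest-priority pretraining job is paused, which mirrors the trial's description of background training absorbing the adjustment while inference keeps running.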
Testing Real-World Grid Events
Some of the simulated grid signals included immediate reduction requests with no ramp-down period, forcing the system to respond rapidly. Others provided advance warning and allowed the cluster to gradually reduce its consumption.
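Where a signal does allow a gradual reduction, the facility effectively follows a ramp schedule constrained by a maximum rate of change. A minimal sketch of that calculation, with invented numbers chosen only to match the article's 130 kW cluster and 40 per cent reduction figure:

```python
# Hypothetical sketch: build a per-minute ramp-down schedule that
# respects a maximum ramp rate (kW per minute). Numbers illustrative.

def ramp_schedule(current_kw, target_kw, max_ramp_kw_per_min):
    """Return per-minute power setpoints stepping down to target_kw."""
    setpoints = []
    power = current_kw
    while power > target_kw:
        power = max(target_kw, power - max_ramp_kw_per_min)
        setpoints.append(power)
    return setpoints

# A 130 kW cluster asked to drop to 78 kW (a 40% cut) at 20 kW/min
# reaches the target in three one-minute steps: [110, 90, 78].
schedule = ramp_schedule(130, 78, 20)
```

An immediate reduction request with no ramp-down period corresponds to the degenerate case of a very large ramp rate, where the schedule collapses to a single step.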
The trial also modelled real electricity demand patterns. One scenario simulated the well-known “TV pickup” effect in the UK, where millions of households switch on kettles during the half-time break of major football matches or television programmes.
These sudden surges can add around one gigawatt of demand to the grid within minutes. During the simulation, the AI cluster automatically reduced its power consumption as demand increased, demonstrating how data centres could help stabilise electricity networks during peak usage.
National Grid Partners president Steve Smith said the results challenge assumptions about the impact of AI infrastructure on electricity systems. As he explained, “as the UK’s digital economy accelerates, there’s concern that datacentres could add pressure to an already constrained system. This trial proves the opposite can be true.”
He added that the results suggest high-performance computing facilities “don’t have to place additional strain on the grid,” but could instead contribute to more flexible and responsive electricity systems.
Why AI Power Demand Is Becoming A Major Issue
The experiment took place against the backdrop of rapidly growing electricity demand from AI computing. Training large AI models requires enormous GPU clusters operating continuously, and global data centre power consumption is expected to rise significantly as AI adoption expands.
Grid operators are increasingly concerned that new data centres could strain already constrained electricity systems. In the UK, demand for grid connections has grown rapidly in recent years as developers race to build AI infrastructure.
Traditional data centres are usually treated as “firm loads”, meaning the electricity system must assume they will draw their full power requirements at all times. The London trial explored an alternative model in which data centres act as flexible loads that can temporarily reduce consumption during periods of grid stress.
If implemented at scale, this approach could make it easier for electricity networks to accommodate the growth of AI infrastructure while maintaining grid stability.
What Does This Mean For Your Business?
For businesses building or using AI infrastructure, the trial points to a possible shift in how data centres interact with energy systems.
AI computing has often been criticised for its high energy consumption, particularly as demand for generative AI services continues to grow. The London trial suggests that AI infrastructure may also offer new tools for managing electricity demand more intelligently.
Flexible computing loads could allow data centres to reduce power consumption during peak demand periods or when renewable energy supply is limited. This could help organisations balance sustainability goals with the growing need for high-performance computing.
However, the model also introduces new operational considerations. Running AI infrastructure as a flexible grid resource requires sophisticated workload management systems capable of pausing or rescheduling non-critical tasks without affecting service levels.
As AI becomes more deeply integrated into business operations, the ability to manage computing workloads in ways that support both performance and energy resilience may become an important part of future data centre strategy.