The rapid expansion of AI and generative AI (GenAI) workloads could leave 40% of data centers constrained by power shortages by 2027, according to Gartner. Driven by the need for immense computing power, AI data centers will likely require 160% more electricity over the next three years, outstripping utility providers’ ability to meet the demand in time.
According to Gartner, AI-focused data centers will consume approximately 500 terawatt-hours (TWh) per year by 2027, nearly double the 260 TWh they consumed in 2024. The surge is attributed to the increased deployment of large language models (LLMs) and complex algorithms that require massive data processing capabilities.
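As a sanity check on the scale of that jump, the annual growth rate implied by the article’s own figures can be computed directly (the three-year span is inferred from the 2024 and 2027 dates; the numbers themselves come from the Gartner figures quoted above):

```python
# Implied compound annual growth rate (CAGR) from the article's figures:
# 260 TWh in 2024 rising to roughly 500 TWh in 2027.
start_twh = 260
end_twh = 500
years = 2027 - 2024  # 3-year span inferred from the dates cited

cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 24% per year
```

That implied rate of roughly 24% per year lines up with the 25% ICT demand CAGR Gartner cites later in the piece.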
“The explosive growth of hyperscale data centers for GenAI applications is creating an unprecedented demand for power,” Bob Johnson, VP Analyst at Gartner, said in the research firm’s latest report. “This demand is outpacing utilities’ ability to expand their capacity, leading to potential shortages that could restrict the growth of AI and other power-intensive applications from 2026 onward.”
An AI hyperscale data center can consume as much as 100 MW of power.
The heightened demand is set to pose challenges for tech giants and cloud providers who rely on consistent energy to maintain operations, and experts predict these constraints could affect innovation and the rollout of future AI applications, the report added.
The demand-supply gap
The total estimated power demand from global information and communication technologies (ICT), spanning the collection, analysis, computation, storage, and communication of data, and including data centers, is growing significantly faster than total electrical energy production.
“Total ICT electricity demands will exceed 9% of total global energy production by 2030, up from less than 3% today,” Johnson pointed out. “To satisfy this increase in demand, ICT applications would have to take considerable amounts of electricity from other areas, such as residential, commercial or industrial users.”
To put it in perspective, the Gartner report said that the total demand for ICT power is projected to grow at a 25% compound annual growth rate (CAGR) through 2030, while global power generation will only increase at a CAGR of 3%.
“This means that while total ICT power demand accounts for less than 3% of the total power generation in 2024, it will account for over 9% by 2030,” the report said.
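The compounding math behind that projection can be sketched from the figures Gartner cites, a 25% demand CAGR against a 3% generation CAGR, starting from the report’s “less than 3%” share in 2024 (the exact 3% starting value is an assumption taken from that phrasing):

```python
# Project ICT's share of global power generation, assuming demand grows
# at a 25% CAGR while generation grows at a 3% CAGR, starting from a
# ~3% share in 2024 (starting share assumed from "less than 3% today").
share = 0.03
demand_growth = 1.25
supply_growth = 1.03

for year in range(2025, 2031):
    # Each year the share scales by the ratio of the two growth rates.
    share *= demand_growth / supply_growth
    print(year, f"{share:.1%}")
# By 2030 the share compounds to just over 9%, matching the report.
```

The gap between a 25% and a 3% growth rate, small in any single year, compounds to a tripling of ICT’s share of generation in just six years.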
Specifically, from the data center perspective, incremental ICT demand accounted for a manageable 4% of new generation capacity globally in 2021. By 2023, driven by the growth of hyperscale data centers, it had risen to 20% of total new generation, and it is projected to exceed 70% by 2030.
The International Energy Agency (IEA) projects in its 2024 analysis that global electricity demand from data centers will grow by over 75% between 2022 and 2026, with a high-end estimate suggesting a possible increase of 128%.
French multinational Schneider Electric, which earns one third of its revenue from its data center business, echoes this sentiment. In a whitepaper, it warned that the power and cooling demands of AI are beyond what standard data center designs can handle and that new designs are necessary.
Rising costs and energy security concerns
As power demands increase, electricity costs are expected to rise, impacting data center operators’ expenses and creating ripple effects across the AI industry. Some tech firms are already securing dedicated energy agreements to ensure a reliable supply, but even so, rising prices are becoming inevitable.
“Organizations that heavily rely on AI need to prepare for higher operational costs as data center power expenses increase,” Johnson added. “Long-term power contracts will be crucial, as will exploring alternative power sources or efficiency-driven solutions.”
According to Johnson, product leaders should evaluate the impact of potential power shortages on their products and services by developing “realistic alternative scenarios that reflect limitations on available power.”
“Include significant cost increases for data center services when developing plans for new products and services,” Johnson added. “Look for alternative approaches that require less power while achieving the same market impact.”
To offset future shortages, Gartner suggests AI companies consider energy-efficient models such as edge computing and smaller language models, which require less power.
A call for strategic energy planning
For years, data center operators have chosen sites based on factors such as financial incentives and proximity to data hubs, assuming local utilities would provide the necessary power.
However, with the rise of GenAI and large language models, data centers are growing larger and demanding more power, the report said. “This shift is challenging the assumption that power will always be readily available, prompting some major operators to move away from relying on public utilities and instead build their own power generation capacity.”
With global power grids struggling to keep up, Gartner recommends that organizations prepare by re-evaluating sustainability plans and negotiating long-term energy contracts where possible. As AI advances, strategic energy planning will play a critical role in ensuring that these technologies remain “scalable and sustainable amid escalating demands.”
Source: Network World