Energy Supply Is Now the Main Bottleneck Slowing AI Growth

By alex2404

The city of Barcelona is home to IESE Business School, where Sampsa Samila serves as academic director of the AI and the Future of Management Initiative. His diagnosis of the AI industry’s core problem is precise: “It’s not the overall supply of energy, but having reliable, firm capacity at the right place and the right time that is in short supply.”

That framing reorders the entire conversation about what limits artificial intelligence. For decades, the constraint was hardware. Early AI systems ran into hard ceilings on processing speed and memory, triggering repeated cycles of stalled progress and collapsed funding — the so-called AI winters. That era is over. Specialized chips, scaled-up data centers, and mass production from companies like Nvidia and AMD have turned raw compute into something that money can simply purchase.

The new limit is electrical infrastructure.

Constant demand, not occasional bursts

The shift matters because of how AI is actually deployed. Training large language models consumes significant power, but those runs are infrequent. What has changed is the operational layer — chatbots, search tools, image generators, and autonomous agents running continuously, drawing electricity around the clock. According to the report, Samila points specifically to newer “reasoning” systems, which spend more time working through answers before responding, as a force pushing energy use deeper into everyday operations rather than isolated training events.

The numbers bear this out. The International Energy Agency expects global data center electricity consumption to more than double by the end of the decade, reaching levels comparable to those of major industrial economies. In parts of the United States, data centers already consume as much power as heavy industry.

Infrastructure built for a slower era

Juan Arismendi-Zambrano, an assistant professor at University College Dublin’s Michael Smurfit Graduate Business School, places the problem in sharper relief. Power grids were engineered for gradual, predictable growth. Large AI campuses arrive almost overnight. Grid upgrades and government approvals move at institutional pace. The gap between those two speeds is where the real bottleneck lives.

His view aligns with Samila’s: the problem is not a global shortage of electricity in aggregate. It is local. The right amount of power, delivered to the right place, at the right moment — that combination is what the current infrastructure struggles to guarantee.

The physical nature of this constraint is what makes it harder to engineer around than the compute problem ever was. A faster chip can be designed and manufactured. A transmission line requires land approvals, regulatory clearance, construction timelines, and capital that flows through entirely different channels than a semiconductor budget.

AI’s previous bottleneck responded to money. This one responds to geography, permitting, and the physics of electrical grids — systems that were never designed with city-sized, always-on digital loads in mind.

Photo by Philipp Fahlbusch on Pexels

This article is a curated summary based on third-party sources.
