Goldman Sachs: AI Investment Shifts to Data Centres

By alex2404

Artificial intelligence investment is shifting away from software experiments and toward the physical infrastructure required to run AI systems at scale, according to Goldman Sachs analysis.

The firm describes the current market movement as a “flight to quality” — investors focusing on companies that own and operate large data centres and computing hardware, while firms offering narrow AI tools attract less attention.

Goldman Sachs Research estimates that AI workloads could account for roughly 30% of total data centre capacity within the next two years, driven by growth in cloud services and enterprise applications.

Training large models requires thousands of chips running in parallel for extended periods. Inference — generating responses or predictions — demands steady computing power whenever services are live. Cloud providers are now expanding capacity at a pace not seen during earlier phases of cloud computing, with hyperscale firms investing tens of billions of dollars annually in new facilities and hardware. Networking systems are also being expanded to support this load.

Power Demand Is the Binding Constraint

Energy supply is becoming the central pressure point. Goldman Sachs Research estimates global data centre power demand could rise approximately 175% by 2030 compared with 2023 levels, largely driven by AI workloads. The firm says that increase would be roughly equivalent to adding the electricity consumption of another top-10 power-consuming country to the global grid.
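To put that projection in perspective, a 175% rise over seven years implies demand reaching 2.75 times its 2023 level by 2030. The 175% figure comes from Goldman Sachs Research; the compounding arithmetic below is a back-of-envelope illustration, not part of the original analysis:

```python
# Back-of-envelope check: data centre power demand rising ~175% from
# 2023 to 2030 means the 2030 level is 2.75x the 2023 baseline.
# The implied compound annual growth rate (CAGR) over 7 years:

baseline_2023 = 1.0                # normalise 2023 demand to 1
growth = 1.75                      # +175% by 2030
demand_2030 = baseline_2023 * (1 + growth)   # 2.75x the baseline
years = 2030 - 2023                # 7-year horizon

cagr = demand_2030 ** (1 / years) - 1
print(f"2030 demand: {demand_2030:.2f}x of 2023")
print(f"Implied annual growth: {cagr:.1%}")
```

In other words, the projection corresponds to demand compounding at roughly 15-16% per year, every year, through the end of the decade.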

Rising demand is already influencing where new facilities are built. Large data centres are being sited near stable energy sources and high-capacity fibre networks. Some companies are placing AI training clusters in remote areas where land and electricity are easier to secure.

Cooling systems and geographic location can affect energy use and water consumption as significantly as hardware efficiency, according to academic research cited in the analysis.

Infrastructure Takes Years to Build

Construction of large data centres involves complex supply chains, land acquisition, grid connections, and long-term energy agreements. Shortages of electrical equipment and delays in grid expansion are already slowing projects.

Those constraints help explain why investors are reassessing valuations. During the first wave of generative AI adoption, many companies saw market value rise simply by associating themselves with AI. That dynamic is changing as investors examine which companies hold the infrastructure and revenue models needed to support long-term deployment.

Data centre operators and chip manufacturers sit at the base of that structure. Their services are required regardless of which AI applications ultimately gain traction — a pattern that mirrors earlier computing waves, where companies controlling underlying infrastructure captured more durable revenue than software platforms built on top of it.

Governments and industry planners are beginning to treat energy capacity and grid expansion as direct inputs to AI strategy, not secondary concerns.


This article is a curated summary based on third-party sources.
