OpenAI’s Billion-Dollar Bet to Power the Next AI Revolution

OpenAI is making a billion-dollar bet on power. The company is shifting from renting cloud space to owning industrial-scale compute, vetting sites across the US for its Stargate data centers to fuel the next generation of AI models.

CNBC reports that the buildout marks a turning point for OpenAI, which now treats infrastructure as a strategic pillar of growth, seeking gigawatts of energy and community-backed locations to anchor America’s next AI foundation.

Power, scale, and speed drive OpenAI’s nationwide site hunt

Since January, OpenAI has reviewed roughly 800 proposals from cities and utilities eager to host its next-generation data centers. The AI company has since narrowed the list to about 20 finalists spanning the Southwest, Midwest, and Southeast, regions rich in available land and access to power.

Executives leading the Stargate initiative say the company is focused less on tax breaks and more on energy scalability and community backing.

“Can we build quickly, is the power ramp there fast, and is this something where it makes sense from a community perspective?” Keith Heyde, OpenAI’s head of infrastructure, told CNBC.

Heyde described the search as a balance between urgency and practicality.

“Perfect wasn’t the goal — the goal was a compelling power ramp,” he said, underscoring the company’s priority: securing locations that can support the enormous energy demands of future AI systems.

Powering the next frontier of AI

OpenAI has detailed plans for a 17-gigawatt energy buildout to power its next generation of AI systems. Developed in collaboration with Oracle, Nvidia, and SoftBank, the project showcases the extensive infrastructure necessary to support training and deploying AI at an industrial scale.

To meet these demands, the company is exploring a mix of energy sources, including battery-backed solar farms, refurbished gas turbines, and small modular nuclear reactors. Each site will vary in design, but together they form what OpenAI describes as the backbone of its long-term compute strategy.

The effort is backed by significant capital. Nvidia has committed as much as $100 billion toward the expansion, funding both infrastructure and the purchase of millions of GPUs that will drive the data centers once operational.

Building in-house to cut costs and protect trade secrets

OpenAI executives say that owning its infrastructure gives the company tighter control over costs, capacity, and intellectual property.

CFO Sarah Friar told CNBC that first-party data centers help cut vendor markups and protect proprietary technology, following the same reasoning that once led Amazon to build AWS instead of relying on external providers.

The approach allows OpenAI to scale independently as model complexity and compute demand continue to rise. Heyde said there is “no playbook” for projects of this size but noted the company is confident it can deliver.

The race for AI supremacy now runs on volts

Across the US, Big Tech is pouring billions into data centers built for sheer power capacity.

Meta, for instance, is developing what could become the largest data center in the Western Hemisphere in Louisiana. Meanwhile, Amazon and Anthropic are building a 1,200-acre AI campus in Indiana as states fast-track incentives to attract the next wave of industrial-scale compute hubs.

The spotlight has shifted from labs to substations. Power is the new fuel, and the ability to sustain it will decide who leads the world’s next technological leap.

OpenAI’s sweeping infrastructure plans arrive on the heels of a new $500 billion valuation, the highest ever for a private AI firm.
