Radical Talks

Powering the AI Era

Featured speaker: Varun Sivaram, Founder and CEO of Emerald AI


From our latest Radical Talks episode with Varun Sivaram, Founder and CEO of Emerald AI

Everyone’s racing to build the most powerful AI. Fewer people are asking whether the lights will stay on. The U.S. wants to plug in 50 gigawatts of AI data centers in the next three years. The grid can realistically support about half that. 

In a recent Radical Talks conversation, Varun Sivaram, founder and CEO of Emerald AI, former senior energy official in the Biden White House, and former CTO of India’s largest solar company, makes the case that energy has become the defining challenge of the AI revolution. And unlike the chip problem, it isn’t solved by manufacturing more of something. It requires rethinking the relationship between AI infrastructure and the grid itself.

The Gap Between Demand and Supply

The numbers are stark. The United States hopes to connect 50 gigawatts of AI data centers within three years, yet the grid can realistically absorb only about half that. China, by contrast, will have 400 gigawatts of spare capacity by 2030. As Sivaram puts it: “They don’t have an energy problem. They have a chips problem. We have an energy problem, not a chips problem.”

Data centers already represent roughly 5% of America’s electricity consumption. By 2030, that figure could reach 9–17%. The scale of new demand is real — and the grid is not equipped to absorb it quickly. Building more generation and transmission alone won’t get us there fast enough. There has to be a smarter path.

Three Levers of Flexibility

Emerald’s platform, called Conductor, orchestrates flexibility across three dimensions.

Temporal flexibility: slowing or rescheduling computations with some delay tolerance — fine-tuning runs, batch inference jobs — by adjusting GPU clock frequencies or pausing non-urgent workloads.

Spatial flexibility: routing inference queries across locations at millisecond speed. If the Phoenix grid is stressed on a hot afternoon, a query shifts to Chicago or Virginia. No other economic actor can relocate activity at the speed of light. Data centers can.

Resource flexibility: deploying on-site batteries or generators to offset grid draw at peak moments. As one tool among three, coordinated by software, storage becomes a powerful complement rather than a prohibitively expensive standalone solution.

Together, these levers give grid operators reliable, auditable demand response and give compute customers the performance and continuity they need.
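To make the three levers concrete, here is a minimal sketch of how a dispatcher might choose among them when a site's local grid is stressed. This is a hypothetical illustration, not Emerald's actual Conductor logic: the `Site` type, the stress scores, and the `plan_flexibility` function are all invented for this example.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    grid_stress: float  # hypothetical score: 0.0 (relaxed) to 1.0 (emergency)
    battery_mwh: float  # on-site storage available

def plan_flexibility(site: Site, peers: list[Site]) -> list[str]:
    """Toy dispatcher: pick flexibility actions for one stressed site."""
    if site.grid_stress < 0.5:
        return ["no action: grid is healthy"]
    actions = []
    # Temporal: delay-tolerant jobs (fine-tuning, batch inference) slow first.
    actions.append("temporal: throttle GPU clocks on batch and fine-tuning jobs")
    # Spatial: send latency-sensitive inference to the least-stressed peer site.
    calm = min(peers, key=lambda s: s.grid_stress, default=None)
    if calm is not None and calm.grid_stress < site.grid_stress:
        actions.append(f"spatial: reroute inference queries to {calm.name}")
    # Resource: cover the remaining draw from on-site batteries at the peak.
    if site.battery_mwh > 0:
        actions.append("resource: discharge on-site battery to offset grid draw")
    return actions

# Example mirroring the article: a stressed Phoenix site with calmer peers.
phoenix = Site("Phoenix", grid_stress=0.9, battery_mwh=50.0)
peers = [Site("Chicago", 0.2, 0.0), Site("Virginia", 0.4, 0.0)]
for action in plan_flexibility(phoenix, peers):
    print(action)
```

The ordering reflects the article's framing: delay-tolerant compute absorbs stress first, rerouting handles latency-sensitive work, and batteries serve as one coordinated tool among three rather than a standalone fix.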

Proven in the Field

Emerald has conducted commercial demonstrations at five data centers across three continents. In London, Conductor ran 200 grid stress tests with the utility National Grid and passed every one with full compliance, including a 4 a.m. simulated lightning strike handled autonomously and a Premier League halftime tea-kettle spike absorbed without disrupting high-priority workloads.

National Grid’s response: “If all AI data centers acted this way, we would be tripping over ourselves competing to attract them.”

Later this year, Emerald plans a 100-megawatt commercial-scale demonstration in Virginia with NVIDIA, PJM, Dominion Energy, and Digital Realty — designed to prove the technology at full AI factory scale.

Two Futures

Looking ahead, Sivaram sees two diverging paths. In the first, data centers become islands — off-grid, fortress systems that hollow out the public grid and raise costs for everyone. In the second, the data grid and the physical power grid fuse. AI factories become the most stabilizing force on the grid, keeping energy costs low while accelerating AI deployment.

“I’m less concerned with where Emerald sits,” Sivaram says. “I’m more concerned with whether, as a society, we’ve managed to fuse those two grid networks.”

This conversation is a reminder that AI progress isn’t just about algorithms. It’s about energy, infrastructure, and the industrial systems that translate intelligence into action.

This post is based on insights from Radical Talks, a podcast from Radical Ventures exploring innovation at the frontier of AI. For more conversations with leaders in AI, subscribe wherever you get your podcasts.