OpenAI CEO Sam Altman took the stand before Congress on Capitol Hill to testify on the future of AI regulation. During the hearing, he spelled out a hard truth about the limits of how far we can take AI: “The cost of AI will converge to the cost of energy.”
AI is often framed as a purely digital phenomenon, operating seamlessly in the intangible realm of code and algorithms. But behind every image generated and every response crafted lies a significant and measurable energy cost. The technology we all use relies on minerals, chips, semiconductors, and data centers where our data is churned and processed. Technology requires energy to power it, and the extraction of scarce minerals to build it. So, as we think about accelerating technology, what lies ahead is not just a computational challenge but an infrastructural and ecological one.
Despite AI’s promise as a technology with seemingly unlimited potential, there is a very real limiting factor to its growth. In his testimony, Altman skipped the fluff: “Eventually, chips, network gear… will be made by robots, and we’ll make that very efficient and cheaper and cheaper, but an electron is an electron.” This is a fundamental economic principle that will shape AI’s future. As AI manufacturing processes become increasingly automated and optimized, the variable costs of hardware production will steadily decline. What remains immutable is the physics of computation itself: the energy required to power these systems. In a mature AI economy, the marginal cost of intelligence will approach the marginal cost of electricity. This creates a direct relationship between energy innovation and AI capabilities; regions with abundant, reliable, and affordable energy will gain decisive advantages in computational power.
Energy is the primary limiting factor to innovation. Some estimates suggest the U.S. will need up to 90 additional gigawatts, the output of roughly 90 nuclear power plants, to power data centers. The nations or companies that can generate clean energy at scale will effectively set the ceiling on what’s possible in artificial intelligence. There are two steps the U.S. must take to build an energy-efficient AI infrastructure and lead the global AI race: measure AI emissions at scale, and treat energy policy and AI policy as intertwined rather than separate interests.
Carbon footprint represents the most comprehensive metric we have for quantifying the true future cost of AI systems. However, OpenAI and other major companies do not disclose how carbon-intensive their models are, leaving users in the dark about the energy footprint of their AI use. Despite the lack of transparency, data scientists have found reasonably precise ways to estimate the carbon cost of various popular models. In the first week after OpenAI released its image generation tool, users created 700 million images. Each image uses about 7 watt-hours of electricity. Together, that’s nearly 5 million kilowatt-hours—roughly the same amount of energy 24,000 U.S. homes use in a week. Organizations that measure and optimize for carbon efficiency now will gain crucial advantages as regulations tighten and energy costs fluctuate. As energy becomes the limiting factor in AI advancement, carbon intensity directly translates to economic competitiveness. While today’s market may not fully price environmental externalities, forward-thinking policies and market pressures increasingly will.
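The back-of-envelope arithmetic above can be checked in a few lines. The 700-million-image count and 7 Wh/image figure come from the estimates cited here; the average U.S. household consumption (about 10,700 kWh per year) is an approximate EIA-style ballpark, not a figure from this article.

```python
# Sanity-check of the image-generation energy estimate above.
# The per-image and per-household figures are rough estimates.
IMAGES = 700_000_000          # images generated in the first week
WH_PER_IMAGE = 7              # estimated watt-hours per image
HOME_KWH_PER_YEAR = 10_700    # approximate average U.S. household usage

total_kwh = IMAGES * WH_PER_IMAGE / 1_000       # convert Wh to kWh
home_kwh_per_week = HOME_KWH_PER_YEAR / 52      # weekly household usage
homes_for_a_week = total_kwh / home_kwh_per_week

print(f"Total energy: {total_kwh:,.0f} kWh")        # ~4,900,000 kWh
print(f"Equivalent homes: {homes_for_a_week:,.0f}")  # ~24,000 homes
```

The result, about 4.9 million kWh powering roughly 24,000 homes for a week, matches the figures quoted above.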
By implementing carbon accounting and reporting frameworks for AI operations today, companies and governments can establish the measurement infrastructure needed to drive meaningful optimization. These metrics provide essential visibility into not just where energy is being consumed, but how efficiently it’s being utilized across model training, inference, and supporting infrastructure.
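The core of such a carbon accounting framework is simple: metered (or estimated) energy per workload multiplied by the carbon intensity of the grid that powers it. Below is a minimal sketch of one such record; the class name, fields, and sample intensity values are illustrative assumptions, not any standard reporting schema.

```python
# Minimal sketch of per-workload operational carbon accounting:
# emissions = energy consumed x grid carbon intensity.
# All names and sample numbers are illustrative.
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    energy_kwh: float          # metered or estimated energy draw
    grid_gco2_per_kwh: float   # carbon intensity of the local grid

    def emissions_kg(self) -> float:
        """Operational CO2 in kilograms: kWh x gCO2/kWh, scaled to kg."""
        return self.energy_kwh * self.grid_gco2_per_kwh / 1_000


# Example: the same 1,000 kWh inference job on two different grids.
jobs = [
    Workload("inference-avg-grid", energy_kwh=1_000, grid_gco2_per_kwh=380),
    Workload("inference-hydro", energy_kwh=1_000, grid_gco2_per_kwh=30),
]
for job in jobs:
    print(f"{job.name}: {job.emissions_kg():.0f} kg CO2")
```

Even this toy comparison shows why measurement matters: the identical workload emits an order of magnitude more carbon on a high-intensity grid, which is exactly the visibility such reporting frameworks are meant to provide.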
I recently worked with a major advertising technology company, and the experience illustrated this principle in action. When they migrated from CPUs to GPUs, our measurements revealed a 62% reduction in carbon emissions and 55% less water usage—significant efficiency gains achieved through precise measurement and targeted infrastructure changes. This demonstrated that another path forward is possible, and it starts with tracking and optimizing. Starting this measurement process immediately allows organizations to identify optimization opportunities, establish performance baselines, and develop the organizational capabilities needed for sustainable AI leadership.
According to the Carbon Disclosure Project, “Failure to tackle climate-related risks in supply chains costs nearly three times more than the actions required to mitigate these risks.” Businesses and governments investing in AI infrastructure today must also invest in ways that sustain innovation for generations to come. As incredibly ambitious AI projects take hold, we have no option but to supercharge innovation to support the most efficient models. The race for artificial intelligence supremacy may ultimately transform into a race for energy supremacy, with the most significant breakthroughs coming not from Silicon Valley’s software engineers but from innovations we can make in using AI more sustainably.
In a Senate Commerce Committee hearing with Altman in May, Senator Ted Budd (R-NC) said, “The ability for the U.S. to deploy new energy generation capacity and upgrade its grid is in many ways the key to the race against China. Energy is how we can win and it’s also how we can lose.” According to the International Energy Agency, China is already a global leader in renewable energy, set to account for almost half of the world’s total renewable power by the end of the decade. While the U.S. is hyper-focused on competing with China’s AI models, competing effectively for global AI leadership also requires leading in the operational capacity to power those models. Energy and AI are inextricably linked; this approach must be reflected in our policy.
And while transitioning to renewable energy is crucial for sustainable AI development, scaling renewable energy infrastructure to meet AI’s exponentially growing demands will require decades of coordinated investment and policy alignment. We simply cannot afford to wait that long. To adequately respond to the rush of demand today, we can start to spot and mitigate inefficiencies by first capturing the carbon and water footprint of generative AI.
The Artificial Intelligence Environmental Impacts Act of 2024, introduced by Senator Ed Markey of Massachusetts, is one of the first bills that sets out to align our AI ambitions with our energy realities, calling for the development of comprehensive measurement and reporting tools that take into account the full range of AI’s environmental impact.
Without concerted action now, we risk embedding wasteful practices into the foundations of an economy increasingly powered by AI. We must adopt a policy-forward strategy that incentivizes energy-efficient data centers through consistent measurement and tracking, and that demands efficiency wherever it can be found.
Last week, Sam Altman laid out his vision for how AI can transform humanity by the 2030s, and he reiterated the possibilities that lie ahead: “[Intelligence and energy] have been the fundamental limiters on human progress for a long time; with abundant intelligence and energy (and good governance), we can theoretically have anything else.”
There’s no doubt that the cost of AI is energy. Let’s make sure it’s sustainable.