They might be called “small nuclear reactors”, but don’t be fooled: the 500MW of capacity Google is buying from Kairos Power is enough to power a midsize city. To get a sense of the scale of the demand AI puts on the electricity grid, keep in mind that this is only enough to cover one datacentre campus equipped to handle the growing demands of AI. One company alone, OpenAI, is trying to get the White House to sign off on building at least five datacentres, each needing 5GW of power – 10 times as much as the entire Google deal.
The reason for this nuclear power rush: the vast energy consumption of the computer chips (called graphics processing units, or GPUs) that power the training of the large language models crucial to the development of AI. And the demand does not end with training: a ChatGPT query needs nearly 10 times as much electricity to process as a Google search.
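As a rough back-of-envelope check on that comparison, the sketch below uses commonly cited external estimates of roughly 0.3 watt-hours for a conventional search and about 2.9 watt-hours for a ChatGPT query. Both figures, and the 1bn-queries-a-day scale-up, are illustrative assumptions rather than numbers reported in this article.

```python
# Back-of-envelope comparison of per-query electricity use.
# The per-query figures are assumed, commonly cited estimates
# (not numbers reported in this article).
GOOGLE_SEARCH_WH = 0.3   # assumed watt-hours for a conventional search
CHATGPT_QUERY_WH = 2.9   # assumed watt-hours for a ChatGPT query

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"ChatGPT query vs search: roughly {ratio:.1f}x the electricity")

# Scale up to a hypothetical 1bn queries a day for a year (Wh -> GWh).
queries_per_day = 1_000_000_000
annual_gwh = CHATGPT_QUERY_WH * queries_per_day * 365 / 1e9
print(f"At 1bn queries a day, that is about {annual_gwh:,.0f} GWh a year")
```

On those assumptions the ratio comes out at just under 10, which is where the “nearly 10 times” comparison comes from; the annual total of roughly 1,000GWh gives a sense of why per-query efficiency matters at scale.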
“GPUs are more advanced and more powerful than the CPUs [central processing units] of the previous generation of datacentres,” Chris Stokel-Walker said. “So there’s more demand there immediately. But we are also starting to see massive ‘megaclusters’ of GPUs. It’s not just the individual chips getting bigger and needing more power: it’s the race to get as many together to amplify their power as possible.”
How much impact will AI’s demand for power have?
“The challenge in estimating this is that the companies are pretty coy about telling us their power usage,” said Chris. “But there is a settled understanding that the energy used by datacentres is going to increase hugely as AI becomes layered into everything we do.”
The increase in demand is already significant: where the average datacentre drew 10MW of power a decade ago, it needs 100MW today. And the biggest can demand more than 600MW each.
The New York-based Uptime Institute, which has created a benchmarking system that is now the industry standard, predicts that while AI accounts for only 2% of global datacentres’ power use today, that figure will reach 10% by next year. “The growth in power consumption is not linear,” Chris said. “In the same way that we used to have whacking great transistors behind our TVs and now we have flatscreens with eco-friendly modes, they are getting more efficient. But that doesn’t mean it’s not going upwards – just that it’s going up more slowly.”
How are tech companies trying to get the power to meet their needs?
By building it or paying others to do so. And because most governments expect that control of AI will be crucial to their ability to compete globally in the future, tech firms have a very strong hand when negotiating what to build and where.
“The argument tech companies are making, and that they’re trying to cement in the minds of decision-makers around the world, is: you either buy into this and sign up, or you run the risk of falling behind,” Chris said.
This New York Times piece lays out a case study of how that plays out in practice. It reports that as part of a recent fundraising effort, OpenAI’s CEO, Sam Altman, told executives at a Taiwanese semiconductor company that it would cost about $7tn (£5.6tn) to fulfil his vision of 36 semiconductor plants and additional datacentres. That’s about a quarter of the total US annual economic output. OpenAI denies that claim, and says that its plans run to the hundreds of billions of dollars.
Altman has also been considering building these centres in other countries, including the United Arab Emirates. But there are fears in Washington that placing them there could give China a back door to American AI advances, because of the links between Chinese and Emirati universities. At the same time, Altman is exploring plans for centres within the US.
“The warning is being used as a stick alongside the carrot,” Chris said. “They’re saying: if you don’t do this, we will go elsewhere, and you will not just lose the investment, but face a national security risk.”
What is the potential impact on the climate?
Big tech companies insist they are leaning into renewable sources of power as much as possible – and argue that AI could ultimately be a crucial tool to limit the damage caused by the climate crisis.
It is true that tech firms’ investment in renewable sources of energy has played an important part in the growth of those sources. But the idea that AI will help defeat the climate crisis is a theoretical benefit that would not be realised until some point in the fairly distant future. And there are claims that the emissions caused by the current energy usage of datacentres owned by the likes of Google, Microsoft and Meta are much higher than the companies admit publicly.
In this piece published last month, Isabel O’Brien reported that big tech firms are using renewable energy credits – which represent clean power that may not actually supply the datacentres themselves, and which may not even reduce emissions – to artificially deflate their reported emissions. On that basis, the actual figures could be more than seven times higher than the numbers they report.
What about the use of nuclear power?
Google says its deal makes it the first company in the world to buy nuclear energy from small nuclear reactors. But Amazon and Microsoft have already struck deals with conventional, larger nuclear power plants in the US this year. Don’t panic, but Microsoft’s deal will restart a nuclear reactor at Three Mile Island in Pennsylvania – the site of the worst nuclear meltdown in US history – that has been dormant for five years. Sensibly, the plant’s operators are emphasising its record of safe operation since the 1979 disaster at another reactor on the site – and renaming it.
With datacentres estimated to be on track to produce about 2.5bn tonnes of carbon dioxide equivalent emissions by 2030, there is an environmental argument for the use of nuclear power. But that is a highly controversial case, which, because of the associated risks, has been the subject of charged democratic debate for many years. Wherever you stand on that question, it is remarkable that these companies appear to be able to simply settle it for themselves.
“One of the things that’s really striking here is what it says about how tech companies operate: as supranational organisations that manage to bend countries’ regulation to their will,” Chris said.
On the other hand, Google argues its investment in small nuclear reactors could be a necessary boost to a technology that has struggled to get off the ground. “In the end, some of this does trickle down,” said Chris. “They tend to commercialise technologies in a safe way. But it takes a long time, and the benefits are unequally distributed.”
Can governments bring these changes under control?
There are well-documented issues with regulating tech firms: without globally enforced agreements, there will always be another country ready to offer a better deal. See, for example, Ireland’s status as the European home of many big tech firms because of its favourable tax regime.
Regulation does not necessarily need to be globally agreed to be effective, however: in California, for example, new legislation intended to combat greenwashing will soon require all private companies with global revenue above $1bn to publish details of their carbon footprint. Since any big tech firm is bound to want to maintain operations in California, that could have much wider ramifications.
The impact of attempts at regulation, and at better data collection on the growth of AI, may also depend on whether tech firms cooperate willingly – and, if not, whether there is an appetite to force them to. The controversy over renewable energy credits is an example of how vexed even apparently positive steps can be.
And big tech firms have a valuable card in their hand: the desperate desire among governments around the world to win the AI race. “These companies point to astronomical figures of expected improvements in GDP and they say, this is the wave that is coming,” Chris said. “You can either ride it, or drown.”