The amount of electricity needed to power the world's data centres is expected to double in the next five years, according to an International Energy Agency (IEA) report.
Most of the increase is being driven by the development and use of increasingly powerful AI models.
By 2030, data centres could consume 945TWh (terawatt hours) of electricity each year, the report found - around three times the annual electricity consumption of the entire UK.
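As a rough sanity check of that comparison - assuming UK annual electricity consumption of roughly 300TWh, a figure not given in the report - the numbers line up:

```python
# Back-of-envelope check of the IEA comparison (UK figure is an assumption).
projected_data_centre_demand_twh = 945   # IEA projection for 2030
uk_annual_consumption_twh = 300          # assumed rough UK annual electricity use

ratio = projected_data_centre_demand_twh / uk_annual_consumption_twh
print(f"Projected data centre demand is about {ratio:.1f}x UK annual consumption")
# -> roughly 3x, consistent with the report's comparison
```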
The rise in demand, predicted to be highly concentrated around the world's tech and population hubs, will put pressure on utility companies, grid infrastructure and the planet.
"AI is one of the biggest stories in the energy world today," says Fatih Birol, executive director of the IEA.
"In the United States, data centres are on course to account for almost half of the growth in electricity demand; in Japan, more than half; and in Malaysia, as much as one-fifth."
Big tech firms have been reluctant to give details about how much energy they use to train and operate their AI models.
But the report estimates that training GPT-4, one of the more powerful large language models developed by OpenAI, took 14 weeks and used around 42GWh (gigawatt hours) of electricity.
That's equivalent to the daily electricity consumption of around 28,500 households in the developed world, or 70,500 households in poorer countries.
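One way to read that comparison - a rough sketch assuming average household consumption of about 15kWh per day in advanced economies and 6kWh per day in poorer countries, figures not stated in the report - is that the model's average daily draw over the 14-week run matched the daily consumption of roughly that many households:

```python
# Rough reconstruction of the household comparison (consumption figures assumed).
training_energy_gwh = 42
training_days = 14 * 7                                        # 14 weeks

daily_draw_kwh = training_energy_gwh * 1e6 / training_days    # GWh -> kWh per day

households_developed = daily_draw_kwh / 15   # assumed ~15 kWh/day per household
households_poorer = daily_draw_kwh / 6       # assumed ~6 kWh/day per household

print(f"~{households_developed:,.0f} developed-world households")    # ~28,600
print(f"~{households_poorer:,.0f} households in poorer countries")   # ~71,400
```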
And using an AI to perform a task - known as inference - can be very energy intensive too.
To generate a six-second video clip, an AI model requires eight times more electricity than it takes to charge a mobile phone, or nearly twice that needed to charge a laptop, the report concludes.
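Taken at face value - and assuming typical battery capacities of roughly 15Wh for a phone and 60Wh for a laptop, which the report does not spell out - the two comparisons point to a similar figure:

```python
# Implied energy for a six-second AI video clip, under assumed battery capacities.
phone_charge_wh = 15       # assumed typical smartphone battery
laptop_charge_wh = 60      # assumed typical laptop battery

via_phone = 8 * phone_charge_wh     # "eight times a phone charge"
via_laptop = 2 * laptop_charge_wh   # "nearly twice a laptop charge"

print(f"{via_phone} Wh vs {via_laptop} Wh")   # both around 120 Wh
```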
In the US, data centres, largely being built to train and operate AI, are expected to consume more electricity by 2030 than the manufacturing of all the nation's energy-intensive goods including aluminium, steel, cement and chemicals, the report found.
But the agency also predicts that AI will be an essential tool for managing future energy demand, engineering more efficient data centres and accelerating the development of new, cleaner sources of electricity generation.
Two main shifts have driven the AI revolution and its incredible demand for power.
The cost of "compute" - the processors and associated servers to build data centres - has fallen by 99% since 2006.
Meanwhile, the amount of compute used to train and run state-of-the-art AI models has increased by a mind-boggling 350,000-fold in just a decade.
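To put that growth in perspective - a simple calculation, not a figure from the report - a 350,000-fold increase over ten years works out at roughly a 3.6-fold increase every year, or a doubling of compute every six to seven months:

```python
import math

# Implied annual growth from a 350,000-fold increase in compute over a decade.
total_growth = 350_000
years = 10

annual_factor = total_growth ** (1 / years)                    # ~3.6x per year
doubling_months = 12 * math.log(2) / math.log(annual_factor)   # ~6.5 months

print(f"~{annual_factor:.1f}x per year, doubling every ~{doubling_months:.1f} months")
```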
Energy demand could outstrip supply
Depending on the energy sources used, AI development could drive up carbon emissions and the water consumed to cool servers.
American tech firms are already struggling to secure enough power for their growing fleets of data centres, as well as the computing hardware needed to fill them.
A survey by Reuters of 13 major US power providers found nearly half had received requests from data centre companies for amounts of power that would exceed their current peak demand.
How that demand will be met is one of the key uncertainties in the IEA report.
Countries aggressively pursuing AI development are unlikely to risk blackouts to meet AI energy demand, so they will need to build far more electricity generation.
It is not clear how quickly that can happen, nor how quickly the energy efficiency of data centres and the AI models they run will improve.
One of the greatest uncertainties is the impact of Donald Trump's tariffs, introduced after the report was completed.
Yet the US president's attack on the global trade status quo could directly and significantly impact data centre and AI development in the US and beyond.
High tariffs on China are predicted to choke off supplies of the raw materials needed to build new energy infrastructure - in particular low-carbon technologies such as solar panels, wind turbine motors and the batteries used to store renewable electricity.
Demand for low-carbon generation was surging in the US before Mr Trump's election - a large chunk of that coming from tech companies wanting to power data centres.
The US president has promised to boost US coal production to power AI, but it's far from certain whether power companies will choose to build new plants, given their high cost relative to some low-carbon alternatives.
The time new plants take to build could also mean electricity supply lags well behind the IEA's forecast for data centre electricity demand.
China, on the other hand, already gaining fast on the US in AI development, may find low-carbon electricity becomes cheaper and quicker to build if its clean energy exports to the US dry up because of tariffs.
(c) Sky News 2025: Amount of electricity needed to power world's data centres expected to double in five years