Can clean energy handle the AI boom?
By Vox
Summary
Topics Covered
- Data Centers Double Global Electricity Use
- AI Training Devours Household Electricity
- AI Disrupts Tech's Net Zero Pledges
- Data Centers Steal Renewables from Others
Full Transcript
I spent some time recently reading through a big spreadsheet of questions submitted by Vox's audience members.
And one of them caught my eye.
It was from Kathy, a retired schoolteacher in New York City.
What is the question that you wanted us to answer?
So the question is: can green energy even begin to handle the increased demands that AI, crypto, and cloud storage are going to put on our energy system?
It's a good question.
I've done some reporting on AI, but I've never thought much about the climate impact of all the AI products we're increasingly using,
or of all our digital belongings, like photos and documents and emails, getting stockpiled in servers around the world.
They need a lot of electricity, and that electricity has to come from somewhere. This is all happening while the climate crisis demands we use less energy, not invent new ways to use more of it.
Our climate goals already felt pretty impossible to me, but now it's almost like we haven't just moved the goalposts.
We've changed the entire game.
So let's get to the bottom of this.
Within Kathy's big question is a more basic one about how much electricity our digital lives require.
At first I was thrown off that Kathy mentioned things like cloud storage and AI and cryptocurrency in one category.
But then I realized that their electricity demands happen at the same place.
Data centers.
Ultimately, you're talking about machines loaded up in large facilities that perform computations.
And they need power. A lot of it.
They need water.
They need space.
I spoke to Alex de Vries.
He runs a research site called Digiconomist where he's dug into this exact topic.
Data centers are massive, often windowless warehouses that house thousands of servers that run virtually nonstop.
Some of the bigger data centers are as big as four football fields, and use as much electricity at any given time as 80,000 households.
There are more than 8,000 data centers around the world, and the US has more than any other country.
In 2022, data centers, artificial intelligence and cryptocurrencies made up about 2% of total global electricity demand.
But by 2026, that number is expected to double, which is like adding the amount of electricity used by the entire country of Sweden.
I'll explain why in a minute.
All right. Support for this video comes from Klaviyo.
Klaviyo works with businesses to turn their data into meaningful connections with their customers through AI powered email, text messages, and more.
According to Klaviyo, over 150,000 brands trust their data and marketing platform to build smarter digital relationships with their customers, during the holiday season and beyond.
Klaviyo has no editorial influence over our work, but they make videos like this possible.
Learn more at the link below.
That big jump from 2022 to 2026 is thanks to rising cloud storage and cryptocurrency electricity demands, but it's also because of the AI boom.
We know AI requires a ton of computational power, but exactly how much turns out to be a really difficult question to answer.
AI is a huge umbrella term that includes everything from basic statistical models that detect patterns in data, to generative AI that creates text and images and videos.
That's the most computationally intensive kind.
The thing is, the handful of private tech companies that dominate the AI field don't really disclose how much of their energy use is dedicated to AI specifically.
If you look at Google's latest environmental report, it clearly states they don't want to make a distinction between regular workloads and AI-specific workloads.
In these companies, AI models are mostly closed source, meaning no one knows exactly how they are built.
This has led some researchers to try to piece it together on their own.
Researchers looked at an open-source large language model called BLOOM that has roughly the same number of parameters as GPT-3, and found that training something like GPT-3 required almost 1,300 megawatt-hours of electricity.
That's about as much power as 130 US homes consume in one year.
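As a rough sanity check on that comparison, the arithmetic works out if you assume an average US household uses about 10,000 kilowatt-hours of electricity per year, a commonly cited ballpark that isn't stated in the video:

```python
# Rough arithmetic behind the training-energy comparison.
training_mwh = 1_300               # estimated electricity to train a GPT-3-scale model (from the video)
household_kwh_per_year = 10_000    # ASSUMED average US household consumption (not in the video)

training_kwh = training_mwh * 1_000
homes_for_one_year = training_kwh / household_kwh_per_year
print(f"{homes_for_one_year:.0f} US homes powered for a year")  # → 130
```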
Today, large language models like GPT-4 have hundreds of billions of parameters, if not a trillion.
And researchers say that the computational power required to train these models is expected to double every nine months.
So far it has mostly been large language models driving the AI energy boom.
Of course that could change going forward.
Now we see AI on the rise for image generation and, specifically, video generation. So far, we've talked about training a large language model.
Researchers also looked at energy use from people actually using it.
It's been estimated, by myself and others, that a single GPT interaction would take about three watt-hours, which is comparable to running a low-lumen LED bulb for an hour.
On its own, that doesn't sound like a whole lot, but it's roughly ten times a standard Google search, and it's the volume that matters.
And of course, if you're talking about millions or billions of interactions, the numbers start to stack up quickly.
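To see how a few watt-hours per interaction stacks up, here is a quick sketch. The 3 Wh figure comes from the video; the daily query volume of 100 million is purely a hypothetical number for illustration:

```python
# How small per-interaction energy stacks up at volume.
wh_per_interaction = 3               # from the video
interactions_per_day = 100_000_000   # HYPOTHETICAL daily volume, for illustration only

daily_mwh = wh_per_interaction * interactions_per_day / 1_000_000   # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1_000                                # MWh -> GWh
print(f"{daily_mwh:.0f} MWh per day, {annual_gwh:.1f} GWh per year")
```

At that assumed volume, the total lands around 300 MWh per day, or on the order of a hundred gigawatt-hours per year, from a single service.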
Alex took another research approach by looking at the hardware used for AI training and use.
Over 95% of the AI industry uses servers made by the company Nvidia.
They could sell 1.5 million of their AI servers by 2027.
He multiplied that by the information Nvidia publishes about each of its servers' energy demand.
He found that data centers devoted to AI alone could consume around 100 terawatt hours of electricity per year, or about the same as his home country of the Netherlands.
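The shape of that hardware-based estimate can be sketched as servers × per-server power × hours of nonstop operation. The video gives the server count and the ~100 TWh result but not the per-server figure, so the power draw below is an assumed value chosen to land near that result:

```python
# Back-of-the-envelope version of the hardware-based estimate.
servers = 1_500_000       # projected Nvidia AI server sales by 2027 (from the video)
kw_per_server = 7.6       # ASSUMED average power draw per server, chosen to land
                          # near the ~100 TWh figure; not stated in the video
hours_per_year = 24 * 365

twh_per_year = servers * kw_per_server * hours_per_year / 1e9   # kWh -> TWh
print(f"{twh_per_year:.0f} TWh per year")  # → 100 TWh per year
```

That is roughly the Netherlands' annual electricity consumption, matching the comparison in the video.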
There's a big part of Kathy's question I haven't gotten to yet: can green energy actually meet the demand from the world's data centers?
The good news is that using green energy is the stated goal of a lot of these companies.
Both Google and Microsoft have made pledges to be net zero by 2030, but there are signs that AI is disrupting those plans.
That's because solar and wind energy can't produce electricity all of the time, and these data centers need to be running all of the time.
In most cases, they will just have a backup connection to the power grid, which will have fossil fuels on it.
And it's not just intermittency: data centers are being built at a rate that renewable energy infrastructure can't keep up with.
It can take a year to build a data center, but many more years to get a solar or wind farm on an electrical grid.
Google's 2024 sustainability report showed that the company's emissions rose by 48% from 2019 to 2023, in large part due to its data center energy consumption, suggesting that integrating AI into their products could make reducing their emissions challenging.
There's already evidence in the US that coal plants that were meant to close are staying open because of data centers' electricity demands, and that state utilities are building new natural gas plants for the same reason.
But even if these tech companies can look good on their sustainability reports and get to net zero, there's still a problem.
The thing is, our renewable energy supply is globally limited.
So if we attribute an increasing part of it to the data center industry, the consequence is that there's less renewable energy available for everything else.
That probably means that, on the whole, we will end up using more fossil fuels anyway. With all this context, the answer to Kathy's question is that, for right now, we aren't prepared for renewable energy to meet the increasing demand of the world's data centers.
So what can we do about this, as users?
It would be extremely difficult to opt out of backing up our data to the cloud, or even to refrain from using AI.
I think AI is embedded in so many things that I'm not sure I'll have the option to say, I'm not using it, I'm out.
Researchers like Alex say the best place to start is to force more transparency from these tech companies.
Even the AI Act in the EU doesn't really force companies to disclose anything with regard to the environment.
And that's the EU, not even talking about the US yet,
which is lagging behind a bit on this matter.
Some environmental organizations and local communities are calling for moratoriums on data centers. And some researchers have proposed the idea of an energy-efficiency rating, so companies and consumers can choose data centers that are the most sustainable.
We could also hope that the servers and data centers will keep getting more energy efficient.
But more than anything, this issue emphasizes how desperately we need to be scaling up renewable energy, and fast: not only to meet ever-increasing data center demand, but so there's plenty of renewable energy to go around.
If you liked this video, you'll love Vox's new podcast, Explain It to Me.
I'm the host, JQ, and every week I call up a listener with a question, get them some answers, and we have some fun along the way.
You can find a link to the podcast in the description. While you're there, if you want to support Vox, you can check out the details on our membership program.