
How much energy does ChatGPT consume? More than you think, but it's not all bad news

AI requires a lot of electricity, but is that really such a bad thing?
Published on August 4, 2024

Crypto mining with a GPU (stock image)
Edgar Cervantes / Android Authority

Everything comes at a cost, and AI is no different. While ChatGPT and Gemini may be free to use, they require a staggering amount of computational power to operate. And if that wasn’t enough, Big Tech is currently engaged in an arms race to build bigger and better models like GPT-5. Critics argue that this growing demand for powerful — and energy-intensive — hardware will have a devastating impact on climate change. So just how much energy does AI like ChatGPT use and what does this electricity use mean from an environmental perspective? Let’s break it down.

ChatGPT energy consumption: How much electricity does AI need?

ChatGPT (stock photo)
Calvin Wankhede / Android Authority

OpenAI’s older GPT-3 large language model required just under 1,300 megawatt hours (MWh) of electricity to train, which is equal to the annual power consumption of about 120 US households. For some context, an average American household consumes just north of 10,000 kilowatt hours each year. That is not all — AI models also need computing power to process each query, which is known as inference. And to achieve that, you need a lot of powerful servers spread across thousands of data centers globally. At the heart of these servers are typically NVIDIA’s H100 chips, which consume 700 watts each and are deployed by the hundreds.
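
As a quick sanity check, the household comparison works out roughly like this, using the 1,300 MWh figure above and, as an assumption, about 10,500 kWh per US household per year (consistent with "just north of 10,000 kWh"):

```python
# Back-of-envelope: GPT-3 training energy expressed in US household-years.
# Figures: ~1,300 MWh to train GPT-3 (cited above); ~10,500 kWh per household
# per year is an assumption, not an official statistic.
training_energy_kwh = 1_300 * 1_000          # 1,300 MWh in kWh
household_kwh_per_year = 10_500

households = training_energy_kwh / household_kwh_per_year
print(f"Roughly {households:.0f} household-years of electricity")  # ~124, i.e. "about 120"
```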

Estimates vary wildly, but most researchers agree that ChatGPT alone requires a few hundred MWh every single day. Over a year, that is enough electricity to power thousands, perhaps even tens of thousands, of US households. And given that ChatGPT is no longer the only generative AI player in town, it stands to reason that usage will only grow from here.
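
The daily figure is anyone's guess, but picking a point in that "few hundred MWh" range, say 300 MWh per day purely as an assumption, shows how the annual household comparison falls out:

```python
# Illustrative only: ChatGPT's real daily consumption isn't public.
# Assumptions: ~300 MWh per day, ~10,500 kWh per US household per year.
daily_mwh = 300
household_kwh_per_year = 10_500

annual_kwh = daily_mwh * 1_000 * 365
print(f"About {annual_kwh / 1e6:.0f} GWh per year")                          # ~110 GWh
print(f"About {annual_kwh / household_kwh_per_year:,.0f} US households")     # ~10,000+
```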

AI could use 0.5% of the world's electricity consumption by 2027.

A paper published in 2023 attempts to estimate just how much electricity the generative AI industry will consume within the next few years. Its author, Alex de Vries, estimates that market leader NVIDIA will ship as many as 1.5 million AI server units by 2027. That would result in AI servers consuming 85.4 to 134 terawatt hours (TWh) of electricity each year, comparable to the annual power consumption of smaller countries like the Netherlands, Bangladesh, and Sweden.

While these are certainly alarmingly high figures, it’s worth noting that total worldwide electricity production was nearly 29,000 TWh just a couple of years ago. In other words, AI servers would account for roughly half a percent of the world’s electricity consumption by 2027. Is that still a lot? Yes, but it needs to be judged with some context.
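
That half-percent figure is simply the ratio of the paper's upper estimate to the global production number above:

```python
# Share of global electricity implied by the figures cited above.
ai_servers_twh = 134        # de Vries's upper 2027 estimate
world_twh = 29_000          # approximate recent global electricity production

print(f"AI servers ≈ {ai_servers_twh / world_twh:.2%} of global electricity")  # ≈ 0.46%
```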

The case for AI’s electricity consumption

AI may consume enough electricity to equal the output of smaller nations, but it is not the only industry to do so. As a matter of fact, data centers that power the rest of the internet consume far more than those dedicated to AI, and demand on that front has been growing regardless of new releases like ChatGPT. According to the International Energy Agency, all of the world’s data centers consume around 460 TWh a year today. However, the trendline has been rising sharply since the Great Recession ended in 2009; AI had no part to play in it until late 2022.

Even if we take the researcher’s worst-case scenario from above and assume that AI servers will account for 134 TWh of electricity, that figure pales in comparison to the world’s overall data center consumption. Netflix alone used enough electricity to power 40,000 US households in 2019, and that number has certainly grown since, yet you don’t see anyone clamoring to end internet streaming as a whole. Air conditioners, meanwhile, account for a whopping 10% of global electricity consumption, or roughly 20 times AI’s worst-case 2027 consumption estimate.
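
For scale, here is the air-conditioning comparison spelled out, reusing the roughly 29,000 TWh global figure from earlier:

```python
# Air conditioning vs. AI's worst-case 2027 estimate, using figures cited above.
world_twh = 29_000
ac_twh = world_twh * 0.10       # air conditioners: ~10% of global electricity
ai_worst_case_twh = 134         # de Vries's upper 2027 estimate

print(f"Air conditioning ≈ {ac_twh:,.0f} TWh, about {ac_twh / ai_worst_case_twh:.0f}x AI's worst-case estimate")
# ≈ 2,900 TWh and ~22x, in line with the rough 20x figure above
```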

AI's electricity usage pales in comparison to that of global data centers as a whole.

AI’s electricity consumption also invites comparison with the controversy surrounding Bitcoin’s energy usage. Much like AI, Bitcoin faced severe criticism for its high electricity consumption, with many labeling it a serious environmental threat. Yet the financial incentives of mining have driven its adoption in regions with cheap and renewable energy sources. This is only possible because of the abundance of electricity in such regions, where it might otherwise be underutilized or even wasted. All of this means we should really be asking about the carbon footprint of AI, not just focusing on raw electricity consumption figures.

The good news is that like cryptocurrency mining operations, data centers are often strategically built in regions where electricity is either abundant or cheaper to produce. This is why renting a server in Singapore is significantly cheaper than in Chicago.

Google aims to run all of its data centers on 24/7 carbon-free energy by 2030. And according to the company’s 2024 environmental report, 64% of its data centers’ electricity usage already comes from carbon-free energy sources. Microsoft has set a similar target and its Azure data centers power ChatGPT.

Increasing efficiency: Could AI’s electricity demand plateau?

Galaxy AI transcription summary on a Samsung Galaxy S24
Robert Triggs / Android Authority

As generative AI technology continues to evolve, companies have also been developing smaller and more efficient models. Ever since ChatGPT’s release in late 2022, we’ve seen a slew of models that prioritize efficiency without sacrificing performance. Some of these newer AI models can deliver results comparable to those of their larger predecessors from just a few months ago.

For instance, OpenAI’s recent GPT-4o mini is significantly cheaper than the GPT-3.5 Turbo it replaces. The company hasn’t divulged efficiency numbers, but the order-of-magnitude reduction in API costs indicates a big reduction in compute costs (and thus, electricity consumption).
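
OpenAI doesn’t publish per-query energy numbers, so the following is only a sketch of that reasoning, with placeholder prices rather than OpenAI’s actual rates: if serving a newer model costs roughly an order of magnitude less per token, and pricing broadly tracks compute, the energy per query likely drops by a similar factor.

```python
# Illustrative only: placeholder prices, not OpenAI's published rates.
# Logic: API price per token is a rough proxy for compute (and energy) per token.
old_price_per_m_tokens = 2.00   # hypothetical older-model price per million tokens
new_price_per_m_tokens = 0.20   # hypothetical newer-model price, ~10x cheaper

factor = old_price_per_m_tokens / new_price_per_m_tokens
print(f"~{factor:.0f}x cheaper per token, suggesting a comparable drop in compute and energy per query")
```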

We have also seen a push for on-device processing of tasks like summarization and translation, which can be handled by smaller models. While you could argue that new software suites like Galaxy AI still increase power consumption on the device itself, that extra draw can be offset by the productivity gains they enable. I, for one, would gladly trade slightly worse battery life for the ability to get real-time translation anywhere in the world. The sheer convenience can make the modest increase in energy consumption worthwhile for many others, too.

Still, not everyone views AI as a necessary or beneficial development. For some, any additional energy usage is seen as unnecessary or wasteful, and no amount of increased efficiency can change that. Only time will tell if AI is a necessary evil, similar to many other technologies in our lives, or if it’s simply a waste of electricity.
