Artificial intelligence is a notorious energy consumer. The models that underpin tools like ChatGPT require enormous computing resources, and to keep them readily available, stacks of servers housed in remote data centers gobble up vast amounts of emissions-generating electricity.
It’s hard to pinpoint exactly how much energy AI consumes as a whole, because these data centers handle all sorts of tasks. But to get a sense of the total amount, a researcher named Alex de Vries came up with a workaround. Since Nvidia’s servers are used by the vast majority of the AI industry, de Vries multiplied the number of servers expected to ship by 2027 by the amount of electricity each one consumes. He concluded in an analysis published in 2023 that AI servers could consume roughly as much electricity as a small country in a year.
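For a sense of the arithmetic, here is a minimal back-of-envelope sketch of that approach, using illustrative inputs in the ballpark of what was reported around de Vries’s analysis rather than his exact figures.

```python
# Back-of-envelope sketch of a de Vries-style estimate.
# The inputs are illustrative assumptions, not de Vries's exact figures.

servers_shipped_by_2027 = 1_500_000   # assumed annual Nvidia AI server shipments
power_per_server_kw = 6.5             # assumed draw per server, in kilowatts
hours_per_year = 24 * 365

annual_twh = servers_shipped_by_2027 * power_per_server_kw * hours_per_year / 1e9
print(f"Estimated AI server consumption: ~{annual_twh:.0f} TWh per year")
# prints ~85 TWh per year, on the order of a small country's annual electricity use
```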
At first glance, this huge energy consumption and the emissions that come with it seem like reason enough for any sustainability-minded brand or retailer to steer clear of AI. Consumers are paying attention to this issue as well. Recently, when bag brand Baggu was hit by negative online reaction over a collaboration with Collina Strada that used AI-generated designs, one of the main criticisms from commenters was the environmental impact of AI.
But while AI’s appetite for energy is undeniable, it’s not unique, either. Essentially everything brands do in their business has an impact, and it can be much bigger than their use of AI. When PwC studied its own adoption of generative AI, it determined that the corresponding annual emissions would represent “a fraction” of those from business travel.
AI can also help brands be more efficient, particularly by enabling them to better predict demand and avoid overproduction.
So how concerned should brands really be about the environmental impact of AI?
Much of the nervousness about AI’s energy consumption is due to the recent rise of large language models. The process of training these models, which involves ingesting huge amounts of data, has a significant impact in itself. In 2019, researchers determined that training a large AI model produced about five times more emissions than an average car does over its lifetime. That sounds catastrophic, but perhaps less so when you consider that the number of cars on the road far outnumbers the number of AI models being trained.
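Assuming the study in question is the widely cited 2019 UMass Amherst analysis, the headline comparison works out roughly like this (the figures are approximate, as reported in coverage of that study):

```python
# Rough arithmetic behind the "five times a car's lifetime" comparison,
# assuming the study is the widely cited 2019 UMass Amherst analysis.
# Figures are approximate, as reported in coverage of that study.

training_lbs_co2e = 626_000        # large model trained with neural architecture search
car_lifetime_lbs_co2e = 126_000    # average US car, including fuel, over its lifetime

print(f"Training / car lifetime: ~{training_lbs_co2e / car_lifetime_lbs_co2e:.1f}x")
# prints ~5.0x
```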
Of course, training is only part of the equation. PwC’s study suggests that for business users, the biggest impact will come from actually using these models over time. Part of the reason for this is how they work.
“Every time you query the model, the whole system is activated, which is extremely computationally inefficient,” Sasha Luccioni, a computer scientist at AI company Hugging Face, told the BBC earlier this year.
Luccioni worked on a study that examined the energy consumption of different tasks performed by AI models. It found that, to generate 1,000 images, “the least efficient image generation model consumes as much energy as 522 smartphone charges (11.49 kWh)”, or about half a charge per image. (The caveat is that there was a lot of variation between image-generation models, depending in part on the size of the images being generated.) Text-based tasks were much more efficient, though their energy use still adds up.
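Those figures fit together with a little arithmetic; the per-charge energy below is the assumption implied by the study’s own comparison:

```python
# Unpacking the image-generation numbers quoted above. The per-charge figure
# of 0.022 kWh is the assumption implied by the study's own comparison.

energy_per_1000_images_kwh = 11.49
smartphone_charge_kwh = 0.022   # assumed energy to fully charge an average smartphone

charges_equivalent = energy_per_1000_images_kwh / smartphone_charge_kwh
per_image_fraction = (energy_per_1000_images_kwh / 1000) / smartphone_charge_kwh

print(f"~{charges_equivalent:.0f} smartphone charges per 1,000 images")   # ~522
print(f"~{per_image_fraction:.2f} of a charge per image")                 # ~0.52
```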
While that’s a significant amount of energy, each individual use isn’t really a problem in and of itself, and how concerned you are may depend on your frame of reference. A separate group of researchers, primarily from the University of California, compared the carbon emissions of using AI for writing and illustration tasks with those of humans doing the same work. They found that the AI actually produced fewer emissions if you factor in the time it takes a human to write a page or create an illustration and the emissions produced by running a computer during that time.
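As a rough sketch of the kind of comparison the researchers made, consider one AI query versus a human doing the same task on a computer; every number below is a placeholder for illustration, not a value from the study:

```python
# Illustrative sketch of the comparison described above: emissions from one
# AI query versus a human doing the same task on a computer. All numbers
# here are placeholders for illustration, not the study's actual values.

grid_kg_co2_per_kwh = 0.4        # assumed grid carbon intensity
ai_query_kwh = 0.01              # assumed energy for one generation request
human_hours_per_page = 1.0       # assumed time for a human to write a page
computer_watts = 75              # assumed draw of the human's laptop

ai_kg = ai_query_kwh * grid_kg_co2_per_kwh
human_kg = (human_hours_per_page * computer_watts / 1000) * grid_kg_co2_per_kwh

print(f"AI query:        ~{ai_kg * 1000:.1f} g CO2e")    # ~4 g
print(f"Human + laptop:  ~{human_kg * 1000:.1f} g CO2e") # ~30 g
```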
The comparison is probably not perfect. An AI system can quickly write a page of text or produce an image, but a human might then need to go back and edit the results. Even so, if it saves enough time, the AI could still compare favorably to a human working on an electrically powered computer, presumably saving their work to the cloud, which is really just another name for remote servers. (The fact that AI could replace workers and that popular generative models have been trained on creative work used without consent are separate, if no less serious, issues.)
At this point, it can be easy to forget how much energy we already use in our daily working lives. Data centers are major drivers of the growth in electricity consumption around the world, and not just because of AI. They power everything from cloud storage to video calls to internet services, as Ars Technica pointed out in a story about AI energy consumption, and they already consume huge amounts of electricity to do so.
Artificial intelligence will push that figure higher still, even as its developers try to create more efficient systems. The International Energy Agency predicts that electricity consumption from data centers, artificial intelligence, and the cryptocurrency sector could double by 2026.
Meanwhile, data centers will also be busy powering the rest of the digital world: social media, e-commerce, video streaming, online gaming, and more, all of which require energy. Ars Technica also noted that, according to a 2018 study, PC gaming consumed nearly as much electricity as de Vries estimated AI will use in the coming years.
So should brands be concerned about the impact of AI? Absolutely. But they also need to be aware of the impact they’re having in other ways, because the environmental issues surrounding AI aren’t unique to AI. Brands consume energy and generate emissions in their day-to-day operations, too. More important than whether or not a company chooses to use AI is whether it’s considering, and trying to reduce, its overall impact.