Artificial intelligence is a notorious energy guzzler. The models that underlie tools like ChatGPT demand huge computational resources, quietly supplied by stacks of servers hidden away in remote data centres that gobble up vast quantities of emissions-generating electricity.
Exactly how much energy AI as a whole uses is hard to say, since those data centres handle all sorts of tasks, but to come up with some sense of the total, a researcher named Alex de Vries found a workaround. Because Nvidia's servers are used by the great majority of the AI industry, de Vries multiplied the number expected to ship by 2027 by the amount of electricity each one uses. He concluded in an analysis published in 2023 that AI servers could consume about as much electricity as a small country uses in a year.
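For a concrete sense of how that kind of estimate is assembled, here is a minimal back-of-envelope sketch in Python. The server count and per-server power draw below are illustrative placeholders, not de Vries' published inputs; the point is only the shape of the arithmetic.

```python
# Back-of-envelope sketch of the approach described above: multiply the number
# of AI servers expected to ship by the electricity each one draws over a year.
# All figures are hypothetical placeholders, not de Vries' actual inputs.

servers_shipped = 1_500_000   # hypothetical count of Nvidia AI servers by 2027
power_draw_kw = 6.5           # hypothetical average draw per server, in kilowatts
hours_per_year = 24 * 365

total_twh_per_year = servers_shipped * power_draw_kw * hours_per_year / 1e9
print(f"Estimated AI server demand: ~{total_twh_per_year:.0f} TWh per year")
# With these placeholder numbers the result lands in the tens of terawatt-hours,
# roughly the annual electricity consumption of a small country.
```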
That tremendous energy consumption and the emissions that come with it at first glance seem like reason enough for any brand or retailer that cares about sustainability to keep AI at a distance. Consumers are on alert about the issue, too. Recently, when the bag brand Baggu was hit with online backlash over a collaboration with Collina Strada that made use of AI-generated designs, one of the major points of criticism from commentators was AI's environmental impact.
But while AI's appetite for energy is undeniable, it's also not exceptional. Essentially everything brands do in the course of business has some impact, and it can be much greater than their use of AI. When PwC studied its own adoption of generative AI, it determined that the related annual emissions would amount to "a fraction" of those from business travel.
AI can also help brands be more efficient, including by allowing them to better predict demand and avoid overproducing.
So how much should brands really worry about AI's environmental impact?
Much of the nervousness over AI's energy use is due to the recent rise of large language models. The process of training these models, which entails ingesting huge amounts of data, has a high impact on its own. Back in 2019, researchers determined that training a large AI model produced about five times the lifetime emissions of the average car. That sounds catastrophic, but perhaps less so when you consider how many more cars there are on the road than AI models being trained every day.
Of course, training is just one part of the equation. The PwC study noted that, for corporate users, the biggest impact will come from actually using these models over time. That's in part because of how they work.
"Every time you query the model, the whole thing gets activated, so it's wildly inefficient from a computational perspective," Sasha Luccioni, a computer scientist at the AI company Hugging Face, told the BBC earlier this year.
Luccioni worked on a study that looked at the energy use of different tasks performed by AI models. It found that, in order to generate 1,000 images, "the least efficient image generation model uses as much energy as 522 smartphone charges (11.49 kWh), or around half a charge per image generation." (One caveat was that there was a lot of variation across the image-generation models, depending on the size of the image generated.) Text-based tasks were much more efficient, though they still add up.
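Those figures are easy to sanity-check. The short sketch below simply re-derives the "half a charge per image" claim from the two numbers the study reports; the implied energy per smartphone charge is a derived value, not one the study states directly.

```python
# Re-deriving the per-image figure from the study's cited numbers:
# 11.49 kWh to generate 1,000 images with the least efficient model,
# described as equivalent to 522 smartphone charges.

kwh_per_1000_images = 11.49
smartphone_charges = 522

kwh_per_charge = kwh_per_1000_images / smartphone_charges  # ~0.022 kWh per full charge (derived)
kwh_per_image = kwh_per_1000_images / 1000                 # ~0.0115 kWh per image
charges_per_image = kwh_per_image / kwh_per_charge         # ~0.52, i.e. about half a charge

print(f"{kwh_per_charge:.3f} kWh per charge, {charges_per_image:.2f} charges per image")
```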
While that's a significant amount of energy, each individual use isn't so much a problem on its own, and how concerning you deem it may depend on your frame of reference. A separate group of researchers, predominantly from the University of California, compared the carbon emissions of using AI for writing and illustrating tasks versus having humans perform them. The AI systems actually yielded fewer emissions than humans when factoring in how long it takes a human to write a page or create an illustration and the emissions produced by running a computer over that period.
It's arguably not a perfect comparison. An AI system could quickly write a page of copy or produce an image, but the results may then require a human to go back and edit. Still, if it saves enough time, the point stands that it could compare favourably with a human working at an electricity-powered computer, probably saving their work to the cloud, which is really just another name for remote servers. (The fact that AI could displace workers and that popular generative models are trained on creative work without consent are separate, if no less serious, matters.)
To that point, it can be easy to forget how much energy we use in our daily working lives. Data centres are significant drivers of the world's growing electricity use, and that's not just because of AI. They power everything from cloud storage to video calls to internet services, as Ars Technica pointed out in a story about AI's energy use, and already swallow up huge amounts of electricity to do it.
AI will add to that total, even as its developers try to build more efficient systems. The International Energy Agency predicts the electricity use of data centres, AI and the cryptocurrency industry could double by 2026.
Meanwhile, data centres will be busy powering the rest of the digital world, too, including social media, e-commerce, streaming video, online gaming and more, all of it requiring energy. Ars Technica also noted that, according to one 2018 study, PC gaming consumed nearly as much electricity as de Vries estimated AI would use in the next few years.
So should brands be concerned about AI's impact? They absolutely should. But they should also be aware of the impact they're having in other ways, because the environmental problems around AI aren't unique to the technology. Simply running their daily operations uses energy and generates emissions. What matters more than whether a company chooses to use AI is whether it's considering, and trying to reduce, its impact overall.