Imagine shopping for a new laptop. You find a model that has many more features than your current one but uses far more electricity. (Ten times as much? Thirty times? Nobody at the store can tell you, because the exact figure is a company secret.)
Oh, and this laptop has a funnel on top. Every time you ask it to tell you a joke, or to draw a funny picture you just thought of, it tells you it needs its water supply topped up, but it won’t say by how much. Would you consider that upgrade worthwhile? Maybe not, if you’re worried about a planet that is getting warmer and thirstier by the day.
Yet that laptop, or something like it, is the end product of our current AI gold rush. Why only “something like it”? Because hazy estimates are all we have to go on. We still don’t know the true carbon dioxide cost of each AI prompt, or how much groundwater it takes to keep the thousands of servers processing those prompts cool. Researchers can sketch a rough picture; companies like Google, Microsoft, and OpenAI could paint a far more accurate portrait, but they choose not to.
Indeed, says Sasha Luccioni, a TED talk star, ten-year veteran of AI energy research, and climate lead at Hugging Face, an open-source AI platform, since ChatGPT launched in 2022 “there’s been a general crackdown on information.”
Luccioni says, with growing annoyance: “I don’t know of a single company offering AI tools that provides information on energy usage and carbon footprint. Even the size of GPT-like models is unknown. All of it is kept confidential and never shared.”
In other words, AI-hungry but otherwise climate-conscious giants like Google and Microsoft have carved out an exception. They will tell you how many kilograms of carbon your next flight will emit, but not what it costs the planet when AI writes your next term paper or paints a portrait of the Pope in a puffy jacket.
And probably for good reason: if we knew the environmental impact of AI products, we might start shaming each other for using them carelessly.
AI makes us all dirtier
We do know the scope of the issue because tech companies still want to be regarded as responsible environmental citizens. Google disclosed in its 86-page 2024 sustainability report that its overall greenhouse gas emissions increased by 48% between 2019 and 2023, with the majority of that increase occurring since 2022.
That’s not great news, given that Google still aims to reach net zero emissions by 2030. Nor is Microsoft’s 2024 sustainability report, which shows a 29.1% rise in emissions since 2020.
Both companies point the finger at third parties, particularly those who build their data centers. They also note, fairly, that these data centers do much more than answer AI prompts, which is one of the main reasons the energy cost of AI is so hard to pin down.
Still, the AI-committed corporations can’t entirely ignore what is driving this sudden construction boom: data centers that are “designed and optimized to support AI workloads,” in Microsoft’s words.
Google’s report concedes that the company has a long way to go to meet its 2030 goal. That’s an understatement, considering that data center energy demand is predicted to grow 160% by 2030. According to a Goldman Sachs estimate from May 2024, data centers’ carbon dioxide emissions could more than double between 2022 and 2030.
And who gets the blame for that increase? Google’s report handles it in an artfully passive sentence: reducing emissions may prove difficult because AI computation keeps demanding more energy.