Generative artificial intelligence is unavoidable online right now. Every time you run a Google search, an AI-generated summary may appear at the top of the results. Open Facebook, and you may be prompted to try Meta's AI tool.
Since OpenAI's groundbreaking release of ChatGPT in late 2022, there has been a rush to integrate AI into as many online interactions as possible. Almost two years on, generative AI remains the talk of Silicon Valley, and AI features powered by large language models are now woven throughout online user experiences.
According to Sajjad Moazeni, a computer engineering researcher at the University of Washington, the back-end processes required by any generative AI model are fundamentally different from those behind conventional services like email or Google Search. For those basic services, the amount of data shuttled back and forth between processors was relatively small. By contrast, Moazeni estimates that generative AI applications demand 100 to 1,000 times more computational power.
The technology's energy requirements for training and deployment are no longer generative AI's dirty little secret: expert after expert last year forecast spikes in energy consumption at the data centers where companies develop AI applications. Seemingly on cue, Google recently stopped considering itself carbon neutral, and Microsoft may miss its environmental targets in the ongoing race to build the most capable AI.
These data centers draw power roughly in proportion to the amount of computation they perform, so their carbon footprint and energy consumption scale linearly with the calculations they run, according to Junchen Jiang, a networked systems researcher at the University of Chicago. The larger the AI model, the more computation it requires, and these frontier models are becoming enormous.
Google spokesperson Corina Standiford said it would not be accurate to say the company's energy usage spiked during the AI race, even though Google's overall energy use doubled between 2019 and 2023. According to Standiford, reducing emissions from the company's suppliers, which account for 75 percent of its carbon footprint, is especially challenging. Among those suppliers are the manufacturers of servers, networking equipment, and other technical infrastructure for the data centers, since producing the physical hardware behind frontier AI models is an energy-intensive process.
Even as data center energy demands rise, they still represent a small portion of humanity's total energy use. Buildings, transportation, and oil refineries have a bigger impact for now, according to Fengqi You, an energy systems engineering researcher at Cornell. “Those industries currently use a lot more energy than AI data centers,” he says. That said, AI's energy footprint may grow in the near future as generative AI tools are adopted more widely and built into more online platforms.
Thirsty Work
In addition to all that electricity, the data centers that train and run generative AI models require millions of gallons of water.
“There is a relatively small amount of water that humans can use. It's just the freshwater from the surface and groundwater,” says Shaolei Ren, a responsible-AI researcher at UC Riverside and coauthor of Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models. The data centers, he says, are simply evaporating that water into the atmosphere.
Though the two might seem similar at first, the environmental impact of a business running a massive data center in a neighborhood is not the same as that of locals who take a few extra bubble baths a week or forget to turn off the faucet while brushing their teeth. Typical residential users, Ren explains, are only withdrawing water: they take it from the utility and almost immediately return it to the sewer, so they are not really consuming it. A data center, by contrast, takes that same utility water and evaporates it into the atmosphere. Water used by data centers may take a year to cycle back down to earth, he says.
In an email, Alistair Speirs, a senior director of Azure global infrastructure at Microsoft, acknowledges that AI is driving the expansion of data centers but stresses that the broader shift to cloud computing is another important factor. When much of the growth is replacing hardware that previously ran on-premises, he notes, it can appear that data centers are growing quite rapidly. According to Speirs, Microsoft aims to be carbon negative, water positive, and zero waste by the end of the decade.
You, the Cornell researcher, emphasizes the need to keep transitioning to renewable energy sources, and he questions the efficacy of the carbon-offset schemes some firms lean on in their sustainability efforts. Offsetting is a temporary fix, he says, better than nothing but not an ultimate solution. Ren feels similarly about water-replenishment efforts: better than inaction, but still insufficient. He argues that firms of significant size should account not just for their direct consumption but also for the water footprint of their supply chains.
Google and Microsoft are, of course, not the only formidable competitors in the AI space. Melanie Roe, a Meta representative, did not respond to a follow-up email on the subject.
Leaders in Power
Technology corporations frequently present AI development as essential to innovation, and even as part of the climate solution, rather than as a threat to the environment. To lessen both AI's immediate impact and its cost, researchers and developers are investigating more efficient hardware chips that reduce the energy needed to build AI tools. They are also experimenting with smaller, less computationally intensive AI models.
Beyond environmental concerns, these data centers' energy requirements could strain local power grids. Moazeni points to a Microsoft data center under construction in Quincy, Washington, where, as he understands it, many residents worry that the facility will effectively consume all of the energy in the area. The server farms that develop and run AI models may end up competing with nearby businesses and residents for electricity, potentially causing blackouts during peak hours.
Bobby Hollis, Microsoft's vice president of energy, says in an email that the company works with the relevant authorities and utilities to prevent any disruption to local services. According to him, Microsoft builds the infrastructure necessary to ensure that residents see no decline in their utility service.
Users who want to be responsible with their energy consumption may find themselves at a loss. Even if you don't actively seek out generative AI tools, they can be difficult to avoid, since they are increasingly included as standard features in operating systems, web apps, and everyday software. Whether you're logging in to an online job site or simply using the internet to chat with friends, it's nearly impossible to click around without encountering chatbots offering information summaries and promises of increased efficiency.
And while AI already feels pervasive, it will continue to seep into more of our online lives. As it does, the upper limits of its energy and water consumption remain unknown.