Climate change is the “defining issue of our generation,” Microsoft CEO Satya Nadella wrote in last year’s annual shareholder letter, adding that artificial intelligence “can be a powerful accelerant in addressing the climate crisis.”
Nvidia (NVDA) CEO Jensen Huang has similarly said that AI will advance the field of climate science.
And Sam Altman, OpenAI’s CEO, a former venture capitalist and one of the industry’s most recognizable figures, has said AI could help humanity in many ways, including “curing cancer and addressing climate change.”
Despite these optimistic claims, the real relationship between AI and climate change is far more complicated.
Part of that relationship comes down to the sheer size and energy consumption of AI models.
The technology behind ChatGPT, known as large language models (LLMs), is notorious for consuming far more processing power than a conventional web search. Running such models therefore carries a substantial environmental cost in electricity and water. And depending on where the data centers housing these models sit and what energy sources feed the local grid, the carbon footprint of all that electricity use is probably quite high, though it remains largely unknown.
Training Llama 3 alone, for example, released 2,290 tons of carbon dioxide, Meta reported. For comparison, an average gas-powered car emits one ton of carbon dioxide for roughly every 2,500 miles driven, according to the U.S. Environmental Protection Agency (EPA).
To emit that much, an average American driver, who covers about 13,489 miles per year, would have to keep driving for around 420 years.
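For readers who want to check that comparison, here is a minimal back-of-the-envelope script using only the figures cited above (the EPA’s roughly one ton of CO2 per 2,500 miles and the 13,489-mile average annual mileage); the variable names are ours, not Meta’s or the EPA’s:

```python
# Back-of-the-envelope check of the Llama 3 driving comparison.
# Inputs are the figures cited in this article, not independent measurements.
TRAINING_EMISSIONS_TONS = 2_290   # CO2 reported for Llama 3's training
MILES_PER_TON = 2_500             # EPA: ~1 ton of CO2 per 2,500 miles in an average gas car
MILES_PER_YEAR = 13_489           # average annual mileage for a U.S. driver

equivalent_miles = TRAINING_EMISSIONS_TONS * MILES_PER_TON
equivalent_years = equivalent_miles / MILES_PER_YEAR

print(f"{equivalent_miles:,} miles, or about {equivalent_years:.0f} years of average driving")
# -> 5,725,000 miles, or about 424 years of average driving
```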
Google and OpenAI did not respond to requests for comment on the electricity use and carbon impact of their AI models.
Google, which aims to reach net-zero carbon emissions by 2030 (with the help of carbon offsets), released 10.2 million metric tons of carbon dioxide equivalent in 2022. For scale, Finland, a country of about 5.5 million people, released 45.8 million metric tons of CO2 equivalent that same year.
Google also used 5.6 billion gallons of water in 2022, a 20% increase over the previous year, most of it in its data centers. According to the company’s environmental report, it replenished only 6% of the water it consumed that year, despite a commitment to replenish 120% by 2030.
Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside, said Google’s replenishment goal merely “makes their water accounting look nicer,” emphasizing that “the water is still consumed.”
Anthropic declined to comment.
In response, Microsoft stated that it is “investing in research to measure the energy use and carbon impact of AI while working on ways to make large systems more efficient, in both training and application.” The company declined to share information on the energy consumption or carbon footprint of its AI models.
Bobby Hollis, Microsoft’s vice president of energy, said the company is “making more investments in renewable energy procurement and other initiatives to achieve our sustainability targets of being carbon negative, water positive, and waste neutral by 2030.”
According to its environmental report, Microsoft released over 13 million metric tons of carbon dioxide equivalent in 2022. Its global water use jumped 34% that year, to 1.7 billion gallons, a spike Ren attributes to the company’s investments in AI.
Data centers in the United States used 200 terawatt-hours of electricity in 2022, or 4% of the nation’s total electricity consumption, according to a recent report from the International Energy Agency (IEA).
The IEA expects that figure to keep rising.
The industry is aware of AI’s enormous energy requirements; Altman himself said in January that a new energy “breakthrough” would soon be needed to fuel AI.
Sasha Luccioni, a researcher focused on AI and sustainability, responded at the time that the comment was emblematic of the broken relationship between AI and the environment: there is no magic way to produce additional energy, she argued, so for now the answer is to stop putting generative AI into everything and to cut its energy use.
The second aspect of the relationship between AI and climate change is how genuinely useful the technology is. The technology that made Altman’s OpenAI famous is not built to mitigate climate change: generative AI models such as ChatGPT are designed to boost business productivity by producing synthetic text on demand.
Gary Marcus, an AI researcher and cognitive scientist, argues that these LLMs won’t transform materials science or save the climate.
As he told the New York Times last year, he believes the main benefit will be efficiency: people can be more productive because they don’t have to type as much. In his view, these tools could boost productivity enormously while tearing society apart.
That does not mean, however, that AI cannot be used to mitigate climate change. Various more finely tuned forms of AI and machine learning are already being used to analyze large volumes of data, recognizing patterns that can ultimately lead to meaningful action.
Artificial Intelligence’s promise for combating climate change
Konstantin Klemmer, a geospatial machine learning researcher and a member of Climate Change AI, a global NGO working at the intersection of AI and climate change, describes the primary distinction between AI and machine learning as one of size and specificity. The term “AI,” Klemmer said, is frequently used for very large, general-purpose models, whereas “machine learning” typically refers to smaller models tailored to a particular task.
The smaller the model, the lower its electricity demand.
Klemmer added that weighing the trade-offs is “critical” when applying AI or machine learning to climate change.
The environmental cost of building and running a very large model may be justified if the model can later deliver an outsized positive climate impact, he said. What matters, in his view, is approaching these large models with a more systematic, formalized cost-benefit analysis. How that trade-off actually plays out, however, remains highly ambiguous.
Rather, Klemmer continued, “what we’re seeing is just pushing forward on these models irrespective of their cost-benefit analysis.”
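As a purely illustrative sketch of the kind of formalized cost-benefit accounting Klemmer is calling for (the function and every number below are hypothetical assumptions, not figures from any study), the question reduces to whether a model’s expected emission savings over its lifetime outweigh the emissions from training and running it:

```python
# Hypothetical cost-benefit sketch for a large model's climate impact.
# Every input below is an illustrative placeholder, not a measured value.

def net_climate_impact(training_tons: float,
                       inference_tons_per_year: float,
                       avoided_tons_per_year: float,
                       lifetime_years: float) -> float:
    """Net tons of CO2 avoided over the model's lifetime.

    Positive: deployment avoids more emissions than training and inference produce.
    Negative: the model is a net emitter.
    """
    emitted = training_tons + inference_tons_per_year * lifetime_years
    avoided = avoided_tons_per_year * lifetime_years
    return avoided - emitted

# Example: 2,000 tons to train, 500 tons/year to run, credited with avoiding
# 3,000 tons/year (say, through better forecasting), over a five-year lifetime.
print(net_climate_impact(2_000, 500, 3_000, lifetime_years=5))  # -> 10500.0
```

The hard part, as Klemmer suggests, is that the “avoided” term is rarely measured at all, which is why the trade-off remains so ambiguous.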
LLMs, he asserted, are not built with the climate in mind. The perception that these models could help humanity cope with and mitigate climate change rests, Klemmer said, on the “far-fetched” claim that LLMs are a first step toward artificial general intelligence (AGI), a hypothetical AI system with intelligence comparable to or greater than that of humans, which is OpenAI’s stated goal.
Klemmer is among many researchers who doubt AGI is achievable in the near future; some, such as Marcus, argue that LLMs are not a path to AGI at all.
But LLMs are not the only kind of AI. As geospatial models grow “bigger and bigger,” Klemmer said, they help scientists build better climate models and understand the planet in finer detail. He offered this as an example of an emission-intensive effort that is probably worth the emissions, because these models support adaptation to climate change.
The key word is ‘efficiency’
Klemmer sees efficiency as the key to AI’s potential to mitigate climate change, though not the same efficiency Marcus described.
Machine learning, Klemmer said, is at its core about optimization, and the planet is full of high-emission systems that machine learning projects could make more efficient.
AI-driven optimization can significantly lower the carbon footprint of homes, buildings, and inefficient electricity networks, according to Klemmer. In 2022, the National Renewable Energy Laboratory found that air conditioning alone accounted for almost 4% of worldwide emissions.
For Klemmer, the main link between machine learning and climate mitigation is not AGI; more often, it is large-scale energy optimization and straightforward pattern analysis. His biggest wish right now is for stronger incentives for businesses to explore and deploy this kind of technology.
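To make that concrete, here is a minimal, hypothetical sketch of the kind of straightforward optimization Klemmer describes: shifting a flexible electrical load (pre-cooling a building, for instance) toward the hours when the grid’s forecast carbon intensity is lowest. The forecast numbers and function names are invented for illustration and do not describe any company’s actual system.

```python
# Illustrative load-shifting sketch: run a flexible load (e.g., pre-cooling)
# during the hours with the lowest forecast grid carbon intensity.
# The forecast below is made up for demonstration purposes.

forecast_g_per_kwh = {  # hour of day -> grams of CO2 per kWh (hypothetical)
    0: 420, 3: 380, 6: 350, 9: 310,
    12: 250, 15: 270, 18: 430, 21: 460,
}

def cleanest_hours(hours_needed: int) -> list[int]:
    """Pick the hours with the lowest forecast carbon intensity."""
    ranked = sorted(forecast_g_per_kwh, key=forecast_g_per_kwh.get)
    return sorted(ranked[:hours_needed])

def emissions_kg(load_kwh: float, hours: list[int]) -> float:
    """Emissions if the load is spread evenly across the chosen hours."""
    per_hour = load_kwh / len(hours)
    return sum(per_hour * forecast_g_per_kwh[h] for h in hours) / 1000

dirty = emissions_kg(40, [18, 21])            # run during the dirtiest evening hours
clean = emissions_kg(40, cleanest_hours(2))   # shift to the cleanest hours: [12, 15]
print(f"{clean:.1f} kg vs {dirty:.1f} kg CO2")  # -> 10.4 kg vs 17.8 kg CO2
```

Real systems layer weather forecasting, demand prediction, and control constraints on top of this, but the underlying logic of predicting and then optimizing is the pattern Klemmer points to.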
Even as the world’s largest technology companies pour ever more resources into building and deploying AI, many of them are making (mostly modest) efforts to apply it to something other than drafting emails and generating images.
Google, for instance, sums up its “AI for Social Good” initiative on its website with a series of questions: What if people could predict natural disasters before they happen? What if they could track disease as it spreads, to stamp it out sooner? AI is not a miracle cure, the company acknowledges, but it can help.
One of Google’s climate-focused projects uses AI to forecast where and when floods will occur, giving people time to evacuate before disaster strikes.
Another optimizes the timing of traffic lights to reduce emissions from vehicles on the road.
Microsoft also runs an “AI for Good” research lab with a mission that closely mirrors Google’s. One of its climate-related projects uses AI to map and closely monitor the Amazon rain forest, giving researchers better data to help prevent deforestation; another monitors and tracks renewable energy sources around the world.
IBM said last year that its focus on enterprise AI, rather than consumer-facing chatbots, sets it apart from other big-name tech companies. Like Google and Microsoft, though, it has been exploring sustainability-focused uses of AI.
Christina Shim, global head of product management and strategy for IBM Sustainability Software, and Justina Nixon-Saintil, IBM’s vice president and chief impact officer, discussed in an interview the company’s use of AI for climate-positive purposes and the business case for sustainability.
IBM for the environment
Since making its geospatial AI model publicly available last year, IBM has announced a number of sustainability-focused collaborations and projects.
In one of these initiatives, scientists at the Mohamed bin Zayed University of Artificial Intelligence are using IBM’s geospatial model to map and reduce urban heat islands in Abu Dhabi. By the end of December, the effort had cut the heat-island effect by more than 5.4 degrees Fahrenheit.
The company has also worked with the Kenyan government to accelerate its forestry efforts, and with groups across the United Kingdom to map urban areas and identify spots where trees could be planted to reduce future flood risk.
These partnerships grew alongside IBM’s Sustainability Accelerator, a pro bono initiative the company launched in 2022.
Through the Accelerator, IBM has also worked extensively with American farmers, using AI models to give small farmers better information about water use. The effort has led to higher crop yields, particularly in arid states such as Texas.
IBM recently announced that it will contribute $45 million to the Accelerator over the next five years. For context, the company posted $3.29 billion in net income on $17.38 billion in revenue in the fourth quarter of 2023.
IBM uses its own technology internally to track energy consumption and improve efficiency, yet a recent environmental report states that the company still consumed around 2.3 million megawatt-hours of energy in 2023. IBM said renewable electricity accounted for 70.6% of that usage, and put its resulting annual emissions at 364,000 metric tons of carbon dioxide.
IBM aims to source 90% of its electricity from renewables by 2030 and to be carbon neutral by the same year.
Sustainability is ingrained in our society, Nixon-Saintil said, and IBM has been substantially involved in it for a long time, playing a leadership role.
Shim added that purpose and profit don’t always have to conflict with one another.
Yet even as IBM promotes its nonprofit sustainability projects, it works with the oil, gas, and chemical industries, offering its technology to help them lower operating costs, introduce new products, open new markets, pursue mergers and acquisitions, and explore new avenues for value creation.
Part of those services involves enhanced energy monitoring for environmental reporting and emissions.
The sustainable business
In February, IBM surveyed 5,000 C-suite executives worldwide. Three-quarters of respondents said sustainability “drives better business results,” and roughly 70% said sustainability efforts deserve more attention.
Citing the survey results, Shim remarked, “This is a boardroom discussion that needs to happen,” adding that the business case grows stronger as more information becomes available.
Shim believes businesses are beginning to see AI’s potential as a sustainability tool, pointing to the company’s partner examples: there are ways, she said, for them to maximize benefits while reducing environmental impact in the process.
IBM sells a range of sustainability-oriented tools as part of its AI-enabled enterprise product stack, all focused on raising operational efficiency within businesses by improving data flows so companies can better understand, and ultimately reduce, their energy use.
The balance
Given the high environmental cost of using such technology, however, AI’s potential as a climate tool hinges on that kind of cost-benefit analysis.
Luccioni, speaking last year, described a trend toward one-size-fits-all solutions: everyone is plugging LLMs into everything to see what sticks, she said, which could drive up compute and energy use simply because LLMs are trendy and everything now supposedly needs one.
Asked how IBM balances its pursuit of sustainable AI use cases with its push to advance generative AI for corporate productivity, Nixon-Saintil and Shim said the company has invested heavily in research on foundation models, which need to be trained only once before being adapted for a variety of uses. Neither shared figures on the energy use or carbon footprint of IBM’s models.
There are ways, Shim said, to maximize AI’s benefits while minimizing its harm to the environment.