Mainstream discussion of artificial intelligence (AI) has been dominated by a few big worries, such as whether superintelligent AI will wipe out humanity, or whether AI will take our jobs. Yet we’ve paid much less attention to the myriad other environmental and social impacts of our “consumption” of AI, which are arguably just as important.
Everything we consume has “externalities”, the unintended consequences of our consumption. One example is industrial pollution, an externality that harms both people and the environment.
The online services we use every day also have externalities, although public awareness of them seems to be much lower. Given the enormous growth in AI use, these effects cannot be ignored.
Effects of AI use on the environment
According to a 2019 estimate by French think tank The Shift Project, the use of digital technology produces more carbon emissions than the aviation industry. And although AI is currently estimated to account for less than 1% of total carbon emissions, the AI market is forecast to grow ninefold by 2030.
Tools such as ChatGPT are built on large language models (LLMs), which are sophisticated computational systems. Although we access these models online, they are trained and run in physical data centres that consume significant resources.
Last year, Hugging Face released an analysis of the carbon footprint of its own LLM, BLOOM, which is comparable in complexity to OpenAI’s GPT-3.
After accounting for the impacts of raw material extraction, manufacturing, training, deployment and end-of-life disposal, the model’s development and use were found to have produced the equivalent of 60 flights from New York to London.
Hugging Face estimated that GPT-3’s life cycle would produce ten times more emissions, since the data centres powering it run on a more carbon-intensive grid. And this is without accounting for the impacts of GPT-3’s raw materials, manufacturing and disposal.
GPT-4, OpenAI’s latest LLM, is rumoured to have trillions of parameters and a potentially enormous energy footprint.
Beyond this, running AI models consumes a striking amount of water. Data centres use cooling towers to cool the servers where AI models are trained and deployed. Google recently drew criticism for plans to build a new data centre in drought-stricken Uruguay that would use 7.6 million litres of water a day to cool its servers, according to the country’s Ministry of Environment (although the Minister for Industry has disputed the figures). Water is also needed to generate the electricity used to run data centres.
In a preprint published this year, Pengfei Li and colleagues presented a methodology for estimating the water footprint of AI models. They did this in response to a lack of transparency in how companies evaluate the water footprint associated with training and using AI.
They estimate training GPT-3 required somewhere between 210,000 and 700,000 litres of water, the equivalent of that used to manufacture between 300 and 1,000 cars. In a conversation of 20 to 50 queries, ChatGPT is estimated to “drink” the equivalent of a 500 millilitre bottle of water.
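The researchers’ per-conversation figure implies a rough per-query water cost. The short sketch below works through that back-of-envelope arithmetic; the daily query volume used for scaling is a hypothetical illustration, not a figure from the study.

```python
# Back-of-envelope estimate of ChatGPT's water use per query, based on
# the reported figure of roughly one 500 ml bottle per 20-50 queries.
# All inputs are the article's estimates; the daily volume is hypothetical.

BOTTLE_ML = 500                      # reported water "drunk" per conversation
QUERIES_LOW, QUERIES_HIGH = 20, 50   # queries per conversation (range)

ml_per_query_high = BOTTLE_ML / QUERIES_LOW   # 25.0 ml/query (upper bound)
ml_per_query_low = BOTTLE_ML / QUERIES_HIGH   # 10.0 ml/query (lower bound)

# Scale to a hypothetical 10 million queries per day
daily_queries = 10_000_000
litres_low = ml_per_query_low * daily_queries / 1000    # 100,000 L/day
litres_high = ml_per_query_high * daily_queries / 1000  # 250,000 L/day
print(f"{litres_low:,.0f} to {litres_high:,.0f} litres per day")
# prints "100,000 to 250,000 litres per day"
```

Even at the low end of the range, the water cost of serving queries at scale adds up to hundreds of thousands of litres a day.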
Social effects of using AI
LLMs often require extensive human input during the training phase. This work is frequently outsourced to freelance workers who face precarious working conditions in low-income countries, leading to accusations of “digital sweatshops”.
According to a Time article from January, Kenyan workers hired to label text data for ChatGPT’s “toxicity” detection were paid less than US$2 per hour while being exposed to distressing and explicit content.
LLMs can also be used to generate fake news and propaganda. Left unchecked, AI could be exploited to manipulate public opinion, and by extension could undermine democratic processes. In a recent study, researchers from Stanford University found AI-generated messages were consistently persuasive to human readers on topical issues such as carbon taxes and banning assault weapons.
Not everyone will be able to adapt to the AI boom. Widespread adoption of AI could worsen global wealth inequality. As well as causing significant disruption to the job market, it could particularly marginalise workers from certain backgrounds and in certain industries.
Are there solutions?
How AI affects us in the future will depend on a multitude of factors. Future generative AI models could, conceivably, be designed to use significantly less energy, but it’s hard to say whether they will be.
When it comes to data centres, their physical location, the type of power source they use and the time of day at which they are used can all significantly affect their overall energy and water consumption. Optimising these computing resources could yield substantial savings. Hugging Face, Google and Microsoft are among the companies that have promoted the use of AI and cloud services to manage resources more efficiently.
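One simple form this optimisation can take is scheduling workloads in whichever region has the cleanest electricity grid at the time. The sketch below illustrates the idea with made-up carbon intensity figures; the region names and numbers are hypothetical examples, not real grid data.

```python
# Illustrative sketch of carbon-aware workload placement: pick the region
# with the lowest grid carbon intensity before running an energy-hungry job.
# Region names and intensities (gCO2 per kWh) are invented for illustration.

regions = {
    "hydro_region": 30,   # e.g. hydro-dominated grid
    "mixed_grid": 250,    # e.g. mixed renewables and gas
    "coal_heavy": 700,    # e.g. coal-dominated grid
}

def greenest(intensities: dict) -> str:
    """Return the region whose grid currently has the lowest carbon intensity."""
    return min(intensities, key=intensities.get)

job_kwh = 1_000  # energy a hypothetical training job consumes
best = greenest(regions)
saved_g = (regions["coal_heavy"] - regions[best]) * job_kwh
print(f"Run in {best}, saving {saved_g:,} g CO2 vs the dirtiest grid")
# prints "Run in hydro_region, saving 670,000 g CO2 vs the dirtiest grid"
```

The same selection logic extends naturally to time of day, since grid carbon intensity varies hour by hour as the generation mix changes.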
Also, as direct or indirect consumers of AI services, we should all be aware that every chatbot query and every image generated consumes water and energy, and may carry implications for human labour.
AI’s growing popularity might spark a wave of sustainability standards and certifications. These would help users understand and compare the impacts of specific AI services, allowing them to choose those that have been certified. This would be similar to the Climate Neutral Data Centre Pact, under which European data centre operators have agreed to make their facilities climate neutral by 2030.
Governments will also play a part. The European Parliament has approved a draft law to mitigate the risks of AI use. And earlier this year, the US Senate heard testimony from a range of experts on how AI might be effectively regulated and its harms minimised. China has also published rules on the use of generative AI, requiring security assessments for products offering services to the public.