The success of artificial intelligence is causing the carbon footprint of industry players to soar

2024-07-04 03:00:00

The widespread adoption of this technology, which relies on ultra-energy-hungry graphics processing units (GPUs), is closely linked to the explosion in the carbon footprint of Google and the other Gafam; Microsoft's emissions also rose by 29% in 2023 compared to 2020, according to AFP. In its report, Google confirms that as AI is integrated into its products, "reducing emissions may prove difficult". The Californian multinational cites increased energy needs, as AI requires more computing power.

On average, a ChatGPT query is estimated to consume ten times more electricity than one on a classic search engine.


Google also mentions emissions related to its investments in infrastructure, i.e. building new data centers or modernizing existing ones.

Currently, AI accounts for 3% of the consumption of these data centers, a figure that should rise to 10% by 2030, estimates the server designer and manufacturer 2CRSI. According to the white paper on generative AI by the Data for Good collective, manufacturing a server dedicated to training AI models emits approximately 3.7 tonnes of CO2 equivalent (CO2eq), for an average lifespan of six years. That measurement is only partial, however, and needs to be completed. Anne Yvrande-Billon, director of economy, markets and digital at Arcep (the French telecoms regulator), calls for a global assessment of the impact of artificial intelligence that would integrate the consumption of water and energy, but also of the mineral and metal resources used to manufacture chips.
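As an order of magnitude, the white paper's figure can be spread over the server's stated lifespan; the annualization below is our own back-of-the-envelope arithmetic, not a number published by Data for Good:

```python
# Annualize the embodied emissions of one AI training server,
# using the Data for Good figures quoted above.
embodied_co2_tonnes = 3.7   # manufacture of one training server, tCO2eq
lifespan_years = 6          # average lifespan

annual_co2_tonnes = embodied_co2_tonnes / lifespan_years
print(f"~{annual_co2_tonnes:.2f} tCO2eq per server per year")  # ~0.62
```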

The energy consumption of generative AIs is linked to two main tasks: the preliminary training of language models and the inference phase, i.e. their use on real problems. The first, on which the majority of the literature focuses to date, sees its energy requirements increase as models grow larger to gain in performance. "The first GPT model contained 120 million parameters, compared to 175 billion for GPT-3. Today, the exact architecture of GPT-4 is unknown, but it could be ten times larger than its predecessor, according to some estimates," underlines Samuel Rincé, co-author of the Data for Good white paper. Knowing that training GPT-3 consumed 700,000 liters of fresh water and emitted 552 tonnes of CO2eq, one can imagine the impact of GPT-4. Despite these few figures, no complete and methodologically detailed evaluation is available, in particular because of the incomplete disclosures of the digital giants. "There is a real transparency problem with the Gafam, but also with start-ups like OpenAI, which provide no information on their energy consumption. Only Meta has made public figures on the carbon impact and energy consumption linked to the training of its four Llama models, namely 2.6 million kilowatt-hours of electricity and 1,000 tonnes of CO2," points out Samuel Rincé.
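From the totals Meta disclosed, one can back out the implied carbon intensity of the electricity powering the Llama training runs; this division is our own back-of-the-envelope check, not a figure published by Meta:

```python
# Implied grid carbon intensity of Meta's Llama training runs,
# derived from the publicly disclosed totals quoted above.
energy_kwh = 2_600_000        # 2.6 million kWh of electricity
emissions_tonnes_co2 = 1_000  # 1,000 tonnes of CO2

grams_per_kwh = emissions_tonnes_co2 * 1_000_000 / energy_kwh
print(f"~{grams_per_kwh:.0f} gCO2 per kWh")  # ~385
```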

Simple tasks, “thirty times less demanding”

To reduce this impact, one avenue is to use smaller, specialized language models, since these, trained on a smaller amount of data, consume less energy. But this is no miracle solution, cautions Denis Trystram, researcher in the Inria DataMove team and member of the Ecoinfo working group: "While their training may prove less energy-intensive, there is a risk of a multiplication of these specialized models, with, as a result, very high consumption due to inference." Researchers from the Franco-American start-up Hugging Face and Carnegie Mellon University (Pennsylvania) published one of the first studies on inference in November 2023. They estimate that consumption during the use phase of generative AI is set to weigh more and more heavily compared to the training phase. In this study, Sasha Luccioni, Hugging Face's AI and climate lead, notes however that specialized models handling simple tasks are far more efficient, being "thirty times less energy-intensive".


Another way to reduce the footprint of generative AI is to improve the energy efficiency of data centers. In 2022, they consumed about 460 terawatt-hours (TWh) of electricity, a figure that could reach 1,000 TWh by 2026, according to the International Energy Agency, equivalent to Japan's electricity consumption. So much so that the United States, home to 33% of the world's 8,000 data centers, is worried about electricity shortages that could thwart the Biden administration's goal of closing coal-fired power plants, according to the Washington Post. For its part, Microsoft indicated that its data centers had used 6.4 billion liters of water in 2022, up 34% compared to 2021. But, according to Joseph Gonnachon, marketing director of 2CRSI, innovation in cooling methods and heat reuse will make it possible to achieve, in the coming years, up to 51% energy savings compared to traditional data centers.

"While it is undeniable that the energy efficiency of digital technology, which doubles every two years, is progressing very quickly, the multiplication of uses means that, to date, overall consumption is not falling," warns Denis Trystram. This is what is called the rebound effect. For many researchers, the solution lies above all in a more sober use of generative AI. "For 90% of searches, generative AI adds no value. But today, every company wants its own internal ChatGPT," says Hugues Ferreboeuf, project director at the think tank The Shift Project. Denis Trystram invites us, for each use case, "to question the balance between benefit and environmental impact." A recommendation that runs against the current trend of integrating generative AI into all software, as Microsoft is doing with its office suite.

Huge demand for electricity and water

A ChatGPT query requires ten times more electricity than a Google query

Training GPT-3 consumed 700,000 liters of fresh water and emitted 552 tonnes of CO2eq

AI electricity consumption could be between 85 and 134 TWh per year by 2027, equivalent to that of Argentina

SOURCES: "The Growing Energy Footprint of AI", A. de Vries; "The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink", D. Patterson et al.

You are reading an article from L’Usine Nouvelle 3731 – June 2024