Summary:
- The world’s data centers account for an estimated 2.5% to 3.7% of global greenhouse gas emissions, more than the aviation sector. Training a single large AI model is estimated to release around 300 tons of CO2.
- ChatGPT is a large, energy-hungry language model. Each query has been estimated to produce about 4.32 g of CO2e (for comparison, a Google search produces about 0.2 g per query), and roughly 16 queries emit as much as boiling a kettle. If every visit involves around 10 queries, that adds up to roughly 15 billion queries per month.
- A study published in Nature in February 2024 found that the carbon emissions of AI writing and illustrating are lower than those of humans, comparing the CO2 equivalent produced by popular text-generating models with that of human work. Training GPT-3 produced approximately 552 metric tons of CO2e, while training BLOOM produced about 50.5 metric tons of CO2e.
As artificial intelligence (AI) continues to advance, there is growing attention both to its potential benefits and to the environmental impact of its deployment. A key concern is the carbon footprint of AI – a measure of the greenhouse gas emissions associated with running AI-driven systems.
In a world where AI plays a growing role in fields ranging from healthcare to transportation and entertainment to manufacturing, understanding the environmental effects of these technologies is becoming essential. In discussions about climate change and the need to reduce greenhouse gas emissions, the carbon footprint of AI is increasingly a focus for scientists, engineers, and policymakers.
What is the energy consumption of artificial intelligence (AI)?
Whenever you use artificial intelligence to generate an image, compose a text, or ask a chatbot a question, there is an environmental cost. Why? Because AI needs energy to run.
Artificial intelligence models run on server infrastructure in data centers. Processing and storing data requires large amounts of energy, as well as water for cooling. Much of this energy still comes from non-renewable sources, and burning fossil fuels for energy production remains the largest global contributor to greenhouse gas emissions.
Keeping all AI systems running consumes as much electricity as a small nation. Alex de Vries, the founder of Digiconomist, published research in the journal Joule concluding that global annual electricity demand from artificial intelligence could reach 85.4 to 134 terawatt-hours by 2027. The energy required for AI is comparable to that used for mining cryptocurrencies; you can learn about the environmental effects of cryptocurrencies here.
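To put those terawatt-hour projections into CO2 terms, one can multiply by an assumed grid carbon intensity. The sketch below uses a global-average figure of roughly 475 g CO2e per kWh as an assumption; the result is an order-of-magnitude illustration, not a measurement.

```python
# Back-of-the-envelope conversion of projected AI electricity demand to CO2e.
# Assumption (not from the figures above): a global average grid carbon
# intensity of ~475 g CO2e per kWh; real intensity varies by country and year.

GRID_INTENSITY_G_PER_KWH = 475     # assumed global average, g CO2e per kWh
KWH_PER_TWH = 1_000_000_000        # 1 TWh = 1e9 kWh

def annual_emissions_mt(demand_twh: float) -> float:
    """Convert annual electricity demand in TWh to megatonnes of CO2e."""
    grams = demand_twh * KWH_PER_TWH * GRID_INTENSITY_G_PER_KWH
    return grams / 1e12            # grams -> megatonnes

for demand in (85.4, 134):         # low and high ends of the 2027 projection
    print(f"{demand} TWh/year -> ~{annual_emissions_mt(demand):.0f} Mt CO2e/year")
```

Under that assumed grid mix, the projected demand corresponds to very roughly 40 to 64 megatonnes of CO2e per year.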
The energy used by the world’s data centers accounts for an estimated 2.5% to 3.7% of global greenhouse gas emissions, more than the aviation sector. Training a single large AI model is estimated to release around 300 tons of CO2.
The carbon footprint of AI language models
Not every task requires a large language model. Using smaller AI models for straightforward tasks is one way to conserve energy: models built for a narrow purpose, rather than for every possible task, are cheaper to run and, because they consume less energy, also more sustainable, as the sketch below illustrates.
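As a rough illustration of that difference, the following sketch compares the inference energy of a hypothetical small, task-specific model with that of a hypothetical large, general-purpose one. The energy-per-token values are placeholders chosen only to show the shape of the calculation, not measurements of any real model.

```python
# Illustrative only: compare the energy of answering the same prompt with a
# small task-specific model versus a large general-purpose one. The per-token
# energy values below are hypothetical placeholders, not measurements.

RESPONSE_TOKENS = 300

SMALL_MODEL_J_PER_TOKEN = 0.05     # hypothetical inference energy, joules/token
LARGE_MODEL_J_PER_TOKEN = 2.0      # hypothetical inference energy, joules/token

def query_energy_wh(j_per_token: float, tokens: int = RESPONSE_TOKENS) -> float:
    """Energy in watt-hours for one query (1 Wh = 3600 J)."""
    return j_per_token * tokens / 3600

small = query_energy_wh(SMALL_MODEL_J_PER_TOKEN)
large = query_energy_wh(LARGE_MODEL_J_PER_TOKEN)
print(f"small model: {small:.4f} Wh/query, large model: {large:.4f} Wh/query, "
      f"ratio ~{large / small:.0f}x")
```

The point is not the specific numbers but the structure: for a fixed response length, per-query energy scales directly with the energy a model needs per token, so matching the model to the task pays off.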
What is the carbon footprint of a single ChatGPT query in CO2 emissions?
ChatGPT is a large, energy-hungry language model. Each query has been estimated to produce about 4.32 g of CO2e (for comparison, a Google search produces about 0.2 g per query), and roughly 16 queries emit as much as boiling a kettle. If every visit involves around 10 queries, that adds up to roughly 15 billion queries per month.
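Taking those per-query figures at face value, the scale arithmetic is easy to reproduce. In the sketch below, the monthly visit count is an illustrative assumption; only the per-query emissions and the "around 10 queries per visit" figure come from the estimates above.

```python
# Reproduce the scale arithmetic behind the per-query figures quoted above.
# MONTHLY_VISITS is an illustrative assumption, not a figure from the text.

CHATGPT_G_PER_QUERY = 4.32      # g CO2e per ChatGPT query (cited estimate)
GOOGLE_G_PER_QUERY = 0.2        # g CO2e per Google search (cited estimate)
QUERIES_PER_VISIT = 10
MONTHLY_VISITS = 1.5e9          # assumption: ~1.5 billion visits per month

monthly_queries = MONTHLY_VISITS * QUERIES_PER_VISIT
monthly_tonnes = monthly_queries * CHATGPT_G_PER_QUERY / 1e6   # grams -> tonnes

print(f"ChatGPT vs Google per query: "
      f"~{CHATGPT_G_PER_QUERY / GOOGLE_G_PER_QUERY:.0f}x more CO2e")
print(f"~{monthly_queries:.2e} queries/month -> ~{monthly_tonnes:,.0f} t CO2e/month")
print(f"16 queries -> ~{16 * CHATGPT_G_PER_QUERY:.1f} g CO2e (the boiled-kettle figure)")
```

Under these assumptions, a ChatGPT query emits around 20 times more than a Google search, and 15 billion monthly queries would add up to tens of thousands of tonnes of CO2e per month.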
The precise environmental impact of AI operations remains unknown. The creators of the newest models do not offer specific emissions data.
Which has a lower carbon footprint – human writing or AI text generation?
While it is difficult to assess the precise emissions of an AI system, and different sources give different figures, two factors matter most: model training and emissions per query. Writing by humans produces higher emissions, which also vary depending on the hardware and location involved.
A study published in Nature in February 2024 found that the carbon emissions of AI writing and illustrating are lower than those of humans, comparing the CO2 equivalent produced by popular text-generating models with that of human work. Training GPT-3 produced approximately 552 metric tons of CO2e, while training BLOOM produced about 50.5 metric tons of CO2e.
Other factors, such as the energy consumed in producing chips and servers and recycling at end of life, have a much smaller effect than training and per-query emissions. These findings suggest that the total impact of an AI query, covering both training and operation, generally amounts to a few grams of CO2e per query, as the sketch below illustrates.
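A minimal sketch of that amortization, assuming a hypothetical lifetime query volume and reusing the per-query estimate quoted earlier as a stand-in for inference emissions, shows why training adds only a small fraction per query.

```python
# Amortize one-off training emissions over lifetime queries, then add a
# per-query inference estimate. The lifetime query count is hypothetical, and
# the ChatGPT per-query figure is reused here only as an illustrative stand-in.

TRAINING_TONNES = {"GPT-3": 552.0, "BLOOM": 50.5}   # t CO2e, figures quoted above
INFERENCE_G_PER_QUERY = 4.32                        # per-query estimate quoted earlier
LIFETIME_QUERIES = 10e9                             # assumption: 10 billion queries served

for model, tonnes in TRAINING_TONNES.items():
    training_g_per_query = tonnes * 1e6 / LIFETIME_QUERIES   # tonnes -> grams, spread out
    total = training_g_per_query + INFERENCE_G_PER_QUERY
    print(f"{model}: training adds {training_g_per_query:.3f} g/query, "
          f"total ~{total:.2f} g CO2e per query")
```

Even for GPT-3, training spread over billions of queries contributes well under a tenth of a gram per query, so the per-query inference emissions dominate the total.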
To estimate a writer’s carbon footprint, the researchers considered writing speed, place of residence, and the energy use and emissions of the computer the writer used. They also took into account whether the author powered that computer with renewable energy.
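A minimal sketch of that kind of estimate might look like the following; the writing speed, laptop power draw, and grid intensities are illustrative assumptions, not the study’s actual inputs.

```python
# Rough estimate of the CO2e from a human writing one page of text, following
# the factors listed above: writing speed, device power, and local grid mix.
# All numeric inputs are illustrative assumptions, not the study's data.

WORDS_PER_PAGE = 300
WRITING_SPEED_WPM = 15            # assumption: ~15 finished words per minute
LAPTOP_POWER_W = 50               # assumption: average draw while writing
GRID_INTENSITY_G_PER_KWH = {      # assumed grid carbon intensities
    "fossil-heavy grid": 700,
    "average grid": 475,
    "renewable-powered": 25,
}

hours = WORDS_PER_PAGE / WRITING_SPEED_WPM / 60    # time to write one page
energy_kwh = LAPTOP_POWER_W * hours / 1000

for grid, intensity in GRID_INTENSITY_G_PER_KWH.items():
    grams = energy_kwh * intensity
    print(f"{grid}: ~{grams:.1f} g CO2e per page of human writing")
```

The location term matters: the same page written on a renewable-powered machine produces a small fraction of the emissions of one written on a fossil-heavy grid, which is why the study asked where the writer lives and how their computer is powered.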
Also Read:
- How Mineralization Can Help Reducing Carbon Emission
- Biomass Carbon Removal & Storage (BiCRS) for Remove Carbon Pollution
#zonaebt #EBTHeroes #Sebarterbarukan
Editor: Savira Oktavia