Would you like to be more efficient in your day-to-day work?
Imagine being able to double your productivity and reduce working hours by using advanced tools to help you with your most repetitive tasks. Not only that, these tools can also save you from wasting time reading large volumes of information to find an answer.
How about personalized education accessible to more people?
These are some of the promises behind chatbots such as ChatGPT, but many people still don’t know that all of this comes with a potentially high price in water consumption and CO2 released into the atmosphere.
Hi, this is Climate Resilience with Cinthia and together, we’ll explore evidence-based information to face climate change.
Although it is not the main point here, I think it is important to acknowledge that these systems do hallucinate, or “lie”, when replying to us: their priority is to produce a response, not to be reliable. And the most advanced versions of these systems, such as OpenAI’s o1, have shown in tests that they do try to deceive the user and, when questioned about it, lie about having done so.
But back to the point, let’s start by analyzing how these tools require both energy and water.
Large language models, which form the basis of advanced chatbots, rely on powerful AI accelerators for efficient training and inference. NVIDIA GPUs are some of the most widely used. These accelerators work together in large-scale data centers. This is why several of the estimates on energy consumption are based on these chips.
This estimated energy consumption for 2024 has not been confirmed: it was based on a sales forecast, assuming that all NVIDIA chips expected to be sold would run at 75% of their capacity. To put the figure in context, it is compared against the power used by 1 million US households, and against the total power used by some small countries in 2021.
The reason this energy consumption is worrisome is that there is NOT enough green energy being generated in the world yet… which means that, depending on where a data center is located, generating the energy these systems require leads to the release of greenhouse gases.
And to make things worse, nowadays even a simple Google search generally uses AI as well.
This is one of the articles that tries to estimate the energy consumed when using ChatGPT-type tools. According to its data, every time Google shows you a summary made with artificial intelligence alongside your search results, about 10 times more energy is consumed than a standard search used in 2009.
Something important to consider is that all chatbot systems first need to digest huge amounts of information; this is called the training phase, and it is repeated every time there is a version upgrade…
Here you can see the estimated energy consumed and the amount of CO2 released when training GPT-2, GPT-3 and GPT-4… you can also see how many more hours of training were needed in each case.
During this training phase, these systems do not produce results for us, the users, but they do consume resources.
Some estimates suggest that the consumed energy and released CO2 when training GPT-3 is approximately equivalent to driving 123 gasoline-powered cars for a year.
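As a rough sanity check on the "123 cars" comparison, we can multiply it out. The per-car figure below is my own assumption, not from the video: the US EPA commonly cites about 4.6 metric tons of CO2 per year for a typical gasoline passenger car.

```python
# Back-of-envelope check of the "123 cars for a year" comparison.
# Assumption (not from the video): a typical US gasoline car emits
# about 4.6 metric tons of CO2 per year (EPA's commonly cited average).
CAR_TONNES_CO2_PER_YEAR = 4.6

cars = 123
implied_training_emissions = cars * CAR_TONNES_CO2_PER_YEAR  # metric tons CO2
```

With that assumed per-car figure, 123 cars imply roughly 566 metric tons of CO2, which is in the same ballpark as published estimates of around 500–550 tons for training GPT-3.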
After training comes the “inference” phase, which is when users interact with the system to obtain something from it.
Although training is very costly in terms of energy, it happens only once per version, whereas using an AI model continues to consume energy over time. This means that most of the energy used for AI will eventually come from inference, and that the energy required to run AI models will weigh heavily in the overall energy use of AI systems.
On the other hand, among the various applications of artificial intelligence models, particularly in inference tasks, image generation stands out as one of the most energy-intensive processes.
There are estimates that suggest that generating one image can consume as much energy as a full charge of your smartphone.
Others suggest that producing 1000 images with Stable Diffusion XL releases as much CO2 as driving about 6.5 kilometers in an average car.
More recent estimates based on GPT-4 suggest that the LLM consumes the equivalent of running 14 LED light bulbs for 1 hour to write a 100-word e-mail, and that if 1 out of 10 working Americans requested that task weekly for a year, it would equal the electricity consumed by all DC households over 20 days.
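The light-bulb comparison is easy to convert into kilowatt-hours. The bulb wattage below is my assumption (a common LED bulb draws around 10 W), not a figure from the video:

```python
# Rough arithmetic behind the "14 LED bulbs for 1 hour" comparison.
# Assumption (not stated in the video): one LED bulb draws about 10 W.
LED_WATTS = 10
bulbs, hours = 14, 1

# Energy for one 100-word e-mail, in kWh.
email_kwh = bulbs * LED_WATTS * hours / 1000
```

Under that assumption, one e-mail works out to about 0.14 kWh, consistent with the figure reported in the Washington Post analysis of GPT-4.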
At the beginning of the video I mentioned water consumption. But what does water have to do with anything here?
Modern computing devices, including your personal computer, rely on semiconductor materials. These devices generate heat during operation, especially when performing intensive tasks like rendering complex video game graphics. To manage this heat, computers use cooling systems, often including fans. You may have noticed that when your computer is working harder, these fans spin faster, resulting in increased noise.
This is because generating complex graphics requires significant processing power, which increases electricity consumption and produces heat as a byproduct. If this heat is not adequately dissipated, it can cause the temperature of the components to rise. Semiconductors, which are essential for processing, must operate within a specific temperature range to function correctly. Therefore, fans work harder.
In the case of the data centers behind bots like ChatGPT, instead of fans, they rely on cooling systems that use industrial amounts of water…
Microsoft, for example, revealed that its global water consumption soared by 34% from 2021 to 2022. This sharp increase, equivalent to more than 2,500 Olympic swimming pools, is linked to the development of AI and the company’s partnership with OpenAI, the firm behind ChatGPT.
On the other hand, Google recorded a 20% increase in water consumption over the same period, also largely attributable to artificial intelligence.
To give you more concrete examples… this pre-print estimates how many questions you can ask GPT-3 before its cooling system consumes half a liter of water to answer you.
In Texas, half a liter of water is consumed after about 36 questions; in Ireland, after about 70; and at the other extreme, in Washington State, after only 10.
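Dividing the half liter by the number of questions gives the per-question water cost implied by those figures:

```python
# Convert the pre-print's "questions per half liter" into water per question.
HALF_LITER_ML = 500
questions_per_half_liter = {"Texas": 36, "Ireland": 70, "Washington": 10}

ml_per_question = {
    place: HALF_LITER_ML / n for place, n in questions_per_half_liter.items()
}
# Texas ~13.9 mL, Ireland ~7.1 mL, Washington 50 mL per question.
```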
More recent estimates based on GPT-4 suggest that the LLM consumes 519 milliliters of water to write a 100-word e-mail, and regular usage multiplies that quickly: if 1 out of 10 working Americans requested that task weekly for a year, it would equal the water consumed by all Rhode Island households in a day and a half.
And do you remember the training phase?
These other comparisons are meant to give a more tangible idea of how much water Microsoft and Meta used while training GPT-3 and Llama 3.
Water scarcity is ALREADY a worldwide issue that should keep us busy looking for solutions…
These tools, like any other, have advantages if we know how to use them, but let’s be aware that everything has a cost…
Goofing around with ChatGPT, or idly generating images, releases CO2 into the atmosphere, worsening climate change, and deepens water scarcity for everyone.
Having said that, in this channel we try to focus on how to improve our chances in the face of climate change…
Do we have any alternatives, even modest ones?
I see three paths we can follow from the user point of view…
1. Limiting unnecessary interactions seems to be the most reasonable answer: Only use chatbots when truly needed, avoiding frivolous or repetitive queries that consume energy without providing significant value.
I know this point is debatable. For example, in this article scientists compared the environmental impact of using LLMs to write a 500-word page against the impact associated with the equivalent time that a human would need.
US human-to-LLM ratios range from 40 to 150 for a typical LLM, such as Llama-3-70B, and from 1200 to 4400 for a lightweight LLM, such as Gemma-2B-it.
The ratios are much smaller when compared to human labor in India: between 3.4 and 16 for a typical LLM, and between 130 and 1100 for a lightweight LLM.
| Location | LLM | Human-to-LLM ratio |
| --- | --- | --- |
| US | Typical (Llama-3-70B) | 40 – 150 |
| US | Lightweight (Gemma-2B-it) | 1200 – 4400 |
| India | Typical (Llama-3-70B) | 3.4 – 16 |
| India | Lightweight (Gemma-2B-it) | 130 – 1100 |
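To make it clearer what a human-to-LLM ratio means, here is a sketch of how such a number is assembled. Every figure below is a hypothetical placeholder, NOT a value from the paper; the point is only the structure of the calculation: the footprint attributed to the human writing time divided by the footprint of one LLM query.

```python
# Sketch of a human-to-LLM ratio. ALL numbers are hypothetical
# placeholders, not the paper's actual inputs.
human_hours_per_page = 1.0    # hypothetical: time a person needs for 500 words
human_kgco2_per_hour = 1.7    # hypothetical: per-capita emissions attributed per hour
llm_kgco2_per_page = 0.03     # hypothetical: footprint of one LLM query

ratio = (human_hours_per_page * human_kgco2_per_hour) / llm_kgco2_per_page
# With these placeholder inputs the human "costs" ~57x more CO2
# over the writing time than the single LLM query does.
```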
These numbers were crunched under several assumptions which I won’t question here… BUT (maybe my personal take is silly), it is not as if the humans disappear while the LLM does their job: the humans those LLMs substitute still exist, still consume water and still release CO2. So I personally am not that convinced that using LLMs represents a “real” saving. Except for the fact that whoever needed those 500 words written would pay far less by using the LLM instead of a human… but that would take us to a totally different conversation.
2. Search engines now use AI constantly, even when you are not expecting them to, but at least on Google there are some ways to avoid AI summaries.
For example, the image on the left is the AI summary Google presented to answer my question: how many minutes are needed to prepare green tea?
Here are three ways to prevent the AI summary:
- Add -ai to your query in the search bar.
- Click on the “web” tab.
- Use a proxy site such as udm14.com. When you use it, you will notice that it still searches on Google, but it appends a small parameter to the search URL.
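The parameter that udm14.com appends is udm=14, which asks Google for the plain “Web” results tab, without the AI summary. A minimal sketch of building such a URL yourself:

```python
# Build a Google search URL that requests the "Web" tab (no AI summary)
# by appending the udm=14 parameter, as proxy sites like udm14.com do.
from urllib.parse import urlencode

def web_only_search_url(query: str) -> str:
    """Return a Google search URL restricted to plain web results."""
    return "https://www.google.com/search?" + urlencode({"q": query, "udm": 14})

url = web_only_search_url("green tea brewing time")
# -> https://www.google.com/search?q=green+tea+brewing+time&udm=14
```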
3. Societal pressure may be helpful to encourage companies and research labs to publish the carbon footprints of their AI models. In the future, perhaps consumers could even use this information to choose a “greener” chatbot.
Never underestimate our collective power. Harness your social networks to spread awareness and share this crucial information. Advocate for responsible use of these technologies and demand that companies take necessary steps to reduce energy and water consumption. By uniting our voices, we can drive significant change and create a lasting impact.
Sources:
Castro, D. (2024). Rethinking concerns about AI’s energy use. In Center for Data Innovation. Retrieved December 26, 2024, from https://datainnovation.org
Cohen, J. (2024, October 31). How to Remove AI From Your Google Search Results. PC Magazine. Retrieved December 26, 2024, from https://www.pcmag.com/how-to/how-to-remove-ai-from-your-google-search-results
Heikkilä, M. (2023, December 1). Making an image with generative AI uses as much energy as charging your phone. MIT Technology Review. https://www.technologyreview.com/2023/12/01/1084189/making-an-image-with-generative-ai-uses-as-much-energy-as-charging-your-phone/
Leffer, L. (2024, February 20). The AI boom could use a shocking amount of electricity. Scientific American. https://www.scientificamerican.com/article/the-ai-boom-could-use-a-shocking-amount-of-electricity/
Li, P., Yang, J., Islam, M. A., & Ren, S. (2023, April 6). Making AI Less “Thirsty”: Uncovering and addressing the secret water footprint of AI models. arXiv.org. https://arxiv.org/abs/2304.03271
Luccioni, S., Jernite, Y., & Strubell, E. (2024). Power hungry processing: Watts driving the cost of AI deployment? 2022 ACM Conference on Fairness, Accountability, and Transparency. https://doi.org/10.1145/3630106.3658542
O’Brien, M., & Fingerhut, H. (2023, September 9). A.I. tools fueled a 34% spike in Microsoft’s water consumption, and one city with its data centers is concerned about the effect on residential supply. Fortune / The Associated Press. https://fortune.com/2023/09/09/ai-chatgpt-usage-fuels-spike-in-microsoft-water-consumption/
Ren, S., Tomlinson, B., Black, R. W., & Torrance, A. W. (2024). Reconciling the contrasting narratives on the environmental impact of large language models. Scientific Reports, 14(1). https://doi.org/10.1038/s41598-024-76682-6
Saenko, K. (2024, February 20). A computer scientist breaks down generative AI’s hefty carbon footprint. Scientific American. https://www.scientificamerican.com/article/a-computer-scientist-breaks-down-generative-ais-hefty-carbon-footprint/
Statista. (2024a, July 17). Global electricity consumption 1980-2022. https://www.statista.com/statistics/280704/world-power-consumption/
Statista. (2024b, December 10). Energy consumption of Nvidia’s high-end GPU 2024. https://www.statista.com/statistics/1446532/energy-consumption-nvidia-microchip/
TRG Datacenters. (2024, January 4). AI Chatbots: Energy usage of 2023’s most popular chatbots (so far) | TRG Datacenters. https://www.trgdatacenters.com/resource/ai-chatbots-energy-usage-of-2023s-most-popular-chatbots-so-far/
Verma, P., & Tan, S. (2024, September 18). A bottle of water per email: the hidden environmental costs of using AI chatbots. The Washington Post. https://wapo.st/3Dy2sKz
Zeff, M. (2024, December 6). OpenAI’s o1 model sure tries to deceive humans a lot. TechCrunch. https://techcrunch.com/2024/12/05/openais-o1-model-sure-tries-to-deceive-humans-a-lot