The Environmental Impact of AI // ChatGPT is sending your individual carbon footprint through the roof

If you watch content online, or just spend time on the internet in general, you have most likely already consumed AI-generated products, whether that's AI-generated images of what your future children will look like, a Homer Simpson version of Teenage Dirtbag, or perhaps you just asked Siri what time it is. Just for the chaos I would see in the comments, let's pretend that that's all AI is.

We have talked about tech, crypto, data storage, hardware, and the impact of the internet several times on this channel; not only does tech tremendously impact our lives every single day, but it is also constantly evolving, developing and improving. As such, tech, and the tools we use online, are important sustainability factors. I have wanted to make a video about AI for a long time, and finally, I can sit down and share some thoughts with you.

https://www.youtube.com/watch?v=amZs3ltZ5uw

So without further ado, let’s look at the impact of AI.

AI in all its forms is used for a lot more than generating funny filters. AI technologies are being used to predict the stock market. Applications like Google Maps and Uber use AI to analyze data in real time to provide the most efficient routes, traffic conditions, and estimated arrival times. AI plays an important role in national and global security systems, and provides algorithms that can diagnose cancer at an early stage. Needless to say, it's a widely used, and vastly important, tool.

In January 2024, the International Energy Agency (IEA) issued its forecast for global energy use over the next two years. For the first time, it included projections for electricity consumption associated with data centres, cryptocurrency, and artificial intelligence. The IEA estimates that, added together, this usage represented almost 2 per cent of global electricity demand in 2022, and that demand for these uses could double by 2026, which would make it roughly equal to the amount of electricity used by the entire country of Japan (Calvert, 2024).

How does this make sense? Let me explain.

A data centre is what makes being online possible. This video is data stored in a data centre that you then access via your internet connection. The internet and "the cloud" aren't an actual cloud; they are a physical place, or more precisely thousands of physical places. Our actions online, our tools, our email, our games, our movies, our music, and our documents are all some kind of data. Storing data requires energy.

A data centre heats up when processing, just like your laptop can get warm and overheat, so to keep the data centres from overheating (and thus shutting down the internet) they're constantly cooled. This requires energy, like any other cooling system, and that energy has typically come from fossil fuels. Aaaaand that's where the majority of the impact comes from.

Studies from the IEA show that data centres generate 3.5% of global greenhouse gas (GHG) emissions, which is more than the aviation industry. Yeah, the internet emits more CO2 than aeroplanes.

Apart from GHG emissions, there are two other types of impact to take into account: water use and the production of hardware. If data centres are cooled using liquid instead of air, loads of water are used. Globally, a lot of centres are placed in areas prone to drought, which is not a great combination. A data centre can use up to 5 million gallons of water a day.

The lifespan of equipment also has an impact. Tech moves fast, and equipment wears down quickly when constantly in use, so data centres replace their equipment extremely frequently. This increases e-waste, which is rarely disposed of properly, is one of the fastest-growing categories of waste today, pollutes water and air, and is extremely dangerous for people to handle. Furthermore, the production of new tech requires vast amounts of materials, minerals, and precious metals, which all have to be mined and processed. I have several videos about the problems with the production of tech, so I'll link those down below, but all those points apply to phones, kitchen appliances, computers, and yeah, data centre equipment as well. We're talking soil erosion, child labour and loss of natural habitats, and let's not forget how Western tech companies exploit the countries in which they operate their mines.

So the existence of the internet and all the opportunities and tools it gives us come with an impact – simply storing 100 GB of data in the cloud for a year produces around 200 kg (0.2 tonnes) of CO2.
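To put that figure on a per-gigabyte basis: it works out to about 2 kg of CO2 per GB per year. A minimal sketch, using only the 100 GB / 200 kg figure from above (the helper function is just for illustration):

```python
# Scale the cloud-storage figure quoted above (100 GB -> ~200 kg CO2 per year)
# to other storage amounts. The 2 kg/GB/yr rate is derived directly from it.

CO2_KG_PER_GB_YEAR = 200 / 100  # 2.0 kg CO2 per GB per year

def storage_co2_kg(gigabytes: float, years: float = 1.0) -> float:
    """Estimated CO2 (kg) for keeping `gigabytes` in the cloud for `years`."""
    return gigabytes * CO2_KG_PER_GB_YEAR * years

print(storage_co2_kg(100))   # 200.0 kg -- the example above
print(storage_co2_kg(2000))  # 4000.0 kg for a 2 TB archive
```

So a 2 TB photo archive sitting untouched in the cloud would, by this estimate, emit as much as a few short-haul flights every single year.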

Now why are processes like AI or crypto more energy-intensive than an Instagram post or a regular Google search?

The GPT in ChatGPT stands for Generative Pre-trained Transformer, and it uses specialised algorithms to find patterns and sequences in enormous amounts of data in order to respond to your prompt. This is called generative AI. These programs are "trained" to generate responses and results based on the data they are given, and that's typically a lot.

When you ask ChatGPT for a recipe, or for help writing your paper, the AI model generates a response based on the data it was trained on, which can be thousands and thousands of recipes. This means that what the chat gives you might not be an actual recipe from somewhere else, but rather a uniquely generated product based on the data available to the programme.

To put it simply: a ChatGPT request requires roughly 10 times as much energy as a Google search.

Training a large language model like ChatGPT uses nearly 1,300 megawatt-hours (MWh) of electricity (Luccioni, 2022), the annual consumption of about 130 US homes (Vincent, 2024). According to the IEA, a single Google search takes 0.3 watt-hours of electricity, while a ChatGPT request takes 2.9 watt-hours. If ChatGPT were integrated into the 9 billion searches done each day, the IEA says, electricity demand would increase by 10 terawatt-hours a year, the amount consumed by about 1.5 million European Union residents.
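Those numbers can be sanity-checked with a back-of-envelope calculation. Everything below comes straight from the figures above except the per-capita EU electricity use (~6 MWh/year), which is my own rough assumption for the final comparison:

```python
# Back-of-envelope check of the IEA figures quoted above.

google_wh = 0.3          # Wh per Google search (IEA)
chatgpt_wh = 2.9         # Wh per ChatGPT request (IEA)
searches_per_day = 9e9   # daily Google searches

# Extra electricity if every search became a ChatGPT request
extra_wh_per_day = (chatgpt_wh - google_wh) * searches_per_day
extra_twh_per_year = extra_wh_per_day * 365 / 1e12

print(f"Extra demand: {extra_twh_per_year:.1f} TWh/year")  # ~8.5, close to the IEA's 10

# Assumed average annual per-capita electricity use in the EU (my estimate)
eu_mwh_per_person = 6.0
people = extra_twh_per_year * 1e6 / eu_mwh_per_person
print(f"Equivalent to ~{people / 1e6:.1f} million EU residents")
```

The arithmetic lands in the same ballpark as the IEA's 10 TWh and ~1.5 million residents, which suggests the quoted per-search figures and the headline comparison are consistent with each other.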

In this study, they looked at BLOOM, a 176-billion parameter language model, across its life cycle, and had this to conclude:

“BLOOM’s final training emitted approximately 24.7 tonnes of CO2eq if we consider only the dynamic power consumption, and 50.5 tonnes if we account for all processes ranging from equipment manufacturing to energy-based operational consumption” (Luccioni, 2022). 
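A quick calculation on those two tonnage figures shows how much of the footprint sits outside the electricity meter itself: roughly half of BLOOM's life-cycle emissions came from equipment manufacturing and other processes rather than the dynamic power draw of training (both figures from Luccioni, 2022):

```python
# Split BLOOM's training footprint (Luccioni, 2022) into dynamic power
# consumption vs. everything else (equipment manufacturing, idle draw, etc.).

dynamic_t = 24.7   # tonnes CO2eq, dynamic power consumption only
total_t = 50.5     # tonnes CO2eq, full life-cycle estimate

other_t = total_t - dynamic_t
share = other_t / total_t
print(f"Non-dynamic share: {other_t:.1f} t ({share:.0%})")  # 25.8 t (51%)
```

That 51% is worth dwelling on: even a data centre running entirely on renewables would still carry about half of this footprint through hardware manufacturing alone.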

It's not just using generative AI models that requires energy; training them takes a toll too

We want bigger and better generative AI models, and those don't just appear out of thin air; they're "trained". This means that to create a program that can write algorithms, invent recipes, write poems and give detailed accounts of world events, the AI model has to go through as much data as possible, as many times as possible. That means running for thousands of hours across thousands of GPUs (the chips that make generative AI possible). And because we want these programs to get better and better, every new generation of model consumes more and more energy. As such, more and more gas and coal is burned to keep this process going.

And although we’re seeing more and more renewable energy introduced into the tech industry, that positive impact gets completely swallowed by AI when it constantly needs more energy to improve, so in many ways we end up back where we started.

Fun fact: Mark Zuckerberg placed an order for 350,000 Nvidia H100 GPUs, at a street price of around 10.5 billion USD, roughly equivalent to Nvidia's entire yearly revenue in 2020. Microsoft is buying thousands as well, mainly to use the chips for generative AI solutions (Observer, 2024).

What’s the solution?

If data centres were to switch from fossil fuel-derived energy to a renewable energy source, a lot of the AI-related issues could be improved – but not resolved (so hold off on the champagne), because we need a lot more renewable energy to make up for the increased energy consumption of AI.

We also need more transparency in the AI field. Some AI models are more energy-intensive than others; different models and algorithms can use 30-40 times more energy while generating the same result. So if generative AI and machine learning are here to stay, consumers should be able to see the resources their activities require, just like we can on our energy bill. The organisations best placed to share the aggregated impact, energy use and emissions of AI tools (companies like Meta, Microsoft, and OpenAI) simply aren't sharing the relevant information.

Moreover, transparency is a huge issue in general when it comes to these generative AI models, not just on the energy side. Often they become a black box with an input and an output, with no way of knowing what operations or which data was used to create the final output. This creates privacy issues for consumers: we don't know if our Facebook photos are used to train AI models, and if you use Adobe programs, what you create can be used to train these models as well. It also creates a whole new kind of copyright infringement when artists' work is used to train AI. But that's a whole other can of worms, about which I also have loads to say.

Consumers, governments, and companies alike should also consider how we're using AI and machine learning models. Right now, we're in a bit of an AI summer where it's super hyped, and thus everyone wants to use it, whether it makes sense or not. Because it's trending.

But the way I see it, we shouldn't leave all the processing to ChatGPT. While it might be able to write scripts for YouTube videos or do your homework, we should think about whether that's necessary, or, more importantly, what happens if we let it.

I have talked to several people who don't understand why I spend weeks working on scripts, researching, reading, taking notes and thinking about the topic when I could just pop a prompt into an AI model and have my script ready in seconds. But there is a really good reason why I am not doing that, and why I never will.

It's easy to end up relying too much on AI to do the boring research tasks, but AI has its limitations. Big changes may have occurred "in the real world" that the service is unaware of, because they happened after the training dataset was collected, so if you rely solely on AI you might end up saying something outdated or wrong without even knowing it.

I want to understand, I want to learn, not recite. I don’t get anything from just reading generated text, I also want to be critical of my sources, who is saying something, and why, and those nuances are washed out in ChatGPT, at least that’s often the case. It has also been known to make up sources that do not exist. Researching, while time-consuming, also trains and develops cognitive skills, concentration, critical thinking, and patience and maintains my academic curiosity. While machine learning and generative AI tools have many applications, we should still reflect on why and how we use them. Meanwhile, the tech industry needs to take responsibility for its impact and design systems with sustainability built in from the beginning.



follow me on instagram ma frens