Let's talk about AI and climate impact
To resource an AI future, we must first ensure the protection and preservation of our natural resources.

Last week, I hosted a gathering of six women who all — in some way — work on or with artificial intelligence. Over a half-eaten cheese plate and the glow of vanilla spiced candles, we discussed what we’re witnessing in the San Francisco tech environment that both excites and concerns us. AI hype is in full swing in the city: “AI” is woven into the marketing copy of every billboard off the 101 freeway, and startups and big tech alike are dedicating whole team offsites to brainstorming how to “revolutionize” their business with generative AI.
One friend shared an experience at her employer’s recent AI brainstorm day:
“We spent an entire day talking about how we can use more AI, create more AI. How we can build more and more AI,” she told me. “But we didn’t speak at all to any of the issues surrounding AI. It was like they didn’t exist.”
“It was like they didn’t exist.” “They” being the sometimes uncomfortable, often inconvenient questions related to AI development such as: how do we eliminate bias from training datasets? Is it ethical to use LLMs that have been trained on copyrighted materials? And — one of the biggest elephants in the room — how do we grapple with AI’s impacts on climate change?
AI’s climate impact
I’ve been shocked by how little the climate impact of developing AI is discussed, even among the circles of people building it. Typically, when AI is mentioned alongside climate change, it is cited as an important solution to the pressing climate problems at hand.
“AI will be our saving grace,” technologists predict. “We can’t solve climate change without the power of artificial intelligence.”
These arguments aren’t incorrect. There currently are applications of AI that combat climate pressures, including tools that detect wildfires more quickly and track global methane emissions on a daily basis.
But while AI will provide new insights that may help us tackle carbon emissions and global warming, it’s not widely known that AI itself contributes to these problems. Building, training, and using AI models requires vast amounts of energy and water: GPUs used to develop and maintain AI models use four times as much energy as those serving conventional cloud applications, for example. And researchers at UC Riverside found that training a large model like GPT-3 in US data centers can directly consume 700,000 liters of clean freshwater, enough to produce 370 BMWs or 320 Teslas. Those same researchers estimated that just one short ChatGPT conversation (of roughly 20-50 questions and answers) consumes a 500 mL bottle of water.
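To put those two figures side by side, here’s a rough back-of-envelope calculation using only the numbers cited above (the per-conversation figure is the UC Riverside estimate of one 500 mL bottle per short conversation):

```python
# Back-of-envelope arithmetic using the figures cited above.
# Assumptions: ~700,000 L of freshwater to train a GPT-3-scale model,
# and ~0.5 L (one 500 mL bottle) per short 20-50 question conversation.

TRAINING_WATER_L = 700_000    # liters, UC Riverside training estimate
WATER_PER_CONVO_L = 0.5       # liters per short ChatGPT conversation

# How many everyday conversations equal the water cost of one training run?
conversations_equivalent = TRAINING_WATER_L / WATER_PER_CONVO_L
print(f"{conversations_equivalent:,.0f} conversations")  # 1,400,000 conversations
```

In other words, by these estimates, one training run consumes as much water as roughly 1.4 million short conversations.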
These are very large, very concerning numbers — but they’re also not widely known. In my research for this article, I surveyed 15 friends and family to assess their knowledge of AI climate impact. Only about 20% were aware of the severity of AI’s climate footprint. There’s clearly not only work to do here in making AI more sustainable, but also in raising awareness amongst the general public that we need to do so.
It’s important to note that significant strides are being made toward more sustainable AI: researchers at the University of Minnesota Twin Cities, for example, have developed a “Computational Random-Access Memory” (CRAM) chip that could cut AI energy use by at least 1,000 times, while AI companies explore ways to make data centers more sustainable through renewable energy. Microsoft has even tried putting data centers underwater to keep them cool, an initiative that, while initially promising, was scaled back a month ago.
Sustainability initiatives are, for now, works in progress: hypotheticals that promise future benefits while AI models consume vast amounts of resources in the present. So until we have more sustainable methods and practices, here are three key points on AI’s climate impact that should help bring awareness to your and your team’s AI use:
1. Generating an image using an AI model consumes as much energy as fully charging your smartphone.
How many times have you used a platform like DALL·E 3 to generate an image, only to regenerate it, and regenerate it again — to ultimately be dissatisfied with its imperfections and abandon the image altogether?
Researchers at Hugging Face and Carnegie Mellon University measured the carbon emissions associated with popular AI tasks and found that generating images was the most energy- and carbon-intensive task: generating 1,000 images with a powerful AI model creates as much carbon dioxide as driving over four miles in a gasoline-powered car.
In a world where we’ve retrained our psyches to remember to bring our own bags to grocery stores, I believe this kind of needless resource consumption won’t be tolerated once more people are aware of its climate footprint. And, eventually, I think the excitement of generating a hyper-specific image tailored to your exact needs will simmer down. We might see sustainability movements that encourage “digital recycling”: reusing previously generated images rather than perpetually creating new content that suits our requirements and preferences exactly.
2. Using large language models (LLMs) is far more energy intensive than using small language models (SLMs).
When we use large language models (like GPT-4o), we’re creating a greater energy footprint than if we used a smaller model, like GPT-4o mini or Google’s Gemma models. SLMs are more compact, enabling them to run on less powerful hardware and use much less power. According to a UNESCO report, SLMs “may yet need to reach the raw performance levels of LLMs in handling highly complex tasks; however, initial reports suggest they can perform comparably well on narrower tasks if adequately trained and fine-tuned.”
So why aren’t we all using SLMs for our more basic workflows, like proofreading an essay or writing an email? Because we learned to use LLMs first, and that behavior is already ingrained. Many of us aren’t aware that there’s a more sustainable way of working with AI, so regardless of the task at hand, we tend to turn to an LLM for basic prompting, without thinking of the climate impact, when a smaller model could easily handle the request.
As the UNESCO report puts it, “Such usage patterns can be compared to blasting a cannon to swat a mosquito, resulting in unnecessary internet traffic and heightened computing power and energy consumption, as each prompt may entail substantial processing behind the scenes.” So the next time you’re about to open ChatGPT for a basic workflow, consider using a smaller model like Microsoft’s Phi-3-mini, which uses much less energy and which, according to Microsoft, even “performs better than models twice its size.”
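One lightweight way to build this habit is to decide which model to use before you open a chat window. Here’s a minimal sketch of that routing idea; the model names and the list of “simple” tasks are illustrative assumptions, not a prescribed setup:

```python
# A minimal sketch of "right-sizing" model choice per task.
# Model names are illustrative; swap in whichever small/large models you use.

SMALL_MODEL = "phi-3-mini"   # assumed name for a small language model
LARGE_MODEL = "gpt-4o"       # assumed name for a large language model

# Tasks a small model can usually handle well (a judgment call, not a spec).
SIMPLE_TASKS = {"proofread", "summarize", "draft_email", "rephrase"}

def pick_model(task: str) -> str:
    """Route simple tasks to the small model, everything else to the large one."""
    return SMALL_MODEL if task in SIMPLE_TASKS else LARGE_MODEL

print(pick_model("proofread"))         # phi-3-mini
print(pick_model("complex_analysis"))  # gpt-4o
```

The point isn’t the code itself but the default it encodes: reach for the small model first, and escalate to the large one only when the task genuinely demands it.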
3. Microsoft’s carbon emissions have risen 30% since 2020, and Google’s have risen 48% since 2019 — largely due to data center expansion that supports AI development.
AI development — especially training large language models — requires vast amounts of computational power, and as companies like Microsoft and Google zero in on training larger models, they need to expand data center facilities to meet these demands. As a consequence, their energy consumption and carbon footprints have skyrocketed.
Data centers run continuously. They’re non-stop operations that generate enormous heat, requiring extensive cooling systems, a process that itself consumes energy. As tech companies adopt a “stop at nothing” mindset toward developing generative AI, climate pledges have fallen by the wayside; if left unchecked, the exponential growth in AI-related emissions could actually accelerate climate change.
This is no small problem. Solving it will require a mix of regulation, industry-wide dedication to renewable energy sources and green AI initiatives, and greater required transparency and accountability from tech companies in their reporting. If we don’t prioritize such measures, we’re looking at a grim, exponential increase in consumption: the International Energy Agency projects that global electricity consumption from data centers and AI could more than double from 2022’s 460 terawatt-hours to over 1,000 terawatt-hours by 2026, roughly equal to Japan’s total electricity use (IEA).
Our use of AI tools — and, eventually, AI agents — is likely to increase. AI companies are continuously rolling out bigger and more effective versions of their products which require an increasing amount of energy to build and train, while knowledge workers are increasingly expected to use AI for productivity gains.
We are quickly sliding into reliance on a technology that we have not figured out how to build nor use sustainably. We are deploying technology to protect market share before thinking about how to protect our home planet.
Small shifts start with us. They start with bringing awareness to our daily use of AI, and they continue with holding AI companies to account to meet carbon emissions goals and work to craft sustainable solutions. As we consider how to resource our new AI future, we must also ensure that we, in parallel, prioritize the future of our most important resources: our water, our air, our Mother Earth.
Until next time,
Cecilia
RemAIning Human is written and edited by Cecilia Callas.
Interesting post, Cecilia. I like the idea of AI labs competing to make high-performing AI models with the least environmental impact during training and inference.
Similar to the varying levels of effort put into safety work, I think these distinctions can help the public better select the AI labs and models they want to support, the ones most closely aligned with their values.
Thank you! This is a topic that has bothered me immensely, and I'm so glad you're addressing it. Climate sustainability affects us all, and poorer people and countries are impacted by our use. Thanks for your strong ethic.