How Companies Can Mitigate AI’s Growing Environmental Footprint
Author: Christina Shim

By 2026, computing power dedicated to training AI is expected to increase tenfold over 2024 levels. As more power is expended, more resources are consumed, and we've seen exponential increases in energy and, perhaps more unexpectedly, water consumption. Some estimates even suggest that a large AI model generates more emissions over its lifetime than the average car. A recent report from Goldman Sachs found that by 2030, power demand driven by AI applications will increase by 160%.
We know there is palpable environmental risk to operating this way indefinitely, but we also know AI can be a powerful new tool for sustainability, accelerating how quickly we solve problems, helping us understand and cope with climate change, and supporting the nascent energy transition.
AI adoption is the new normal for businesses and governments seeking to enhance decision-making, increase business productivity, and lower costs. That’s why we need to consider more sustainable AI practices now, while also prioritizing AI use cases to power overall sustainability gains.
How can we effectively use AI and reap its benefits while minimizing environmental impact to the best of our collective ability?
Make Smart Choices About AI Models
An AI model has three phases—training, tuning, and inferencing—and there are opportunities to be more sustainable at every phase. At the start of an AI journey, business leaders should consider choosing a foundation model rather than creating and training a model from scratch. Compared to building a new model, foundation models can be custom tuned for specific purposes in a fraction of the time, with a fraction of the data, and for a fraction of the energy costs. This effectively “amortizes” upfront training costs over a long lifetime of different uses.
It’s also important to choose the right size foundation model. Most model families come in several sizes—3 billion, 8 billion, 20 billion, or more parameters. Bigger is not always better. A small model trained on high-quality, curated data can be more energy efficient and achieve the same results or better, depending on your needs. IBM research has found that some models trained on specific and relevant data can perform on par with ones that are three to five times larger, while running faster and consuming less energy. The good news for businesses is that this likely means lower costs and better outcomes too.
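The sizing trade-off above can be sketched with a back-of-envelope calculation. This is an illustrative simplification only—it assumes per-query inference energy scales roughly in proportion to parameter count, and the baseline size is a placeholder, not a measured figure:

```python
# Illustrative sketch: if per-query inference energy scales roughly with
# parameter count, a smaller model that still meets your quality bar can
# cut energy use substantially. The proportionality assumption and the
# 20B baseline are simplifications for illustration, not measurements.

def relative_inference_energy(params_billion: float,
                              baseline_params_billion: float = 20.0) -> float:
    """Energy per query relative to a baseline model, under the
    simplifying assumption that energy scales with parameter count."""
    return params_billion / baseline_params_billion

for size in (3, 8, 20):
    print(f"{size}B model: ~{relative_inference_energy(size):.0%} "
          f"of the 20B model's per-query energy")
```

On these assumptions, an 8B model that performs on par with a 20B one would use roughly 40% of its per-query energy—which is why curating training data to make smaller models viable pays off at inference time.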
Locate Your Processing Thoughtfully
Often, a hybrid cloud approach can help companies lower energy use by giving them flexibility about where processing takes place. With a hybrid approach, computing may happen in the cloud at data centers closest to where it’s needed. Other times, for security, regulatory, or other purposes, computing may happen “on prem”—in physical servers owned by a company.
A hybrid approach can support sustainability in two ways. First, it can help you colocate your data next to your processing, which can minimize how far the data must travel and add up to real energy savings over time. Second, this can let you choose processing locations with access to renewable power. For example, two data centers may offer similar performance for your needs, but one draws on hydropower and the other on coal.
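The second point—choosing among functionally equivalent locations by their power source—can be expressed as a simple selection on grid carbon intensity. A minimal sketch; the region names and gCO2e/kWh figures below are hypothetical placeholders, not real grid data:

```python
# A minimal sketch of choosing among functionally equivalent data-center
# regions by grid carbon intensity (grams CO2e per kWh). All region names
# and intensity values are hypothetical placeholders for illustration.

def greenest_region(regions: dict) -> str:
    """Return the region whose grid has the lowest carbon intensity."""
    return min(regions, key=regions.get)

candidates = {
    "region-hydro": 25.0,   # hypothetical hydro-heavy grid
    "region-coal": 820.0,   # hypothetical coal-heavy grid
    "region-mixed": 310.0,  # hypothetical mixed grid
}
print(greenest_region(candidates))  # prints "region-hydro"
```

In practice, real-time grid intensity data would replace the static table, and the choice would be weighed against latency, cost, and data-residency constraints.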
Lastly, it’s important to only use the processing you need. Many organizations over-provision how much compute power is standing ready for their needs when software already exists to do better. In one case of our own AI workloads, IBM was able to reduce the excess, standby “headroom” from the equivalent of 23 to 13 graphics processing units (GPUs), significantly lowering energy usage and freeing up high-demand GPUs for other purposes—with zero reduction in performance.
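The headroom example above lends itself to a quick savings estimate. A back-of-envelope sketch: the per-GPU power draw used here is a hypothetical round figure for illustration, not a measurement of any specific hardware:

```python
# Back-of-envelope on the headroom reduction described above: cutting
# standby capacity from 23 to 13 GPU-equivalents. The 400 W per-GPU
# draw is a hypothetical figure chosen for illustration only.

def standby_power_saved(before_gpus: int, after_gpus: int,
                        watts_per_gpu: float = 400.0) -> float:
    """Continuous standby power avoided, in watts."""
    return (before_gpus - after_gpus) * watts_per_gpu

saved_w = standby_power_saved(23, 13)
print(f"~{saved_w:.0f} W of continuous standby draw avoided "
      f"(~{saved_w * 24 * 365 / 1000:.0f} kWh per year)")
```

Even at these illustrative numbers, ten fewer idle GPU-equivalents avoids on the order of tens of megawatt-hours per year—before counting the cooling energy they would also have required.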
Use the Right Infrastructure
Once you’ve chosen an AI model, about 90% of its life will be spent in inferencing mode, where data is run through it to make a prediction or solve a task. Naturally, the majority of a model’s carbon footprint occurs here also, so organizations must invest time and capital in making data processing as sustainable as possible.
AI runs most efficiently on processors that support very specific types of math. It is well known that AI runs better on GPUs than on central processing units (CPUs), but neither was originally designed for AI. Increasingly, we are seeing new processor prototypes designed from scratch to run and train deep learning models faster and more efficiently. In some cases, these chips have been shown to be 14 times more energy efficient.
Energy-efficient processing is the single most important step to take, because it reduces the need for water-based cooling and even for additional renewable power, which often carries environmental costs of its own.
Go Open Source
Being open means more eyes on the code, more minds on the problems, and more hands on the solutions. That level of transparent collaboration can have a huge impact. For example, the open-source Kepler project—free and available to all—helps developers estimate the energy consumption of their code as they build it, allowing them to build code that achieves their goals without ignoring the energy trade-offs that will impact long-term costs and emissions.
Open source also means tapping the “wisdom of crowds” to make existing AI models better instead of tapping our energy grids to forever build new models. These models will let resource-limited organizations pursue cost-effective innovation and reassure skeptical organizations with flexibility, safety, and trustworthiness.
The largest open-source project in history—the internet—was originally used to share academic papers. Now, it underpins much of our economy and society.
Similarly, as we envision how AI may help bring about a better future, we must strive for innovation while simultaneously being mindful and responsible about the options we have and the natural resources involved.
TAKEAWAYS
As AI adoption accelerates, its energy and resource demands are increasing at an alarming rate. By 2030, AI-related power consumption is expected to rise by 160%. To balance AI’s benefits with its ecological impact, companies must adopt smarter, more sustainable AI practices.
Choose efficient AI models. Use foundation models rather than training from scratch; smaller, high-quality models can reduce energy costs.
Optimize processing locations. A hybrid cloud approach minimizes data travel and leverages renewable energy sources.
Use energy-efficient infrastructure. Specialized processors can be up to 14 times more energy efficient than traditional CPUs and GPUs.
Embrace open source. Shared AI models and energy-conscious coding practices improve efficiency and reduce redundant computing.
Plan for long-term sustainability. Strategic choices in AI development today can significantly cut environmental impact over time.