Responding to the climate impact of generative AI
by Adam Zewe | MIT News
Boston MA (SPX) Oct 01, 2025

In part 2 of our two-part series on generative artificial intelligence's environmental impacts, MIT News explores some of the ways experts are working to reduce the technology's carbon footprint.

The energy demands of generative AI are expected to continue increasing dramatically over the next decade.

For instance, an April 2025 report from the International Energy Agency predicts that the global electricity demand from data centers, which house the computing infrastructure to train and deploy AI models, will more than double by 2030, to around 945 terawatt-hours. While not all operations performed in a data center are AI-related, this total amount is slightly more than the energy consumption of Japan.

Moreover, an August 2025 analysis from Goldman Sachs Research forecasts that about 60 percent of the increasing electricity demands from data centers will be met by burning fossil fuels, increasing global carbon emissions by about 220 million tons. In comparison, driving a gas-powered car for 5,000 miles produces about 1 ton of carbon dioxide.

These statistics are staggering, but at the same time, scientists and engineers at MIT and around the world are studying innovations and interventions to mitigate AI's ballooning carbon footprint, from boosting the efficiency of algorithms to rethinking the design of data centers.

Considering carbon emissions

Talk of reducing generative AI's carbon footprint typically centers on "operational carbon" - the emissions produced by powering the processors, known as GPUs, inside a data center. It often ignores "embodied carbon," the emissions created by building the data center in the first place, says Vijay Gadepally, senior scientist at MIT Lincoln Laboratory, who leads research projects in the Lincoln Laboratory Supercomputing Center.

Constructing and retrofitting a data center, built from tons of steel and concrete and filled with air conditioning units, computing hardware, and miles of cable, generates a huge amount of carbon emissions. In fact, the environmental impact of building data centers is one reason companies like Meta and Google are exploring more sustainable building materials. (Cost is another factor.)

Plus, data centers are enormous buildings - the world's largest, the China Telecom-Inner Mongolia Information Park, covers roughly 10 million square feet - with about 10 to 50 times the energy density of a normal office building, Gadepally adds.

"The operational side is only part of the story. Some things we are working on to reduce operational emissions may lend themselves to reducing embodied carbon, too, but we need to do more on that front in the future," he says.

Reducing operational carbon emissions

When it comes to reducing operational carbon emissions of AI data centers, there are many parallels with home energy-saving measures. For one, we can simply turn down the lights.

"Even if you have the worst lightbulbs in your house from an efficiency standpoint, turning them off or dimming them will always use less energy than leaving them running at full blast," Gadepally says.

In the same fashion, research from the Supercomputing Center has shown that "turning down" the GPUs in a data center so they consume about three-tenths the energy has minimal impacts on the performance of AI models, while also making the hardware easier to cool.
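As a rough sketch of what "turning down" the GPUs means in practice, the toy calculation below caps each GPU at a fraction of its default power limit. All numbers here are hypothetical; on real hardware the cap is set with vendor tools (for example, `nvidia-smi --power-limit` on NVIDIA GPUs), and the right fraction depends on the workload.

```python
DEFAULT_POWER_W = 700  # hypothetical per-GPU board power limit

def capped_power(default_w: float, fraction: float = 0.3) -> float:
    """Return a power limit set to the given fraction of the default."""
    if not 0 < fraction <= 1:
        raise ValueError("fraction must be in (0, 1]")
    return default_w * fraction

def fleet_savings_w(n_gpus: int, default_w: float, fraction: float) -> float:
    """Instantaneous power saved across a fleet of capped GPUs."""
    return n_gpus * (default_w - capped_power(default_w, fraction))

print(capped_power(DEFAULT_POWER_W))                # per-GPU cap in watts
print(fleet_savings_w(576, DEFAULT_POWER_W, 0.3))   # fleet-wide savings in watts
```

The point of the sketch is that the savings scale linearly with fleet size, which is why even a modest cap matters at data-center scale.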

Another strategy is to use less energy-intensive computing hardware.

Demanding generative AI workloads, such as training new reasoning models like GPT-5, usually need many GPUs working simultaneously. The Goldman Sachs analysis estimates that a state-of-the-art system could soon have as many as 576 connected GPUs operating at once.

But engineers can sometimes achieve similar results by reducing the precision of computing hardware, perhaps by switching to less powerful processors that have been tuned to handle a specific AI workload.
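The idea of trading precision for energy can be illustrated with a toy quantization routine. The symmetric int8 scheme and the weight values below are illustrative only, not any particular vendor's method:

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: map floats to integers in [-127, 127]."""
    scale = max(abs(w) for w in weights) / 127 or 1.0  # guard all-zero input
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the integer codes."""
    return [v * scale for v in q]

weights = [0.52, -1.27, 0.003, 0.98]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Each restored value is within one quantization step of the original,
# but the integer codes need a quarter of the memory of 32-bit floats.
assert all(abs(a - b) <= scale for a, b in zip(weights, restored))
```

Lower-precision arithmetic moves less data and toggles fewer circuits per operation, which is where the energy savings come from.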

There are also measures that boost the efficiency of training power-hungry deep-learning models before they are deployed.

Gadepally's group found that about half the electricity used for training an AI model is spent to get the last 2 or 3 percentage points in accuracy. Stopping the training process early can save a lot of that energy.
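A minimal sketch of that trade-off, using a made-up per-epoch accuracy curve in place of real training:

```python
def train_with_early_stop(accuracy_curve, target=0.70):
    """Stop training once validation accuracy reaches a 'good enough' target.

    accuracy_curve simulates per-epoch validation accuracy; real training
    would evaluate the model each epoch instead of reading a list.
    """
    for epoch, acc in enumerate(accuracy_curve, start=1):
        if acc >= target:
            return epoch, acc
    return len(accuracy_curve), accuracy_curve[-1]

# Synthetic curve: most of the accuracy arrives early, and the last few
# fractions of a point cost many additional epochs.
curve = [0.40, 0.58, 0.66, 0.71, 0.72, 0.725, 0.728, 0.729, 0.730, 0.731]
epochs_used, acc = train_with_early_stop(curve, target=0.70)
print(f"stopped after {epochs_used}/{len(curve)} epochs at {acc:.2f} accuracy")
```

In this synthetic example the job stops after 4 of 10 epochs, mirroring the article's point that chasing the final few accuracy points consumes a disproportionate share of the energy.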

"There might be cases where 70 percent accuracy is good enough for one particular application, like a recommender system for e-commerce," he says.

Researchers can also take advantage of efficiency-boosting measures.

For instance, a postdoc in the Supercomputing Center realized the group might run a thousand simulations during the training process to pick the two or three best AI models for their project.

By building a tool that allowed them to avoid about 80 percent of those wasted computing cycles, they dramatically reduced the energy demands of training with no reduction in model accuracy, Gadepally says.
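One common pattern for avoiding such wasted cycles is to screen every candidate with a short, cheap evaluation and fully train only the best few. The sketch below uses hypothetical per-run costs and a stand-in scoring function; it is not the group's actual tool:

```python
import random

def cheap_score(config):
    """Stand-in for a short, low-cost evaluation of a candidate model."""
    return config["quality"] + random.uniform(-0.01, 0.01)

def select_candidates(configs, keep=3, full_cost=100, cheap_cost=1):
    """Screen all candidates cheaply, then fully train only the top few.

    Returns the survivors and the fraction of compute avoided versus
    fully training every candidate (costs in arbitrary units).
    """
    ranked = sorted(configs, key=cheap_score, reverse=True)
    survivors = ranked[:keep]
    naive = len(configs) * full_cost
    actual = len(configs) * cheap_cost + keep * full_cost
    return survivors, 1 - actual / naive

random.seed(0)
configs = [{"id": i, "quality": random.random()} for i in range(1000)]
survivors, saved = select_candidates(configs)
print(f"compute avoided: {saved:.0%}")
```

The exact savings depend on how cheap the screening pass is relative to a full run; the structure of the win, though, is the same as in the group's tool: most candidates never consume a full training budget.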

Leveraging efficiency improvements

Constant innovation in computing hardware, such as denser arrays of transistors on semiconductor chips, is still enabling dramatic improvements in the energy efficiency of AI models.

Even though energy efficiency improvements have been slowing for most chips since about 2005, the amount of computation that GPUs can do per joule of energy has been improving by 50 to 60 percent each year, says Neil Thompson, director of the FutureTech Research Project at MIT's Computer Science and Artificial Intelligence Laboratory and a principal investigator at MIT's Initiative on the Digital Economy.

"The still-ongoing 'Moore's Law' trend of getting more and more transistors per chip still matters for a lot of these AI systems, since running operations in parallel is still very valuable for improving efficiency," says Thompson.

Even more significant, his group's research indicates that efficiency gains from new model architectures that can solve complex problems faster, consuming less energy to achieve the same or better results, are doubling every eight or nine months.

Thompson coined the term "negaflop" to describe this effect. The same way a "negawatt" represents electricity saved due to energy-saving measures, a "negaflop" is a computing operation that doesn't need to be performed due to algorithmic improvements.

These could be things like "pruning" away unnecessary components of a neural network or employing compression techniques that enable users to do more with less computation.
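Magnitude pruning, one of the simplest such techniques, can be sketched in a few lines. Each zeroed weight stands in for a multiply-accumulate that a sparse kernel can skip - a "negaflop" in Thompson's terms. The weight values are hypothetical:

```python
def magnitude_prune(weights, fraction=0.5):
    """Zero out the smallest-magnitude weights.

    Returns the pruned weights and the count of zeroed entries, i.e.
    operations that sparsity-aware code no longer needs to perform.
    """
    k = int(len(weights) * fraction)
    threshold = sorted(abs(w) for w in weights)[k - 1] if k else 0.0
    pruned = [0.0 if abs(w) <= threshold else w for w in weights]
    negaflops = sum(1 for w in pruned if w == 0.0)
    return pruned, negaflops

weights = [0.8, -0.05, 0.3, 0.01, -0.9, 0.002, 0.4, -0.02]
pruned, skipped = magnitude_prune(weights, fraction=0.5)
print(pruned)   # [0.8, 0.0, 0.3, 0.0, -0.9, 0.0, 0.4, 0.0]
print(skipped)  # 4
```

Real pruning pipelines typically fine-tune the network afterward to recover any lost accuracy; the sketch only shows where the skipped operations come from.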

"If you need to use a really powerful model today to complete your task, in just a few years, you might be able to use a significantly smaller model to do the same thing, which would carry much less environmental burden. Making these models more efficient is the single-most important thing you can do to reduce the environmental costs of AI," Thompson says.

Maximizing energy savings

While reducing the overall energy use of AI algorithms and computing hardware will cut greenhouse gas emissions, not all energy is the same, Gadepally adds.

"The amount of carbon emissions in 1 kilowatt hour varies quite significantly, even just during the day, as well as over the month and year," he says.

Engineers can take advantage of these variations by leveraging the flexibility of AI workloads and data center operations to maximize emissions reductions. For instance, some generative AI workloads don't need to be performed in their entirety at the same time.

Splitting computing operations so some are performed later, when more of the electricity fed into the grid is from renewable sources like solar and wind, can go a long way toward reducing a data center's carbon footprint, says Deepjyoti Deka, a research scientist in the MIT Energy Initiative.
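Carbon-aware scheduling of a deferrable job can be sketched as a search over a grid carbon-intensity forecast. The 24-hour forecast values below are hypothetical, shaped so that midday solar pushes intensity down:

```python
def best_window(carbon_intensity, job_hours):
    """Pick the start hour with the lowest total carbon for a deferrable job.

    carbon_intensity: forecast grid carbon intensity (gCO2/kWh) per hour.
    Returns (best start hour, total intensity over the job's window).
    """
    best_start, best_total = 0, float("inf")
    for start in range(len(carbon_intensity) - job_hours + 1):
        total = sum(carbon_intensity[start:start + job_hours])
        if total < best_total:
            best_start, best_total = start, total
    return best_start, best_total

# Hypothetical hourly forecast in gCO2/kWh, hours 0-23.
forecast = [450, 440, 430, 420, 410, 400, 380, 340, 280, 220, 180, 150,
            140, 150, 190, 250, 320, 390, 430, 460, 470, 465, 455, 450]
start, total = best_window(forecast, job_hours=4)
print(f"run the 4-hour job starting at hour {start}")
```

With this forecast, the scheduler lands the job in the late-morning solar trough rather than the evening peak, which is exactly the shift the researchers describe.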

Deka and his team are also studying "smarter" data centers where the AI workloads of multiple companies using the same computing equipment are flexibly adjusted to improve energy efficiency.

"By looking at the system as a whole, our hope is to minimize energy use as well as dependence on fossil fuels, while still maintaining reliability standards for AI companies and users," Deka says.

He and others at MITEI are building a flexibility model of a data center that considers the differing energy demands of training a deep-learning model versus deploying that model. Their hope is to uncover the best strategies for scheduling and streamlining computing operations to improve energy efficiency.

The researchers are also exploring the use of long-duration energy storage units at data centers, which store excess energy for times when it is needed.

With these systems in place, a data center could use stored energy that was generated by renewable sources during a high-demand period, or avoid the use of diesel backup generators if there are fluctuations in the grid.
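A toy dispatch loop illustrates the idea, assuming hourly steps, a hypothetical load and renewable profile, and a simple one-way efficiency applied on charging; real storage controllers are far more sophisticated:

```python
def dispatch(load_kw, renewable_kw, capacity_kwh, efficiency=0.9):
    """Greedy battery dispatch: charge on renewable surplus, discharge to
    cover deficits, and report the diesel backup energy still required."""
    stored = 0.0
    diesel_kwh = 0.0
    for load, renew in zip(load_kw, renewable_kw):
        if renew >= load:
            # Surplus hour: bank what the battery can hold, minus losses.
            stored = min(capacity_kwh, stored + (renew - load) * efficiency)
        else:
            # Deficit hour: drain storage first, fall back to diesel.
            deficit = load - renew
            draw = min(stored, deficit)
            stored -= draw
            diesel_kwh += deficit - draw
    return diesel_kwh

load = [100, 100, 100, 100]     # constant data-center load (kW, 1 h steps)
renewable = [180, 160, 40, 20]  # renewables fade in the later hours
print(dispatch(load, renewable, capacity_kwh=200))
```

In the sketch, morning surplus covers most of the evening deficit, leaving only a small residual for backup generation - the emission-mix shift Deka describes.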

"Long-duration energy storage could be a game-changer here because we can design operations that really change the emission mix of the system to rely more on renewable energy," Deka says.

In addition, researchers at MIT and Princeton University are developing a software tool for investment planning in the power sector, called GenX, which could be used to help companies determine the ideal place to locate a data center to minimize environmental impacts and costs.

Location can have a big impact on reducing a data center's carbon footprint. For instance, Meta operates a data center in Lulea, a city on the coast of northern Sweden where cooler temperatures reduce the amount of electricity needed to cool computing hardware.

Thinking farther outside the box (way farther), some governments are even exploring the construction of data centers on the moon where they could potentially be operated with nearly all renewable energy.

AI-based solutions

Currently, the expansion of renewable energy generation here on Earth isn't keeping pace with the rapid growth of AI, which is one major roadblock to reducing its carbon footprint, says Jennifer Turliuk MBA '25, a short-term lecturer, former Sloan Fellow, and former practice leader of climate and energy AI at the Martin Trust Center for MIT Entrepreneurship.

The local, state, and federal review processes required for new renewable energy projects can take years.

Researchers at MIT and elsewhere are exploring the use of AI to speed up the process of connecting new renewable energy systems to the power grid.

For instance, a generative AI model could streamline interconnection studies that determine how a new project will impact the power grid, a step that often takes years to complete.

And when it comes to accelerating the development and implementation of clean energy technologies, AI could play a major role.

"Machine learning is great for tackling complex situations, and the electrical grid is said to be one of the largest and most complex machines in the world," Turliuk adds.

For instance, AI could help optimize the prediction of solar and wind energy generation or identify ideal locations for new facilities.

It could also be used to perform predictive maintenance and fault detection for solar panels or other green energy infrastructure, or to monitor the capacity of transmission wires to maximize efficiency.
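As a toy illustration of fault detection (a crude stand-in for the ML models the article envisions, not a real one), panels producing far below the array median can be flagged for maintenance. The panel IDs, outputs, and tolerance are hypothetical:

```python
def flag_faulty_panels(outputs_w, tolerance=0.7):
    """Flag panels producing well below the array median output as
    candidates for inspection or predictive maintenance."""
    ranked = sorted(outputs_w.values())
    median = ranked[len(ranked) // 2]
    return [pid for pid, w in outputs_w.items() if w < tolerance * median]

panels = {"A1": 310, "A2": 305, "A3": 120, "A4": 298, "A5": 315}
print(flag_faulty_panels(panels))  # ['A3']
```

A production system would account for shading, weather, and sensor noise; the sketch only shows the shape of the anomaly-detection task.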

By helping researchers gather and analyze huge amounts of data, AI could also inform targeted policy interventions aimed at getting the biggest "bang for the buck" from areas such as renewable energy, Turliuk says.

To help policymakers, scientists, and enterprises consider the multifaceted costs and benefits of AI systems, she and her collaborators developed the Net Climate Impact Score.

The score is a framework that can be used to help determine the net climate impact of AI projects, considering emissions and other environmental costs along with potential environmental benefits in the future.

At the end of the day, the most effective solutions will likely result from collaborations among companies, regulators, and researchers, with academia leading the way, Turliuk adds.

"Every day counts. We are on a path where the effects of climate change won't be fully known until it is too late to do anything about it. This is a once-in-a-lifetime opportunity to innovate and make AI systems less carbon-intense," she says.
