
AI tools vary in their environmental impact as energy demands grow
Generative AI systems like chatbots require vastly different amounts of energy to run, with the largest models emitting significantly more carbon despite offering limited gains in accuracy, new research shows.
Sachi Kitajima Mulkey reports for The New York Times.
In short:
- A Department of Energy report projects that U.S. data centers, driven by AI growth, could consume up to 12% of the nation’s electricity by 2028, up from 4.4% today, a shift that could deepen reliance on fossil fuels.
- A new study analyzing 14 open-source large language models found that models with advanced reasoning abilities used much more energy per answer, but were not significantly more accurate than smaller models.
- Because the energy source powering AI data centers varies by region, actual emissions differ widely depending on whether facilities rely on coal, natural gas, or renewables.
Key quote:
“We don’t always need the biggest, most heavily trained model to answer simple questions. Smaller models are also capable of doing specific things well.”
— Maximilian Dauner, Ph.D. student at Munich University of Applied Sciences and lead author of the study
Why this matters:
AI systems are expanding rapidly, with tools like chatbots embedded in everything from classrooms to corporate offices. But the energy required to train and operate these models — especially the largest ones — is not trivial. Each user query adds to the growing electricity demands of data centers, which are often powered by fossil fuels. As AI usage scales, the emissions generated could hinder efforts to reduce global carbon output. And because performance gains taper off with larger models, the environmental cost may not always be justified. Understanding the energy and carbon footprint behind everyday AI interactions is essential as society weighs the convenience of intelligent systems against the climate crisis.
Related: Elon Musk’s AI chatbot downplays climate risks, boosting fringe views