AI is simultaneously one of the fastest-growing sources of carbon emissions and one of the most powerful tools for fighting climate change. This piece follows the data on both sides of a paradox that will define the decade.
The Carbon Cost of Intelligence
Start with the uncomfortable number. Global data centers consumed approximately 415 terawatt-hours of electricity in 2024, roughly 1.5% of all electricity generated worldwide. That figure is projected to reach 945 TWh by 2030, driven overwhelmingly by AI workloads. In the United States alone, data centers used 183 TWh in 2024, and the Department of Energy projects that number will surge to 426 TWh by 2030, a 133% increase.
These are not abstract numbers. They translate directly into carbon. According to a 2025 study published in Nature Sustainability, the carbon footprint of AI systems could reach between 32.6 and 79.7 million tons of CO2 in 2025, with a water footprint of 312 to 765 billion liters. For context, 79.7 million tons is roughly equivalent to the annual emissions of Belgium.
The water numbers are equally striking. Training GPT-3, a model that is now three generations behind the frontier, directly evaporated 700,000 liters of freshwater at Microsoft’s data centers for cooling alone. That does not include the indirect water consumed by the power plants generating electricity for those servers. U.S. data centers directly consumed 66 billion liters of water in 2023, more than triple the 2014 figure. And these facilities are increasingly being built in regions already facing water stress.
Ireland, a European data center hub, now dedicates 21% of its national electricity supply to data centers, a share the IEA estimates could reach 32% by 2026. In Virginia, the U.S. state with the highest concentration of data centers, these facilities consume 26% of the state's electricity. Around Dublin, the figure reaches a staggering 79%. This is not a future problem. It is a current one.
How Fossil Fuels Are Filling the Gap
The inconvenient truth behind the AI boom is where the electricity comes from. Despite ambitious corporate pledges to run on 100% renewable energy, the math does not add up at the current pace of expansion.
A Goldman Sachs Research analysis projected that approximately 60% of the increased electricity demand from data centers will be met by burning fossil fuels, adding an estimated 220 million tons of CO2 to global emissions. This is not because tech companies prefer fossil fuels. It is because renewable energy deployment cannot keep pace with the exponential growth in AI compute demand. Building a solar farm takes 3 to 5 years from permitting to operation. Building a data center takes 18 months.
The result is a growing gap between corporate climate pledges and operational reality. Google’s greenhouse gas emissions rose 48% between 2019 and 2024, despite the company’s stated goal of achieving net-zero emissions by 2030. Microsoft’s emissions increased 29% over a similar period. Both companies attributed the increases primarily to data center expansion for AI services.
| Metric | 2024 Baseline | 2030 Projection | Change |
|---|---|---|---|
| Global data center electricity | 415 TWh | 945 TWh | +128% |
| U.S. data center electricity | 183 TWh | 426 TWh | +133% |
| AI-specific CO2 emissions | 32.6-79.7 Mt (2025 est.) | Not reliably projected | Growing |
| Data center water use (U.S.) | 66B liters | Not yet projected | Rising fast |
| Ireland grid share | 21% | 32% | +11 pts |
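The percentage changes in the table can be sanity-checked with a few lines of arithmetic; the inputs below are the article's own figures:

```python
# Back-of-envelope check of the growth figures in the table above.
# All inputs are the article's own numbers.

def pct_change(baseline: float, projection: float) -> float:
    """Percentage change from baseline to projection."""
    return (projection - baseline) / baseline * 100

global_growth = pct_change(415, 945)   # global data center TWh, 2024 -> 2030
us_growth = pct_change(183, 426)       # U.S. data center TWh, 2024 -> 2030

print(f"Global: +{global_growth:.0f}%")  # +128%
print(f"U.S.:   +{us_growth:.0f}%")      # +133%
```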
Nuclear power is re-entering the conversation as a result. Microsoft signed a deal in 2024 to restart a unit at the Three Mile Island nuclear plant in Pennsylvania specifically to power data centers. Amazon acquired a nuclear-powered data center campus in Pennsylvania. Google signed agreements with Kairos Power for small modular reactors. These moves reflect a candid assessment within the industry: the AI scaling trajectory requires baseload power that renewables alone cannot yet provide.
The Other Side: AI as a Climate Solution
Here is where the story gets complicated, because the same technology driving those emissions is also producing some of the most powerful climate solutions ever deployed.
Energy grid optimization is the most immediate win. AI models predict sunlight intensity, wind patterns, and electricity demand in real time, enabling grid operators to balance supply and demand dynamically rather than relying on carbon-intensive peaker plants. Google’s DeepMind reduced the energy used for cooling its own data centers by 40% using reinforcement learning. When applied to national grids, similar optimization techniques have reduced curtailment, the waste of excess renewable energy, by 15 to 20% in pilot programs across Europe and the United States.
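The dispatch logic behind those curtailment reductions can be illustrated with a toy calculation. This is a deliberately simplified sketch, not any operator's actual algorithm: given an hourly renewable forecast, shifting flexible demand into surplus hours cuts the energy that would otherwise be wasted.

```python
# Toy illustration of forecast-driven load shifting. All figures are
# invented for illustration; real grid dispatch is far more complex.

def curtailment(supply, demand):
    """Renewable energy wasted when hourly supply exceeds demand (MWh)."""
    return sum(max(s - d, 0) for s, d in zip(supply, demand))

supply = [120, 150, 90, 60]    # forecast renewable output per hour, MWh
demand = [100, 100, 100, 100]  # baseline demand per hour, MWh

# Shift 20 MWh of flexible demand into the hour with the largest
# forecast surplus, out of an hour already running a deficit.
shifted = [100, 120, 100, 80]

print(curtailment(supply, demand))   # 70 MWh wasted without shifting
print(curtailment(supply, shifted))  # 50 MWh wasted with shifting
```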
Weather prediction has been transformed. NVIDIA’s Earth-2 platform generates high-resolution weather forecasts at a fraction of the cost and time of traditional numerical models. Better forecasts mean better integration of wind and solar power, since grid operators can anticipate renewable output hours or days in advance and plan accordingly. The European Centre for Medium-Range Weather Forecasts found that AI-based models matched or exceeded the accuracy of their physics-based models for 10-day forecasts while running 1,000 times faster.
Carbon capture and sequestration benefits from AI-driven optimization of injection rates, reservoir modeling, and monitoring. AI models simulate how CO2 behaves underground at speeds that traditional computational fluid dynamics cannot match, reducing the cost and uncertainty of geological storage. Similarly, AI-powered monitoring of methane leaks from oil and gas infrastructure using satellite imagery has identified thousands of previously undetected super-emitter events.
Materials science may deliver the largest long-term impact. DeepMind’s GNoME project used AI to discover 2.2 million new crystal structures, including hundreds of candidates for more efficient solar cells, better battery cathodes, and improved catalysts for green hydrogen production. Traditional materials discovery takes years per candidate. AI screens thousands per hour. The potential downstream effect on renewable energy costs is difficult to overstate.
The Accounting Problem Nobody Has Solved
The fundamental challenge is this: nobody has a reliable way to measure whether AI’s climate benefits outweigh its climate costs.
When Google uses AI to optimize a wind farm and reduces curtailment by 15%, the avoided emissions are real but counterfactual. They represent energy that would have been wasted but was not. Meanwhile, the emissions from the GPU cluster that trained the optimization model are concrete and measurable. Climate accounting systems are designed to track actual emissions, not hypothetical savings. This asymmetry makes it nearly impossible to produce an honest net assessment.
A November 2025 study from Cornell University proposed a framework that could cut CO2 impacts by approximately 73% and water impacts by 86% compared to worst-case scenarios through smart siting, faster grid decarbonization, and operational efficiency improvements. The key finding was that where you build matters enormously. A data center powered by Iowa’s wind-heavy grid produces a fraction of the emissions of an identical facility on West Virginia’s coal-heavy grid. But market forces push data centers toward cheap land and available power, not toward the cleanest grids.
The MIT Technology Review published a detailed analysis in 2025 examining AI’s full energy footprint. Their conclusion: the narrative that AI will solve climate change and the narrative that AI is destroying the climate are both oversimplifications. The truth depends entirely on policy choices, corporate accountability, and whether the efficiency gains from AI applications are deployed at sufficient scale to offset the growing energy appetite of AI infrastructure itself.
The efficiency paradox: AI makes individual processes more efficient, but efficiency gains historically increase total consumption rather than reducing it. This pattern, known as the Jevons paradox, suggests that making AI inference cheaper and faster will lead to dramatically more AI usage, potentially overwhelming any per-query efficiency improvements. Managing this dynamic is the central policy challenge of AI’s climate impact.
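A toy calculation makes the Jevons dynamic concrete. The efficiency and usage multipliers below are illustrative, not forecasts:

```python
# Illustrative Jevons arithmetic: per-query energy falls 10x, but if
# usage grows 50x, total energy still rises 5x. All numbers invented.

energy_per_query_old = 0.03   # kWh, illustrative
queries_per_day_old = 1e8     # illustrative

efficiency_gain = 10          # each query now costs 1/10th the energy
usage_growth = 50             # but query volume grows 50x

total_old = energy_per_query_old * queries_per_day_old                        # kWh/day
total_new = (energy_per_query_old / efficiency_gain) * (queries_per_day_old * usage_growth)

print(f"{total_new / total_old:.1f}x")  # 5.0x: consumption up despite efficiency
```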
What Would Tip the Balance
The question is not whether AI will have a net positive or negative climate impact. The question is what has to change to ensure the balance tips in the right direction.
Mandatory carbon reporting for AI workloads is the first prerequisite. Currently, most AI companies report emissions at the corporate level, bundling AI compute with office heating and employee commuting. Workload-level carbon accounting, measuring the emissions of specific training runs and inference deployments, would create the transparency needed for informed decisions. The EU is moving in this direction with extensions to its Corporate Sustainability Reporting Directive, but implementation timelines remain unclear.
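What workload-level accounting would actually compute is straightforward. A minimal sketch using the commonly cited estimate (GPUs × power × hours × PUE × grid carbon intensity), with entirely hypothetical inputs rather than measurements of any real training run:

```python
# Minimal sketch of workload-level carbon accounting.
# All inputs below are hypothetical, chosen only for illustration.

def training_run_co2(gpus: int, gpu_kw: float, hours: float,
                     pue: float, grid_kg_per_kwh: float) -> float:
    """Estimated CO2 in metric tons for one training run."""
    energy_kwh = gpus * gpu_kw * hours * pue  # facility-level energy
    return energy_kwh * grid_kg_per_kwh / 1000  # kg -> metric tons

# Hypothetical run: 1,000 GPUs at 0.7 kW each for 30 days,
# PUE 1.2, on a grid emitting 0.4 kg CO2 per kWh.
print(f"{training_run_co2(1000, 0.7, 30 * 24, 1.2, 0.4):.1f} t CO2")
```

The same formula run against Iowa's wind-heavy grid versus a coal-heavy one differs only in the final `grid_kg_per_kwh` term, which is why siting dominates the outcome.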
Renewable energy procurement must accelerate dramatically. Google, Microsoft, and Amazon have collectively pledged to become “water positive” and carbon neutral or negative by 2030. Meeting these goals while doubling or tripling AI compute will require renewable energy deployment at a pace never previously achieved. Industry groups estimate that AI data centers alone will need 80 to 120 gigawatts of new clean energy capacity by 2030, roughly equivalent to the entire installed solar capacity of the United States today.
Model efficiency research deserves as much funding as model capability research. Techniques like distillation, quantization, and mixture-of-experts architectures have already reduced the compute cost of inference by orders of magnitude compared to 2022 baselines. But these gains are being reinvested in running larger models rather than in reducing total energy consumption. Directing even a fraction of AI R&D budgets toward efficiency-first optimization could change the trajectory meaningfully.
Three concrete actions would make the largest difference in the shortest time: require data centers to report energy consumption per unit of AI compute, incentivize co-location of data centers with renewable energy generation, and fund independent auditing of corporate AI climate claims. None of these require new technology. They require political will and institutional accountability.
The climate impact of AI is not predetermined. It is a choice, distributed across thousands of decisions about where to build, how to power, and what to prioritize. The technology itself is neutral. The infrastructure and policy around it are not.
Frequently Asked Questions
**How much energy does a single AI query use?**

A single ChatGPT query consumes roughly 10 times more electricity than a standard Google search, approximately 0.01 to 0.03 kWh depending on the model and response length. That sounds small, but at hundreds of millions of queries per day, it adds up. The IEA estimates that if AI chatbot queries reached the scale of Google Search, the additional electricity demand would be 10 TWh per year, roughly the annual consumption of 1.5 million U.S. homes. Training a large model is far more energy-intensive: a single training run for a frontier model can consume several gigawatt-hours, equivalent to the annual electricity use of hundreds of households.
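The scale-up arithmetic can be roughly reproduced under illustrative assumptions. The query volume and per-query delta below are not official IEA inputs; they are chosen to be consistent with the ranges quoted above:

```python
# Illustrative reconstruction of the 10 TWh/year estimate. Assumes
# ~3 billion queries per day and ~0.009 kWh of *extra* energy per
# query (low end of the chatbot range minus the ordinary search it
# replaces). Both assumptions are hypothetical.

extra_kwh_per_query = 0.010 - 0.001  # chatbot minus ordinary search
queries_per_day = 3e9

annual_twh = extra_kwh_per_query * queries_per_day * 365 / 1e9
print(f"{annual_twh:.1f} TWh/year")  # ~9.9, close to the 10 TWh estimate
```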
**Are tech companies on track to meet their climate pledges?**

Mostly not, at the pace required. Google's emissions rose 48% between 2019 and 2024 despite a 2030 net-zero goal. Microsoft's increased 29%. Both attributed the growth to data center expansion. The pledges remain in place, and both companies are investing heavily in renewable energy and nuclear power, but the gap between committed goals and current trajectory is widening. The core tension is that AI revenue growth is outpacing clean energy deployment, and no major company has shown willingness to throttle AI expansion to meet climate targets.
**Could AI end up net positive for the climate?**

Potentially, but only with deliberate policy intervention. The Cornell roadmap study showed that smart siting and grid decarbonization could reduce AI's carbon impact by 73% and water impact by 86%. If those reductions are achieved while AI continues to optimize energy grids, accelerate materials discovery, and improve climate modeling, the net balance could be positive. But this outcome requires proactive choices. Left to market forces alone, the growth in AI energy consumption is likely to outpace the efficiency gains that AI delivers to other sectors.