Estimate the carbon footprint of AI and machine learning operations. Calculate CO₂ emissions from model training, inference workloads, and data center energy consumption with regional carbon intensity factors.
AI and machine learning workloads consume massive amounts of energy. Training GPT-3 emitted an estimated 552 tonnes of CO₂, roughly equivalent to driving 1.2 million miles in an average car. Our AI Carbon Footprint Calculator helps you estimate emissions from training and running AI models based on GPU usage, energy consumption, and regional carbon intensity.
AI carbon footprint comes from two main sources: training (initial model creation, typically done once) and inference (running the model for predictions, ongoing). Training large models can take weeks on hundreds of GPUs, while inference emissions accumulate over millions of daily queries.
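A first-order training energy estimate can be sketched from GPU count, run time, per-GPU power draw, and the data center's PUE. The figures below (300 W average draw, PUE 1.2, 512 GPUs for three weeks) are illustrative assumptions, not measurements of any particular model:

```python
def training_energy_kwh(num_gpus: int, hours: float,
                        gpu_watts: float = 300, pue: float = 1.2) -> float:
    """Estimated training energy in kWh.

    Energy = GPUs × hours × average per-GPU draw (kW), scaled by the
    data center's PUE (total facility power / IT equipment power).
    """
    return num_gpus * hours * (gpu_watts / 1000) * pue

# Illustrative run: 512 GPUs for 3 weeks (504 h) at 300 W, PUE 1.2
energy = training_energy_kwh(512, 504)  # ~93,000 kWh
```

Real runs vary widely with GPU model, utilization, and cooling, so treat the result as order-of-magnitude only, consistent with the caveats at the end of this page.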
Carbon Footprint Formula
CO₂ (g) = Energy (kWh) × Carbon Intensity (gCO₂/kWh)

Companies increasingly need to report Scope 3 emissions, including cloud computing and AI workloads.
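The formula above translates directly into code; the 1,000 kWh / 400 gCO₂ figures in the example call are arbitrary illustrative inputs:

```python
def co2_grams(energy_kwh: float, intensity_g_per_kwh: float) -> float:
    """CO₂ in grams = energy (kWh) × grid carbon intensity (gCO₂/kWh)."""
    return energy_kwh * intensity_g_per_kwh

# 1,000 kWh on a grid at 400 gCO₂/kWh:
co2_grams(1000, 400)  # 400,000 g = 400 kg CO₂
```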
Energy consumption directly correlates with cloud costs. Reducing carbon often means reducing expenses.
Carbon intensity varies 20x between coal-heavy grids and renewable data centers. Choose greener regions.
Smaller, efficient models may suffice. Compare carbon cost of different model sizes before committing.
Training GPT-3 (175B parameters) produced an estimated 552 tonnes of CO₂, roughly the annual emissions of 35 average Americans. GPT-4's training footprint is estimated at 10-50x more. Smaller models like LLaMA 7B produce roughly 50-100 tonnes depending on the training setup and data center location.
Training is intensive but happens once; inference happens continuously. For widely used models serving millions of daily queries, cumulative inference emissions can exceed training emissions within months. ChatGPT's inference is estimated to emit 25,000+ tonnes of CO₂ annually, far exceeding its training footprint.
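The training-vs-inference crossover can be estimated with a short break-even calculation. The per-query figure of 4 gCO₂ below is a hypothetical assumption for illustration, not a measured value for any specific model:

```python
def months_to_exceed_training(training_tonnes: float,
                              queries_per_day: float,
                              g_per_query: float) -> float:
    """Months until cumulative inference CO₂ exceeds one-time training CO₂."""
    daily_tonnes = queries_per_day * g_per_query / 1e6  # grams → tonnes
    return training_tonnes / (daily_tonnes * 30)        # 30-day months

# Illustrative: 552 t training, 10M queries/day at an assumed 4 g/query
months_to_exceed_training(552, 10_000_000, 4)
```

Under these assumed inputs the break-even arrives in well under a month, which illustrates why inference dominates the lifetime footprint of heavily used models.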
Key strategies: 1) Use renewable-powered data centers (Google, Azure offer carbon-neutral options), 2) Choose efficient model architectures, 3) Use quantization to reduce compute, 4) Implement caching to avoid redundant inference, 5) Schedule training during low-carbon grid hours.
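Strategy 4 (caching to avoid redundant inference) can be sketched with Python's standard `functools.lru_cache`; the uppercase stand-in below replaces a real model call, and the `CALLS` counter is only there to make the saving visible:

```python
from functools import lru_cache

CALLS = 0  # counts actual "model" invocations

@lru_cache(maxsize=10_000)
def answer(prompt: str) -> str:
    """Repeated identical prompts are served from cache at zero extra
    GPU energy; only cache misses reach the expensive call."""
    global CALLS
    CALLS += 1
    return prompt.upper()  # stand-in for an expensive model call

answer("hello")
answer("hello")  # cache hit: no second invocation
answer("hi")
# CALLS is now 2, not 3
```

Production systems typically use a shared cache (e.g. keyed on a normalized prompt) rather than an in-process one, but the energy argument is the same: every cache hit is an inference you did not pay for.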
Carbon intensity varies dramatically: 20-50 gCO₂/kWh for renewable grids (Norway, hydro-powered), 200-300 for Europe average, 400+ for US coal states, 500+ for parts of Asia. Running on 100% renewable energy can reduce emissions by 90%+ vs coal-heavy grids.
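The regional spread above can be made concrete by pricing the same workload in several grids. The intensity values in the table are single illustrative points drawn from the ranges just given, not live grid data:

```python
# Illustrative grid intensities (gCO₂/kWh), one point per range above
GRID_INTENSITY = {
    "norway_hydro": 30,
    "europe_avg": 250,
    "us_coal_heavy": 450,
    "asia_coal_heavy": 550,
}

def compare_regions(energy_kwh: float) -> dict:
    """kg CO₂ for the same workload run in each region."""
    return {region: energy_kwh * g / 1000
            for region, g in GRID_INTENSITY.items()}

compare_regions(10_000)  # a 10 MWh workload
```

For the 10 MWh example, the hydro-powered grid emits about 300 kg versus roughly 5.5 tonnes on the dirtiest grid, the ~18-20x spread mentioned earlier. For live figures, query your cloud provider's carbon dashboard or a grid-intensity service rather than static constants.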
These are order-of-magnitude estimates. Actual emissions depend on: GPU utilization rates, data center PUE (power usage effectiveness), cooling efficiency, precise carbon intensity at runtime, and whether carbon offsets are applied. For precise reporting, use cloud provider carbon dashboards.