How the Rising Use of Artificial Intelligence Impacts the Environment—and Paths to Sustainable AI
The AI Energy Footprint: Growth and Projections
Artificial intelligence workloads are sending data-center energy use and emissions soaring. Industry estimates vary, but all point to steep growth. The International Energy Agency (IEA) projects global data-center electricity demand will roughly double by 2030 (to ~945 TWh), fueled in part by AI (iea.org). MIT researchers similarly note data-center use jumped from ~460 TWh in 2022 to an expected ~1,050 TWh by 2026. To put this in context, at 460 TWh data centers would rank between France and Saudi Arabia in annual electricity consumption; at 1,050 TWh they would be comparable to Japan.
IDC forecasts a 44.7% CAGR in AI-related data-center energy: from about 23 TWh in 2022 to 146 TWh by 2027 – roughly the annual electricity consumption of a large country, exceeding the 2021 usage of Sweden or Argentina. Crucially, AI training and inference require far more power than typical IT tasks: GPUs and specialized accelerators drive up power draw, making AI infrastructure often 5–10× more energy-hungry than general-purpose cloud servers.
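As a quick sanity check on the arithmetic (not on IDC's underlying assumptions), a few lines of Python confirm that a 44.7% compound annual rate over five years does carry ~23 TWh to ~146 TWh:

```python
# Verify the IDC projection's compound-growth arithmetic:
# does 44.7% CAGR take ~23 TWh (2022) to ~146 TWh (2027)?
base_twh = 23.0   # AI-related data-center demand, 2022
cagr = 0.447      # IDC's projected compound annual growth rate
years = 5         # 2022 -> 2027

projected = base_twh * (1 + cagr) ** years
print(f"Projected 2027 demand: ~{projected:.0f} TWh")  # ~146 TWh
```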
Already today, data centers account for roughly 1–3% of global electricity use and CO₂ emissions. A recent analysis reported that data centers and the global aviation industry each emit about 3% of world carbon emissions, making them roughly tied. As AI ramps up, these shares could balloon: one estimate suggests data centers might need up to 21% of global electricity by 2030 once consumer AI usage (chatbots, image generation, etc.) is fully counted. Likewise, SPhotonix research warns that by 2026 data centers could consume as much electricity as Japan (~1,000 TWh/year) and produce ~2.5 billion tonnes of CO₂ by 2030.
By comparison, traditional IT and internet services (search, email, video streaming) have relatively stable per-task energy profiles, so this shift is striking. AI queries are much heavier: a Goldman Sachs report found one ChatGPT question uses ~10× the electricity of a Google search. Engineers note that processing a million tokens of text (on the order of a thousand typical ChatGPT exchanges) emits about the same carbon as driving a car 5–20 miles, and even generating a single AI image consumes as much energy as a smartphone recharge.
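A back-of-envelope sketch of where that ~10× figure comes from, using two widely cited per-query estimates (~0.3 Wh for a Google search, ~2.9 Wh for a ChatGPT query – rough estimates, not measurements):

```python
# Per-query energy comparison behind the "~10x" claim.
# Both figures are widely cited estimates, not measurements.
google_search_wh = 0.3   # est. energy per Google search
chatgpt_query_wh = 2.9   # est. energy per ChatGPT query

ratio = chatgpt_query_wh / google_search_wh
print(f"ChatGPT query vs. Google search: ~{ratio:.0f}x")  # ~10x
```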
Overall, analysts warn that AI's energy impact is growing fast. Already, data centers broadly (AI and non-AI combined) use about 1–2% of global power, near airline-industry levels. If unchecked, AI-driven demand could push that share to double digits by 2030. Most projections do factor in improved efficiency and renewable adoption; even so, carbon output is trending upward. IDC notes AI data centers are expected to contribute ~15% of all data-center carbon emissions by 2027, with emissions growing despite efficiency gains.
Key Environmental Challenges from AI
AI’s surge poses multiple environmental challenges beyond just more electricity. Three critical issues stand out: intense energy use per model (training & inference), e-waste from specialized hardware, and heavy water use for cooling.
Energy-intensive Model Training and Inference
Training large AI models is extremely power-intensive. For instance, an MIT analysis reports that training GPT-3 (175B parameters) consumed about 1,287 MWh of electricity and emitted roughly 502 metric tons of CO₂ (equivalent to the annual emissions of ~112 cars). Even earlier, a widely cited 2019 study (led by UMass Amherst researchers and popularized by MIT Technology Review) found training one large NLP model could emit 626,000 lbs (~284 metric tons) of CO₂. These figures cover only a single training run; research teams often train or fine-tune models repeatedly for benchmarking and improvements, compounding the cost.
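Those two GPT-3 figures are mutually consistent: dividing emissions by energy implies a grid carbon intensity close to the US average of the period, a useful sanity check on reported numbers:

```python
# Implied grid carbon intensity from the GPT-3 training figures above:
# 502 t CO2 over 1,287 MWh.
energy_mwh = 1287
emissions_t = 502

intensity = emissions_t * 1e6 / (energy_mwh * 1e3)  # grams CO2 per kWh
print(f"Implied intensity: ~{intensity:.0f} gCO2/kWh")  # ~390 gCO2/kWh
```

At ~390 gCO₂/kWh, the implied mix is fossil-heavy; the same run on a near-renewable grid would emit an order of magnitude less, which is why siting matters so much (see the Nordic case studies below).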
Once models are deployed, inference (running queries) can consume as much or more energy than training. Google estimates that, of its total AI energy use, 60% goes to inference and 40% to training. This is because a trained model like ChatGPT serves millions of queries per day. By late 2023, ChatGPT had ~100 million weekly active users; one early analysis even claimed a single ChatGPT query might use ~100× the energy of a Google search – an order of magnitude above the Goldman Sachs figure cited earlier, which shows how widely such estimates vary. Either way, the electricity needed for inference is enormous and growing daily.
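A rough break-even calculation shows why inference dominates. Under illustrative assumptions – the ~2.9 Wh/query estimate above and a hypothetical 10 million queries per day – serving overtakes GPT-3's one-time training cost within weeks:

```python
# How quickly does cumulative inference energy pass a one-time training cost?
# Assumptions are illustrative: ~2.9 Wh per query, 10 million queries/day,
# and GPT-3's reported ~1,287 MWh training cost.
training_wh = 1287 * 1e6      # 1,287 MWh expressed in Wh
wh_per_query = 2.9
queries_per_day = 10_000_000

daily_inference_wh = wh_per_query * queries_per_day
breakeven_days = training_wh / daily_inference_wh
print(f"Inference energy per day: ~{daily_inference_wh / 1e6:.0f} MWh")  # ~29 MWh
print(f"Days to match training energy: ~{breakeven_days:.0f}")          # ~44 days
```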
The upshot is that every jump to larger, more complex models drives much higher consumption. An MIT expert notes modern generative-AI training clusters “might consume seven or eight times more energy than a typical computing workload”. And with each advance in model size (GPT-3 → 4 → 5, multimodal models, etc.), the energy per training cycle climbs. This also stresses power grids: some forecasts expected North American data-center power demand (driven largely by AI) to roughly double from ~2,700 MW at the end of 2022 to ~5,300 MW by the end of 2023, a record surge.
E-Waste from Specialized Hardware
AI also exacerbates the electronics and e-waste problem. Data centers rapidly replace hardware to pack in the newest GPUs and TPUs, and companies upgrade devices to support AI workloads. A recent IEEE Spectrum analysis cites a Nature Communications paper projecting that generative-AI adoption alone could generate ~2.5 million metric tons of e-waste per year by 2030. By contrast, global e-waste in 2022 was ~62 million metric tons, so generative AI alone could add roughly 4% on top of today's total – a significant share.
This e-waste includes discarded GPUs, CPUs, memory modules, batteries, PCBs, and more. High-performance GPUs (like NVIDIA's A100 or Blackwell parts) have limited useful lifespans as workloads intensify. And unlike consumer electronics, which often get second lives through resale, retired server hardware typically goes straight to shredding and recycling. Because e-waste contains toxic metals and chemicals, this boom poses disposal and pollution risks. Analysts warn that awareness of AI's e-waste impact is “crucial for developing strategies to mitigate negative environmental impacts”.
Cooling and Water Use
Modern servers packed into dense racks generate enormous amounts of heat. Traditionally, data centers use chillers and cooling towers, which consume both electricity and water. As rack densities climb (from ~20 kW/rack to 100+ kW/rack in AI clusters), cooling needs grow in step. Many facilities also consume significant water in evaporative cooling or condensers.
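The standard metric for this overhead is PUE (power usage effectiveness): total facility power divided by IT power. A short example with illustrative loads shows how much the gap between a legacy site and an efficient hyperscale site matters at AI scale:

```python
# PUE = total facility power / IT power; the excess is mostly cooling
# and power distribution. Loads and PUE values here are illustrative.
it_load_mw = 10.0  # hypothetical IT load of an AI cluster

for label, pue in [("legacy site (PUE 1.6)", 1.6),
                   ("efficient hyperscale site (PUE 1.1)", 1.1)]:
    total_mw = it_load_mw * pue
    overhead_mw = total_mw - it_load_mw
    print(f"{label}: total {total_mw:.1f} MW, overhead {overhead_mw:.1f} MW")
```

For the same 10 MW of compute, the legacy site burns 5 MW more in overhead than the efficient one, which is why PUE targets appear in the policy recommendations below.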
For example, one quick estimate suggested a short ChatGPT conversation (10–50 exchanges) might use about 0.5 liters of fresh water for cooling. MIT researchers similarly note that data center cooling “draws from already stressed watershed areas” and can strain local water supplies. Liquid cooling can actually reduce water use (water can be recirculated or replaced by dielectric fluids), but most current centers still rely partly on water. Hence, AI’s growth increases both energy and water demands, further stressing environmental resources.
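Taking that ~0.5 L-per-conversation estimate at face value, the per-exchange arithmetic is easy to check (illustrative only; actual water use depends heavily on cooling design and climate):

```python
# Convert "~0.5 L per 10-50 exchange conversation" into per-exchange terms.
liters_per_conversation = 0.5

for exchanges in (10, 50):
    ml_each = liters_per_conversation * 1000 / exchanges
    print(f"{exchanges}-exchange conversation: ~{ml_each:.0f} mL per exchange")
# -> roughly 10-50 mL of fresh water per exchange
```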
Case Studies: Tech Giants and Green Data Centers
Google’s Impact and Efforts
Google (Alphabet) is a bellwether for this issue. In 2023, Google's own data-center electricity use soared, and its total greenhouse gas emissions rose 48% versus 2019. The company attributes the spike to expanding data-center operations (driven by AI and other services) and supply-chain emissions. Notably, Google had claimed carbon neutrality (via offsets) since 2007, but it has since disclosed it is no longer carbon-neutral in operations – a sign that offsets were lagging behind growth.
Google is still pushing its 2030 targets. Officially, Google aims to reach net-zero emissions by 2030 across its operations, including running 24/7 carbon-free energy in its data centers. (Currently, Google matches all its electricity use with renewables on an annual basis, but 24/7 procurement is an active goal.) Google also invests in AI to cut its footprint – for instance, using ML to optimize data-center cooling and deploying AI for efficiency projects. Nevertheless, recent reporting underscores that “as we further integrate AI into our products, reducing emissions may be challenging”.
Microsoft’s Carbon Challenge
Microsoft, which operates Azure (the cloud behind OpenAI), has similarly seen its emissions rise. Its 2023 sustainability report noted emissions up 29% since 2020, driven by building more AI-optimized data centers. In 2023 alone, Microsoft released 15.4 million metric tons of CO₂e (on a “market-based” basis) – about five times the annual emissions of Seattle. Most of Microsoft's footprint is Scope 3 (supply chain and infrastructure), but data centers are a big driver. Microsoft still plans to be carbon-negative by 2030 and to remove all its historical carbon by 2050, but near-term AI growth has made those goals harder to meet.
Water use is another concern: Microsoft's report shows water consumption jumped 87% since 2020 (to ~2.1 billion gallons in 2023), mostly for cooling servers. To counter this, Microsoft is investing in water-replenishment projects (putting back 3× as much water as it uses globally) and careful site selection. But some new data centers in arid areas (e.g. Arizona) have drawn criticism.
In summary, Google and Microsoft illustrate the tension: both have ambitious clean-energy commitments, yet rapidly expanding AI infrastructure is pushing their emissions higher. (OpenAI itself does not publicly report an emissions figure, but as an Azure customer it largely inherits Microsoft's carbon mix. Microsoft has said it has been carbon-neutral, via offsets, since 2012, but the carbon details of specific AI workloads remain opaque.)
Sustainable Data Center Projects
Some operators are proactively building “green” AI data hubs. For example, Iceland is attracting AI data center investment due to its abundant geothermal and hydro power and natural cooling. New projects (e.g. a Borealis/Modularity AI hub) will be 100% renewably powered (geothermal + hydroelectric), with cool ambient climate cutting cooling energy. As one executive notes, Iceland’s cool climate “provides an efficient solution for data center cooling, significantly reducing energy consumption and carbon footprint”.
In Norway, companies like Polar Data Center plan AI-optimized facilities running on 100% hydroelectric power. The Tørdal site will feature advanced liquid cooling and Tier III design, all on green energy. The broader Nordic region (Norway/Sweden/Finland) is popular because nearly all electricity comes from renewables (Norway's grid is roughly 98% renewable, mostly hydro) and cold ambient temperatures allow free cooling. Finland even hosts data centers in underground tunnels, using rock insulation and 100% green power.
These projects show how geography and design can slash AI’s footprint. By siting servers in cold, renewable-rich locales, operators can dramatically cut both power and cooling needs. Major hyperscalers are also experimenting: Microsoft, Google, and AWS now actively trial direct-to-chip liquid cooling in some sites, which not only lowers energy use but also can slash water use (one NVIDIA liquid-cooled platform claims 300× better water efficiency vs. air cooling).
Strategies for Sustainable AI
To rein in AI’s environmental toll, a multi-pronged approach is needed: smarter algorithms, better hardware, green power, and strong standards.
Actionable Recommendations
For AI Researchers: Embed sustainability early. Design task-specific models or use pruning/quantization to minimize waste. Track energy use with tools (e.g. CodeCarbon, MLCO₂, or Microsoft's Emissions Impact tools) and report it – a minimal CodeCarbon sketch follows this list. Include carbon or FLOP efficiency when publishing new models (“Green AI” benchmarks). Explore edge or federated approaches to reduce cloud reliance. Collaborate on benchmarks and platforms for energy metrics (Climatechange.ai and others are working on standards).
For Enterprises & Cloud Operators: Choose green data centers and providers. Contract with cloud regions certified for renewable power. Time-shift flexible workloads to hours when the grid is cleanest (carbon-aware scheduling – see the scheduling sketch after this list). Invest in on-site renewables or battery storage to smooth supply. Implement liquid or other efficient cooling, and recycle waste heat where possible. Track and report AI's energy use in sustainability reports. Purchase offsets judiciously, and prioritize internal energy reduction first.
For Policymakers and Regulators: Incentivize sustainable computing. Provide tax breaks or grants for renewable-powered data centers and efficient cooling tech. Set standards for data-center efficiency (e.g. ambitious PUE targets) and require disclosure of computing emissions (in company ESG reports or industry filings). Update procurement rules: governments should demand high efficiency and renewables in the AI services they use. Support R&D in green computing. The EU’s lead with the AI Act is one model: requiring transparency around AI energy use will pressure the industry to innovate.
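To make the researcher recommendation concrete, here is a minimal sketch of energy tracking with CodeCarbon (the open-source tool named above); the project name and training loop are placeholders:

```python
# Minimal CodeCarbon usage: wrap a training run in an EmissionsTracker
# and log the estimated footprint. Install with: pip install codecarbon
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
try:
    # ... your training loop goes here ...
    pass
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg CO2-equivalent
    print(f"Estimated emissions: {emissions_kg:.4f} kg CO2eq")
```

And a sketch of carbon-aware scheduling for the enterprise recommendation. Note that get_forecast() is a hypothetical stand-in for a real carbon-intensity API such as WattTime or Electricity Maps, and run_job() is whatever flexible workload you want to defer:

```python
# Carbon-aware scheduling sketch: run a flexible batch job at the
# forecast hour with the lowest grid carbon intensity.
from typing import Callable

def get_forecast() -> dict[int, float]:
    """Hypothetical stand-in: hour of day -> forecast gCO2/kWh."""
    # Toy forecast: cleaner midday hours (solar), dirtier overnight.
    return {h: 450.0 - 200.0 * (6 <= h <= 16) for h in range(24)}

def schedule_carbon_aware(run_job: Callable[[], None]) -> int:
    forecast = get_forecast()
    best_hour = min(forecast, key=forecast.get)  # lowest-carbon hour
    print(f"Deferring job to hour {best_hour} (~{forecast[best_hour]:.0f} gCO2/kWh)")
    run_job()  # in production, enqueue for best_hour instead of running now
    return best_hour

schedule_carbon_aware(lambda: print("flexible training batch running..."))
```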
In summary, the AI revolution need not forgo environmental responsibility. By combining algorithmic savvy, cutting-edge hardware and cooling, and a shift to clean power, we can reap AI's benefits while minimizing its footprint. The time to act is now: as AI's energy and carbon tally mounts, every efficiency gain or megawatt of renewables deployed directly cuts future emissions. With coordinated effort from researchers, companies, and governments, the path to sustainable AI is clear.