AI's Rising Electricity Demand Raises Concerns, But Improvements in Efficiency Could Help
- Recent interest in AI models such as ChatGPT has raised concerns about surging electricity use for AI processing.
- Operating trained AI models (inference) may account for a significant share of lifecycle energy costs, not just model training.
- In a worst-case scenario, adding AI to every Google search could consume as much electricity annually as Ireland (roughly 29 TWh per year).
- More realistic estimates put AI server electricity use at 5.7–8.9 TWh in 2023, still a minor share of total datacenter consumption.
- Improvements in model efficiency may be offset by wider AI adoption, limiting any net reduction in electricity consumption.
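To see where a headline figure like the Ireland comparison can come from, here is a rough back-of-the-envelope sketch. The server count and per-server power draw below are illustrative assumptions (in the ballpark of figures circulated in public estimates), not numbers stated in this article:

```python
# Rough sketch: deriving a worst-case annual electricity estimate for
# running AI inference on every Google search. The inputs are
# illustrative assumptions, not figures from the article.

NUM_SERVERS = 512_821        # assumed number of AI inference servers (illustrative)
POWER_PER_SERVER_KW = 6.5    # assumed average power draw per server, in kW

daily_kwh = NUM_SERVERS * POWER_PER_SERVER_KW * 24   # energy per day, kWh
annual_twh = daily_kwh * 365 / 1e9                   # convert kWh/year to TWh/year

print(f"Estimated annual consumption: {annual_twh:.1f} TWh")
# With these assumed inputs, the result lands near the ~29 TWh worst case.
```

The point of the exercise is that the estimate is dominated by two uncertain inputs (how many servers would be needed and how much power each draws), which is why more realistic 2023 estimates come out an order of magnitude lower.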