
2023-10-13

Will AI send our energy consumption soaring?

New research reveals the alarming energy footprint of artificial intelligence systems, especially large language models (LLMs). Some models already consume hundreds of megawatt-hours every day, and projections for the sector as a whole approach the annual electricity usage of entire countries.

Data centers have always had high energy needs, spurring efficiency optimizations such as natural cooling. But LLMs like ChatGPT are far more demanding: a single query is estimated to use roughly ten times the energy of a conventional search-engine query.

One ChatGPT query is estimated to consume 2.9 Wh of electricity. At current usage volumes, that works out to roughly 564 MWh per day, the equivalent of 52 US households' annual consumption, every single day. Training is even more intensive: models like GPT-3 and Gopher each required on the order of a gigawatt-hour.
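As a sanity check, here is a minimal sketch of that arithmetic in Python. The figure of roughly 195 million requests per day is an assumption (approximately the volume the underlying research works from), and the household baseline of about 10.6 MWh per year is a typical US figure; neither number is stated above.

```python
# Back-of-the-envelope check of the article's ChatGPT energy figures.

WH_PER_QUERY = 2.9                 # estimated energy per ChatGPT request (Wh)
REQUESTS_PER_DAY = 195e6           # assumed daily request volume
US_HOUSEHOLD_MWH_PER_YEAR = 10.6   # typical annual US household usage (MWh)

daily_mwh = WH_PER_QUERY * REQUESTS_PER_DAY / 1e6   # convert Wh to MWh
households = daily_mwh / US_HOUSEHOLD_MWH_PER_YEAR  # annual-equivalents per day

print(f"Daily usage: {daily_mwh:.0f} MWh")              # ~565 MWh
print(f"Households (annual equivalent): {households:.0f}")  # ~53
```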

Projections suggest AI could consume 85-134 terawatt-hours annually by 2027, on par with the yearly electricity consumption of the Netherlands, Argentina, Sweden, or Ireland. Broader adoption and ever more powerful models are driving the spike.
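To put 85-134 TWh in perspective, a rough calculation of its share of world electricity use, assuming global consumption of about 25,000 TWh per year (an outside figure, not from the article):

```python
# Projected AI electricity use as a share of assumed global consumption.

GLOBAL_TWH_PER_YEAR = 25_000   # assumed world electricity use (TWh/year)
low, high = 85, 134            # projected AI consumption by 2027 (TWh/year)

print(f"Share of global electricity: "
      f"{low / GLOBAL_TWH_PER_YEAR:.2%} to {high / GLOBAL_TWH_PER_YEAR:.2%}")
# ~0.34% to ~0.54%
```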

Supply-chain constraints currently limit AI server deployment, but production is forecast to catch up just as demand surges and AI permeates business and society.

On top of existing concerns such as data privacy and job displacement, AI's projected energy appetite raises new questions about sustainability and responsibility.

With climate change intensifying, AI's environmental impact will come under greater scrutiny. More efficient models and hardware, and cleaner energy sources, will be imperative.

This research highlights the importance of holistic AI progress encompassing ethics, regulation, and social responsibility alongside rapid technological advances.
