AI chatbots like OpenAI’s ChatGPT and Google’s Bard consume enormous amounts of electricity and water, or, more precisely, the huge data centers that power them do. And according to the latest estimates, those energy demands are growing fast.
A recent analysis by data scientist Alex de Vries at Vrije Universiteit Amsterdam in the Netherlands found that by 2027, these server farms could be gobbling up anywhere between 85 and 134 terawatt-hours of energy per year.
That’s roughly the annual electricity consumption of Argentina, the Netherlands, or Sweden, or about 0.5% of the world’s total. Sound familiar? The widely mocked crypto industry has reached similar levels of power consumption in recent years.
It’s an enormous carbon footprint that experts say should make us rethink the massive investments being poured into AI, and the extraordinarily resource-intensive methods employed by tech giants like OpenAI and Google.
Pinpointing an exact figure is tricky, since companies like OpenAI are notoriously secretive about how much energy they use. So de Vries had to estimate instead, working from sales figures for Nvidia A100 servers, which are thought to make up around 95% of the AI industry’s infrastructure.
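A back-of-envelope sketch shows how an estimate like this can be built from server sales alone. The specific inputs below are illustrative assumptions, not de Vries’s exact figures: roughly 1.5 million AI servers shipped per year by 2027, and about 6.5 kW per server (the listed maximum draw of an Nvidia DGX A100 system), running continuously.

```python
# Back-of-envelope estimate: AI energy demand from server shipments.
# All inputs are assumptions for illustration, not published figures.
servers = 1_500_000        # assumed annual AI server shipments by 2027
power_kw = 6.5             # assumed per-server draw (Nvidia DGX A100 max, kW)
hours_per_year = 24 * 365  # 8,760 hours, assuming continuous operation

# kW * hours = kWh; divide by 1e9 to convert kWh to terawatt-hours
energy_twh = servers * power_kw * hours_per_year / 1e9
print(f"{energy_twh:.0f} TWh/year")  # ~85 TWh, the low end of the range
```

Varying the server count and per-server power is what produces a range like 85 to 134 TWh rather than a single number.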
Many companies, especially in California, may face pushback sooner than you’d expect. Governor Gavin Newsom of California recently signed two significant climate disclosure laws.
That means companies like OpenAI and Google, along with around 10,000 other firms, will have to disclose their carbon emissions by 2026.
It’s a concerning trajectory, one that is prompting some experts to suggest we pause and reassess the industry’s direction.
References: Futurism, Joule, The New York Times, CNBC