All links

AI uses too much energy

If ChatGPT were integrated into the 9 billion searches done each day, the IEA says, the electricity demand would increase by 10 terawatt-hours a year — the amount consumed by about 1.5 million European Union residents.

(From: AI already uses as much energy as a small country. It’s only the beginning. - Vox)

This is from an interview with Sasha Luccioni, climate researcher at Hugging Face. In it, she explains what the power and water consumption of AI, specifically LLMs, looks like today. It's bad: the amount of energy required is enormous. One example in the post is that a query to an LLM costs almost 10 times as much energy as a query to a regular search engine. That's unsustainable, even if we managed to run it all on 100% renewable energy and water we didn't need for anything else.

Once again, this raises the question of whether we really need all the AI applications companies are rushing into their products. Often they're completely unnecessary.

It reminds me of eating animals. With all we know about animal welfare and climate impact, we have to consider whether (regularly) eating animals has benefits that outweigh those downsides.

Everyone can choose to do whatever they want with the information available to them, as a person or as a company. But if you're deciding for a company, the impact is larger: it's the decision times the number of users. For me it's increasingly clear that I don't want to use these “AI” solutions in my personal workflows, or suggest in my talks that we might as well use them, let alone push for integrating them into the products I work on.