The Honest Climate Case for AI

15 points by darccio


adrien

"Honest"

The article doesn't state anything negative, and it leaves changing the behaviour/rules/... up to everybody except the companies and their users.

Out of the three positive cases it mentions, two are old or very old and are certainly far from the technology that is currently facing criticism. Even the third seems pretty different, though it's at least quite recent.

I'm not saying the content of the article is wrong (computing power usage is dwarfed by several other fields), but the article is certainly not "honest".

fleebee

Stop feeling guilty about prompts. Your Wh per query is not the lever that matters. You'll do more climate good by eating one less steak, taking one fewer flight, or voting for better energy policy than by boycotting LLMs.

First off, these things aren't mutually exclusive, so there's no reason you couldn't do all of them.

But I also take issue with this view where (non-)usage of LLMs supposedly doesn't matter. By rejecting LLMs we can signal to the industry that maybe they don't need to build so many new data centers. I don't think you should treat the industry's trajectory as an inevitability.

In that light, I think focusing solely on the inference costs is misleading. The costs of training and data center construction should be factored in. Although, to be honest, I don't even agree that the inference costs on their own are negligible: according to the post, newer models may require 10 to 100 times more energy per query, and agentic flows can trigger an unbounded number of requests.

Any industry that uses about 1.5% of the world's electricity, like air conditioning or industrial motors, shouldn't be a cause for concern.

Is that supposed to sound like a small amount of energy for an industry in its infancy?

Yogthos

I'd also argue that we're basically in the mainframe era of this tech. We've seen this story many times before where new technology starts out needing big data centres to operate, then over time people learn how to optimize it and it moves to edge devices. I don't see any reason why this tech should be any different. We're still in the very early days and it's silly to assume that energy requirements are going to stay roughly constant going forward. We've already seen a huge improvement in efficiency where models you can run on your laptop outperform frontier models that needed a whole data centre just a few years ago. That's the trend that's important to keep in mind when projecting what we can expect in the future.

The other argument, of course, is that as this tech becomes more efficient, demand will grow, negating the efficiency gains. But that's just a question of whether people find this tech genuinely useful or not. If usage grows, that implies people are finding reasons to use it because it solves some problem for them. It's no different from any other technology in that regard.