LLMs are cheap

24 points by jsnell


simonw

My favorite way to illustrate this point is to highlight how much it would cost to use a vision LLM to generate descriptions of all 70,000 photos in my personal photo library.

With Gemini 1.5 Flash 8B - the cheapest Gemini vision model - that cost for all 70,000 photos comes to approximately $1.70. That’s not a typo: it really would cost less than $2.
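A back-of-envelope version of that estimate is easy to check. The token counts and prices below are assumptions for illustration, not figures from the comment: Gemini 1.5 Flash-8B list pricing of $0.0375 per 1M input tokens and $0.15 per 1M output tokens, roughly 258 input tokens billed per image, and descriptions of about 100 output tokens each.

```python
# Back-of-envelope cost estimate for captioning a photo library with a
# cheap vision model. All per-token prices and token counts here are
# illustrative assumptions, not quoted from the original comment.

PHOTOS = 70_000
INPUT_TOKENS_PER_PHOTO = 258    # assumed per-image input token charge
OUTPUT_TOKENS_PER_PHOTO = 100   # assumed length of each description
INPUT_PRICE_PER_M = 0.0375      # USD per 1M input tokens (assumed list price)
OUTPUT_PRICE_PER_M = 0.15       # USD per 1M output tokens (assumed list price)

input_cost = PHOTOS * INPUT_TOKENS_PER_PHOTO / 1_000_000 * INPUT_PRICE_PER_M
output_cost = PHOTOS * OUTPUT_TOKENS_PER_PHOTO / 1_000_000 * OUTPUT_PRICE_PER_M
total = input_cost + output_cost

print(f"input:  ${input_cost:.2f}")
print(f"output: ${output_cost:.2f}")
print(f"total:  ${total:.2f}")
```

Under those assumptions the total lands in the same ballpark as the $1.70 figure quoted above, and well under $2.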

I’ve heard from someone I trust at Google that they aren’t operating their Gemini models at a net loss per prompt processed.

It’s rare for me to find any prompt that costs more than a cent to run against the models I frequently use. Most of the API prompts I run cost 1/10th of a cent or less.

I’ve published a bunch of notes on llm-pricing, and I also maintain this pricing calculator tool.