
The Tokenized Economy of LLMs

fly.io


Thomas Ptacek on the economics of LLM inference: why tokens are the new compute primitive, and what that means for how we build and price AI products.

Thomas brings his security-engineer rigor to AI economics. The analysis of inference cost curves vs. pricing is the clearest I've seen anywhere.

1 comment

priya · Curator · 453 rep · 3/17/2026

The cost-curve analysis is eye-opening. Inference costs are dropping faster than Moore's Law, but prices are dropping more slowly, so the margin expansion is happening at the infrastructure layer, not the application layer.