I don’t see how anyone could now doubt GenAI has hit a wall

OpenAI launched GPT-4.5 yesterday (Sam Altman’s post on X announcing the GPT-4.5 release) – a model they’ve spent two years and a fortune training. Initial impressions? Slightly better at some things, but noticeably worse at others (Ethan Mollick’s first impressions of GPT-4.5), and it’s eye-wateringly expensive – 30 times the cost of GPT-4o and 5 times more than their high-end o1 reasoning model (OpenAI’s API pricing table).
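
To get a rough feel for what those multiples mean in practice, here’s a minimal sketch. The per-token prices are assumptions based on OpenAI’s public API pricing page at the time of writing, so treat the exact figures as illustrative rather than authoritative.

```python
# Back-of-envelope API cost comparison.
# Prices are USD per 1M input tokens, assumed from OpenAI's public
# pricing page at the time of writing -- illustrative only.
PRICE_PER_1M_INPUT_TOKENS = {
    "gpt-4o": 2.50,
    "o1": 15.00,
    "gpt-4.5": 75.00,
}

def monthly_cost(model: str, tokens_per_month: int) -> float:
    """Estimated monthly spend on input tokens for a given model."""
    return PRICE_PER_1M_INPUT_TOKENS[model] * tokens_per_month / 1_000_000

# A hypothetical app processing 500M input tokens a month:
for model in ("gpt-4o", "o1", "gpt-4.5"):
    print(f"{model:8s} ${monthly_cost(model, 500_000_000):>10,.2f}")

# The headline multiples fall straight out of the price ratios:
print(PRICE_PER_1M_INPUT_TOKENS["gpt-4.5"] / PRICE_PER_1M_INPUT_TOKENS["gpt-4o"])  # ~30x
print(PRICE_PER_1M_INPUT_TOKENS["gpt-4.5"] / PRICE_PER_1M_INPUT_TOKENS["o1"])      # ~5x
```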

This follows xAI’s recent release of Grok 3 – only marginally better than most (though not all) existing high-end models, despite, again, billions spent on training.

Then there’s Anthropic’s recently released Claude 3.7 Sonnet “hybrid reasoning” model. Supposedly tuned for coding, yet developers in the Cursor subreddit are saying it’s *worse* than Claude 3.5 Sonnet.

What makes all this even more significant is how much money has been thrown at these next-gen models. Anthropic, OpenAI, and xAI have collectively spent hundreds of billions of dollars over the past few years – orders of magnitude more than was ever spent on models like GPT-4. Despite these astronomical budgets, the gains have been marginal, often with significant trade-offs (especially cost). Nothing like the big leaps seen between GPT-3, GPT-3.5, and GPT-4.

This slowdown was predicted by many – not least a Bloomberg article late last year highlighting how all the major GenAI players were struggling with their next-gen models (whoever wrote that piece clearly had good sources).

It’s becoming clear that this is likely as good as it’s going to get. That’s why OpenAI is shifting focus – “GPT-5” isn’t a new model, it’s a product (Sam Altman’s post on X on the OpenAI roadmap).

If we have reached the peak, what’s left is a long, slow reality check. The key question now is whether there’s a viable commercial model for GenAI at roughly today’s level of capability. GenAI remains enormously expensive to run, with all major providers operating at huge losses. The Nvidia GPUs used to train these models cost around $20k each, and thousands of them are needed for a single training run. It could take years – possibly decades – for hardware costs to fall enough to make the economics sustainable.
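
To put that hardware claim in perspective, here’s a minimal back-of-envelope sketch. The cluster size, depreciation period, and running costs are assumptions for illustration only, not figures reported by any provider.

```python
# Rough training-hardware economics. All figures below are assumptions
# for illustration, not reported numbers from any AI lab.
gpu_unit_cost = 20_000              # ~USD per data-centre Nvidia GPU (as cited above)
gpus_in_cluster = 10_000            # assumed cluster size for a frontier training run
depreciation_years = 4              # assumed useful life of the hardware
power_and_hosting_per_gpu = 5_000   # assumed annual power/cooling/hosting per GPU (USD)

capex = gpu_unit_cost * gpus_in_cluster
annual_opex = power_and_hosting_per_gpu * gpus_in_cluster
annual_cost = capex / depreciation_years + annual_opex

print(f"Up-front hardware:   ${capex:,.0f}")
print(f"Annualised hardware: ${capex / depreciation_years:,.0f}")
print(f"Annual running cost: ${annual_cost:,.0f}")
# With these assumptions, the cluster alone costs on the order of $100M a year --
# revenue has to clear that before a single training run pays for itself.
```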
