The financials here are so ugly: you have to light truckloads of money on fire forever just to jog in place.
It may be like looking at early Google and saying they're spending loads on compute and haven't even figured out how to monetize search, so the investors are doomed.
Oh, I'd love to get a cheap H100! Where can I find one? You'll find a used one costs almost as much as a new one.
At some point the AI becomes good enough, and if you're not sitting in a chair at the time, you're not going to be the next Google.
In practice that hasn't borne out. You can download and run open-weight models now that are within spitting distance of state-of-the-art, and open-weight models are at most a few months behind the proprietary stuff.
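For anyone who hasn't tried it, "download and run" really is about this simple now. A minimal sketch using the Hugging Face `transformers` library; the model name is just an example of an open-weight checkpoint, and you'll need a GPU with enough VRAM (plus `accelerate` installed for `device_map="auto"`):

```python
# Minimal local inference with an open-weight model via Hugging Face
# transformers. The model name is illustrative; any open-weight causal LM
# you have access to works the same way.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mistralai/Mistral-7B-Instruct-v0.2"  # example open-weight model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, device_map="auto")

prompt = "Explain why the sky is blue in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```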
And even within the realm of proprietary models no player can maintain a lead. Any advances are rapidly matched by the other players.
More likely at some point the AI becomes "good enough"... and every single player gets a "good enough" AI shortly thereafter. There doesn't seem to be a scenario where any player can afford to stop setting cash on fire and start making money.
Why?
I don't see why these companies can't just stop training at some point. Unless you're saying the cost of inference is unsustainable?
I can envision a future where ChatGPT stops getting new SOTA models, and all future models are built for enterprise or people willing to pay a lot of money for high ROI use cases.
We don't need better models for the vast majority of chats taking place today. E.g. kids using it for help with homework - are today's models really not good enough?
Because training isn't just about making brand-new models with better capabilities; it's also about updating old models to stay current with new information. Even the most sophisticated present-day model with a knowledge cutoff of 2025 would be severely crippled by 2027 and utterly useless by 2030.
Unless there is some breakthrough that lets existing models cheaply and incrementally update their weights to add new information, I don't see any way around this.
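The closest thing today is parameter-efficient fine-tuning, which trains only a small adapter on top of frozen weights. A hedged sketch with the `peft` library (model name illustrative); this makes the update cheap, but nobody has shown it reliably injects new world knowledge, which is exactly the gap:

```python
# Sketch of a cheap incremental weight update via LoRA (peft library).
# Only small low-rank adapter matrices are trained; the base model stays frozen.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")  # illustrative
config = LoraConfig(
    r=8,                                  # rank of the adapter matrices
    lora_alpha=16,                        # adapter scaling factor
    target_modules=["q_proj", "v_proj"],  # attention projections to adapt
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # typically well under 1% of all weights
# ...then fine-tune `model` on new data with a standard training loop.
```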