I wouldn't even be surprised if they were losing money on paying ChatGPT users on inference compute alone, and that isn't even factoring in the development of new models.
There was an interesting article (can't find the link, unfortunately) arguing that model training costs should be accounted for as operating costs rather than capital investments: last year's model is essentially a total write-off, and to stay competitive an AI company has to keep training new frontier models more or less continuously.
Really their biggest risk is total compute costs falling too quickly or poor management.
I'd be surprised if that were the case. How many tokens is the average user going through? I'd be surprised if the average user even hit 1M tokens, much less 20M.
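For scale, a back-of-envelope sketch. The blended rate below is a made-up round number purely for illustration, not OpenAI's actual pricing (real 4o rates differ for input vs. output tokens and change over time):

```python
# Back-of-envelope: how many tokens per month would a $20 subscription
# buy at API rates? The blended rate is a hypothetical round number.
SUBSCRIPTION_USD = 20.0
BLENDED_RATE_USD_PER_MTOK = 5.0  # assumed $/1M tokens, illustrative only

def breakeven_tokens(subscription=SUBSCRIPTION_USD,
                     rate_per_mtok=BLENDED_RATE_USD_PER_MTOK):
    """Monthly token count at which API-rate cost equals the subscription."""
    return subscription / rate_per_mtok * 1_000_000

print(f"{breakeven_tokens():,.0f} tokens/month")  # 4,000,000 tokens/month
```

So at that assumed rate, a user would need to burn through millions of tokens a month before the subscription price stopped covering them, which is the crux of the question above.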
Even for regular old 4o: You’re comparing to their API rates here, which might or might not cover their compute cost.
Voice mode is around $0.25 per minute via the API. I don't use it that much, but even 3 minutes a day would already exceed the cost of a ChatGPT Plus subscription by quite a bit.
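The arithmetic, using the ~$0.25/min figure above and the $20/month Plus price, over an assumed 30-day month:

```python
# Monthly voice-mode cost at API rates vs. the Plus subscription price.
VOICE_USD_PER_MIN = 0.25   # per-minute API rate quoted above
PLUS_USD_PER_MONTH = 20.0  # ChatGPT Plus subscription price

def monthly_voice_cost(minutes_per_day, days=30):
    """API-rate cost of using voice mode for a given daily duration."""
    return minutes_per_day * days * VOICE_USD_PER_MIN

cost = monthly_voice_cost(3)
print(cost, cost > PLUS_USD_PER_MONTH)  # 22.5 True
```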