>You can't average out the userbase like that because the individual usage of the service varies wildly
Yes, you can; this is how Meta, Google, et al. report their numbers. Obviously I'm not expecting each user to bring in exactly $8. The point is that the value they need to extract from each free user to be profitable is very small and very achievable. You, and many people here, have completely incorrect notions of how expensive inference is. Inference is cheap, and has been for some time now.
>and advertising revenue is directly tied to amount of usage.
OpenAI, with 800M weekly active users, processes 2.6B messages per day. Google, with ~5 billion users, processes ~14 billion searches per day. That works out to only a few queries per user per day in both cases.
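To make that concrete, here's the back-of-envelope division using only the figures quoted above (taken at face value, not independently verified):

```python
# Figures as quoted: OpenAI volume vs. Google search volume.
openai_messages_per_day = 2.6e9
openai_weekly_active_users = 800e6
google_searches_per_day = 14e9
google_users = 5e9

# Average daily volume per user (a rough sketch; it mixes daily message
# counts with weekly active users, so the true per-active-user number
# on a given day would be somewhat higher).
openai_per_user = openai_messages_per_day / openai_weekly_active_users
google_per_user = google_searches_per_day / google_users

print(f"OpenAI: ~{openai_per_user:.2f} messages per user per day")
print(f"Google: ~{google_per_user:.2f} searches per user per day")
# → ~3.25 and ~2.80: average per-user volume is small for both
```

Either way you slice it, the average user generates a handful of queries a day, which is the scale the ad-revenue math has to cover.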
>This too is skewed by averaging with users who barely use the service.
No, it's not. Inference is just not that expensive; model costs have crashed by several orders of magnitude in the last few years. Sure, in 2020 this would have been a very serious concern. In 2025, it just isn't.