Large AI models have tight resource requirements. You physically can't run an X-billion-parameter model without roughly X billion bytes of memory at 8-bit precision (double that for fp16), before even counting activations or the KV cache.
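A quick back-of-the-envelope sketch of that parameter-to-memory relationship (the bytes-per-parameter figures are standard precision sizes, not from the original comment):

```python
# Approximate bytes per parameter at common precisions (assumption: weights only).
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def weight_memory_gb(params_billions: float, precision: str = "fp16") -> float:
    """Rough GB needed just to hold the weights.

    Ignores activations, the KV cache, and runtime overhead, so real
    deployments need noticeably more than this.
    """
    return params_billions * BYTES_PER_PARAM[precision]

# A 70B-parameter model: ~70 GB at int8, ~140 GB at fp16.
print(weight_memory_gb(70, "int8"))   # 70.0
print(weight_memory_gb(70, "fp16"))   # 140.0
```

This is why quantization matters so much at the edge tier: halving bytes per parameter halves the memory floor.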
It makes complete sense to have these 3 tiers: a max-capability option, a price-performance option for scaling, and an edge/on-device compute option.