Which also raises the question: how much money would Apple save by using Nvidia for everything? Probably not much, since they don't pay margins on Apple Silicon they build for themselves. But I suspect there's a real monetary cost to brute-forcing an Nvidia-scale server network with weaker hardware.
They said the models can scale out to "Private Cloud Compute" running on Apple Silicon, and that your device will verify the servers are running "publicly verifiable software" to guarantee your data isn't misused.
I wonder if their server-side code will be open source. That'd be a pleasant surprise. Curious to see how this evolves.
Anyway, overall it looks really cool. If it works as marketed, it'll be an easy "shut up and take my money".