I think something like this could make sense as a place to run large machine learning models locally. I don't need a server to run my browser or my apps; they run just fine locally. But a server to run large language models, Stable Diffusion, Whisper voice recognition, etc. would be useful, as these types of models can run much faster, at higher quality, on a beefy GPU than they ever will on a phone.
The endgame of these models is an agent that knows practically everything about me and can perform tasks on my behalf. I would much rather have that agent live in my house, running on hardware I own, than in a data center under someone else's control.