Another use case is multiparty computation. Three people wish to compare some values without the risk of anyone seeing the combined data set. TC can do this with tractable compute overhead, unlike purely cryptographic techniques.
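As a rough illustration of what the enclave buys you here, the sketch below models the attested hardware as a trusted object: each party submits a secret, and only the agreed-upon answer ever leaves. The `Enclave` class and its methods are purely illustrative, not a real TC API; in practice this logic would run inside attested code that the parties verify first.

```python
class Enclave:
    """Stand-in for attested TC hardware: secrets go in,
    only the agreed-upon result comes out."""
    def __init__(self):
        self._inputs = {}

    def submit(self, party, value):
        # Each party uploads its secret over an attested channel.
        self._inputs[party] = value

    def highest_paid(self):
        # Only the aggregate answer is released; the individual
        # values never leave the enclave.
        return max(self._inputs, key=self._inputs.get)

enclave = Enclave()
enclave.submit("alice", 70_000)
enclave.submit("bob", 85_000)
enclave.submit("carol", 62_000)
print(enclave.highest_paid())  # reveals only the winner's name
```

A purely cryptographic protocol (garbled circuits, homomorphic encryption) achieves the same property, but at orders of magnitude more compute; the TC version is just ordinary code running in a verified environment.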
Observe what this means for P2P applications. A major difficulty in building them is that peers can't trust each other, so you have to rely on complex and unintuitive algorithms (e.g. blockchains), duplication of work (e.g. SETI@Home), or benign dictators (e.g. Tor) to try to stop cheating. With TC, peers can attest to each other and form a network with known behavior, meaning devs can add features rather than spend all their time designing around complicated attacks.
These uses require that you have a computer you do trust, one which can audit the remote server before you upload data to it. But you can compile and/or run that program on your own laptop or smartphone; the verification process is easy.
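The client-side check can be sketched roughly as follows: compile or audit the program yourself, hash it, and compare that hash against the measurement reported in the server's attestation. This is a simplification, assuming the signature chain back to the CPU vendor has already been validated; the function names are illustrative, not a real attestation API.

```python
import hashlib

def measure(program_bytes: bytes) -> str:
    # A "measurement" is just a cryptographic hash of the code.
    return hashlib.sha256(program_bytes).hexdigest()

def safe_to_upload(attested_measurement: str, audited_binary: bytes) -> bool:
    # Hash the program you built/audited locally and compare it with
    # what the remote hardware claims to be running.
    return attested_measurement == measure(audited_binary)

binary = b"bytes of the program you audited and built locally"
claimed = measure(binary)                  # what an honest server reports
print(safe_to_upload(claimed, binary))     # True: safe to upload
print(safe_to_upload("deadbeef", binary))  # False: server runs other code
```

The point is that the expensive part (auditing the source) happens once, on a machine you control; the per-connection check is a single hash comparison.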
But precisely because TC is general, it doesn't distinguish based on who owns the machine. It doesn't see your PC as morally superior to a remote server; they're all just computers. So yes, in theory a remote server could demand that you run something locally and then perform a hardware remote attestation on it. In practice, though, this never happens anymore outside of games consoles (as far as I'm aware), because most consumer devices don't have the right hardware for it, and even if they did, you can't do much hardware interaction inside attested code.