This tendency towards maximum comfort reminds me of many things, food delivery apps among them: they provide a near-seamless experience, at the cost of the small encounters with other humans that may be really important for a happy life.
"Oh, she says, well, you’re not a poor man. You know, why don’t you go online and buy a hundred envelopes and put them in the closet? And so I pretend not to hear her. And go out to get an envelope because I’m going to have a hell of a good time in the process of buying one envelope. I meet a lot of people. And see some great looking babies. And a fire engine goes by. And I give them the thumbs up. And I’ll ask a woman what kind of dog that is. And, and I don’t know. The moral of the story is - we’re here on Earth to fart around. And, of course, the computers will do us out of that. And what the computer people don’t realize, or they don’t care, is we’re dancing animals. You know, we love to move around. And it’s like we’re not supposed to dance at all anymore."
At the same time, we're talking about AI essentially replacing all relationships. Part of the value of a therapist is that they're not perfect; they emulate some of the relationships you have in real life. If AI ends up being the perfect therapist, then it's not preparing you for real life, it's sort of preparing you for narcissism.
I feel like this touches on something important about the human condition, while simultaneously sounding like the old "Back in my day, we talked to people, not robots."