It's cheap and easy to make fun of the lesswrong community as a cringy cult of AI-obsessed neckbeards. And to be fair, the writing style on LW tends to support that impression. But I've found that most of the actual people within the rationality/AI safety/effective altruism communities don't fit that stereotype at all.
I consider EA separate but related, and it definitely qualifies as staking out a superior position within the constraints of liberal morality.
But then, as a lesswrongy person, maybe it's me, maybe "feeling superior" or whatever is just normal to me :shrug:
(But my actual theory is, it's probably just geographical.)
As a specific example, I made a comment to my roommate last winter about how I thought his girlfriend's hyper-caution around COVID was limiting my personal freedom. I realized that I had been crass and apologized to him, but he told his girlfriend anyway, and it caused a great deal of tension between the three of us. My own girlfriend told me I should apologize to her. I believed I had nothing to apologize for: I hadn't said anything to her directly, I didn't think my roommate should have repeated the comment to her in the first place, and I had already apologized to him for it. My girlfriend gave me reasons why an apology was in order, though, and I assigned a lot of weight to her reasoning since I know her to be a more sensitive and emotionally intelligent person than I am. I was able to let go of the belief in my own righteousness and write a heartfelt apology, which did wonders to mend the relationship.
A previous version of me would have clung to the belief that I was in the right, and either not apologized or written a half-assed apology that would have done nothing to fix the situation. The current version of me, which strives to be rational, was aware of my own biases, recognized that my internal map might not match the territory, and was willing to update based on the evidence from my girlfriend's greater authority on emotional matters.