What's wrong with this blog post is that the author does not seem to have read any books at all on the subject, which IMO is fairly irresponsible considering how much there is to read about it and how critical it is that "cult behavior" be widely understood for society to progress. It reads as off-the-cuff musings about the topic. Alarmingly, it even suggests that some "cult leaders" are "good" (no), or that they can be "reformed" (irrelevant, but also REALLY doubtful given what it means to be a cult leader; cult leaders generally remain leaders well after they've been sentenced to decades in prison, their followers show up to visit, etc. There is no "reform" here, sorry). Overall it's a lot of muddy, uninformed, made-up thinking that will only get more people into cults.
I imagine we've all had the experience of meeting such magnetic personalities, but probably only a tiny percentage have met an actual cult leader.
I think that most of the 'great leaders' (the good ones) we look up to flirt with this mindset for periods of their lives.
Not all cult leaders maintain their disposition after being arrested.
The author was fairly concise actually in identifying core attributes, and especially with the meta cognition bit.
I still find them fascinating, because they tend to exist in interesting environments. It's like going to Africa, but not wanting to see any lions.
Would you mind sharing what you found most interesting while reading the books you mentioned?
However, it's not true that cult leaders are always grandiose. They can be strategically vulnerable too!
When confronted -- especially by someone too smart to be fooled by their facade -- they can take that critic aside, in a private setting, and suddenly express vulnerability, maybe even fragility. The critic is taken aback, maybe even worries that they've gone too far, but also feels honored to have become a confidant of the leader. The critic may now even feel compelled to help cover for the leader's failings!
But this, too, is an act. Remember, as Sasha said, such people are always being strategic, and strategic vulnerability is part of how they operate.
(1) All pathologies are healthy behaviors "gone wrong": wrong time, wrong place, or (mostly) wrong magnitude.
(2) I wondered for a very long time: what's the difference between being convinced and being manipulated? I stumbled across something on the 'net that finally gave an answer: manipulation begins with diminishment, and (bringing it back to cults) "isolation" is one of the most powerful forms of manipulation.
(3) Recently been thinking about "responsibility" in terms of the physics of a 'bounce' -
Drop a ball on sand, it goes thud; the force involved is the weight of the ball. Drop a bouncy ball on concrete, the force is ~2x (stopping the downward motion, then enough for the upward motion).
When someone comes to you and says, "This thing you did had this negative impact on [me/them/us]," and you reject any possible responsibility for it, that's a "bounce". It ends up looking like one of the forms of "pushing your reality onto others": "you do this to me", gaslighting, etc.
The force in both scenarios is exactly the same. In the first the force goes into displacing the sand while in the second - since the concrete is a rigid lattice - the force goes into deforming the ball, which causes it to bounce back due to its elasticity. Objects with no elasticity (like another piece of concrete) will not bounce.
Beware of physical metaphors.
AFAICT the impulse to bring the ball to rest is exactly the same. As you point out, either the sand or the ball's structure (or the ground's structure, such as a trampoline) absorbs it. After that, however, one ball remains at rest and the other accelerates back up, and the ground exerts the force producing that acceleration (keeping Newton's third law in mind). Note that energy scales with the square of speed, so if the ball returns to 80% of the original height, its rebound speed is sqrt(0.8) ≈ 89% of the impact speed. That means 100% of the impulse to bring it to rest (same in both situations), plus an additional ~89% to re-"throw" it. So a "thud" involves 100% of the impulse, and a "bounce" roughly 189%.
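If it helps, the impulse arithmetic is easy to check with a few lines of Python (a minimal sketch: the mass, drop height, and the function name `impulses` are all just illustration values I made up, not anything from the original metaphor):

```python
import math

def impulses(mass, drop_height, restitution, g=9.81):
    """Impulse delivered to the ground for a dead drop ("thud") vs. a bounce.

    restitution is the coefficient of restitution e (0 = thud, 1 = perfect
    bounce). The rebound height is e**2 * drop_height, since kinetic energy
    scales with the square of speed.
    """
    v_impact = math.sqrt(2 * g * drop_height)     # speed just before impact
    thud = mass * v_impact                        # momentum change: v -> 0
    bounce = mass * v_impact * (1 + restitution)  # v downward -> e*v upward
    return thud, bounce

# A ball that returns to 80% of its drop height has e = sqrt(0.8) ~ 0.894,
# so the bounce delivers ~1.89x the impulse of the thud, not 1.8x.
thud, bounce = impulses(mass=0.1, drop_height=1.0, restitution=math.sqrt(0.8))
print(round(bounce / thud, 3))  # ~1.894
```

The ratio depends only on the coefficient of restitution (1 + e), not on the mass or drop height, which is why the sand-vs-concrete comparison works at any scale.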
There are plenty of examples, from streamers to serious programmers, but the most common is probably the "benevolent dictator for life" personality behind numerous software projects.
Can you provide any examples from either of these?
I feel like you may be confusing "people with strong opinions" with "narcissistic cult leader."
I tend to be a "only true wisdom is in knowing you know nothing" sort of guy, but when I run into someone who isn't like that, who seems to have it all figured out and says it with conviction, it definitely pulls at me. It does one of these:
1. Makes me think that I've overcomplicated it
2. Makes me think I have to convince him he's wrong (which is very hard to do unless you're as laser-focused as he is and he's engaging in good faith)
3. Makes me think it's not worth the effort because of what the other two options are
If you follow 1 or 3, you're ceding the argument. 2 is hard to do well.
Reminds me of "One weird trick." Really they're two specimens of the same phenomenon. Some people essentially spam and clickbait the world with ads for themselves.
I think politics only works for people with these traits.
Hogan, J., Hogan, R., & Kaiser, R. B. (2011). Management derailment. In S. Zedeck (Ed.), APA Handbook of Industrial and Organizational Psychology (vol. 3, pp. 555–575). American Psychological Association.
Tourish, D. (2018). Dysfunctional leadership in corporations. In P. Garrard (Ed.), The leadership hubris epidemic: Biological roots and strategies for prevention (pp. 137–162). Palgrave Macmillan.
Tourish, D. (2013). The dark side of transformational leadership: A critical perspective. Routledge. Chapter 3: Coercive persuasion, power and corporate cults.
Alvesson, M., & Blom, M. (2019). Beyond leadership and followership: Working with a variety of modes of organizing. Organizational Dynamics, 48(1), 28–37.
However, as was pointed out in the post it is never just a cult leader, there are always those who are complicit, and additionally there are those who just want someone to follow, which is really what gives cult leaders their "power" or their "authority". This is where the concept of relational leadership is interesting:
Hughes, R. L., Ginnett, R. C., & Curphy, G. J. (2014). Leadership: Enhancing the lessons of experience. McGraw-Hill Higher Education. Chapter 1: What do we mean by leadership?
Haslam, S. A., & Reicher, S. D. (2016). Rethinking the psychology of leadership: From personal identity to social identity. Daedalus, 145(3), 21–34.
Cunliffe, A. L., & Eriksen, M. (2011). Relational leadership. Human Relations, 64(11), 1425–1449.
I don't think there is necessarily anything wrong with the traits described here; rather, the problem is the way certain people abuse their power once they have it. The burning question that bugs me in life is why so many humans just follow the leader, even when the leader is a total piece of shit.
That being said, I think the author is talking about cult of personality types. People with a personality that gears towards being the leaders of a cult, or getting a cult-like status. That is how it reads to me anyway.
I haven't met Elon, but the impression I got from videos is very different from Jobs. He has a cult following indeed, but is it really because of his magnetism or PUA tricks like in the article?
One thing that gets me about the way this author talks is it’s condescending. As though he’s an enlightened observer staring down over his spectacles at this type of person. (Like the final quote about cult leaders being mundane or boring…) Sorry to say, but that person you’re talking down on is probably way more clever and motivated than you are. He’s putting you in the bucket not the other way around!
But the thing that fascinates me is how personalities evolved in general. Like why do people have these set ways of being?
The apparent answer is that personalities evolved as a symbiotic trait. If I have a few people in my village: one’s an asshole to defend from enemies, one’s passionate and emotional to rear children, one’s a narcissist who wants to unite and lead us (under him), one’s a thinker who will improve our hunting and killing tools, and a few people just don’t think that much and view life as hard and are willing to “just go along with things”…
Before you know it, we have a mini society: a village of people who work together automatically because it's just who they are, and it works for (most of) them.
So the question is: are cult leaders actually something that thousands of years ago was a benefit? Has society changed so that now they just don’t fit like they otherwise would in a healthy village? Or do parasitic personalities also evolve that have always been a detriment to the rest of us?
A substantive discussion of this topic isn't going to work if it veers into dissecting particular personalities.
Keep in mind that a lot of tech-adjacent writing these days is just AI debate (which is why he is not engaging with the actual cult literature or providing examples). This is not the place to debate the object-level arguments, but I will say I disapprove of Chapin writing like this and not owning up to the real purpose of this essay and the Bay Area dynamics he's criticizing. It is deceptive in precisely the way you inadvertently illustrate.
* His use is wrong, incidentally, both in the original abstruse decision-theory sense of the phrase as coined by Yudkowsky, ironically enough, and in the vulgarized sense of 'you should ignore small probabilities of very bad things' (because we are now far beyond some 'small' probability of AI, and AI risk is now considered so probable people like Geoff Hinton are quitting their jobs so they can speak out about it https://www.lesswrong.com/posts/bLvc7XkSSnoqSukgy/a-brief-co... )
Yudkowsky was definitely a case I thought of, but not the specific target of this article. AND, Yud is actually an example of one of these personalities that I think is probably net positive! Even if I have a lot of objections to the way he's presented himself and his specific arguments, I think he's doing a good job with moving the Overton window of taking AI seriously.
I grant that I probably do not fully understand Pascal's Mugging. But this tweet was a subtweet of someone who said that she was working on AI because an aligned AI would definitely end factory farming; whether or not this is true, it seems like the kind of thinking that will drive people crazy (all concerns must be subordinated to the One Great Cause).
It's made me very sad to see extremely smart people, whom I once looked up to and really found very interesting (you're included in here!), fall victim to the irrational fear of AI spurred on by Eliezer. You can create as much literature as you want, and you can create as many hypothetical scenarios as possible, but it's not going to change the fact that dedicating your life to controlling AI so that you can "logically" justify your own life is nonsense.
It's not that I don't get it. I just disagree. I don't think AI will create a science fiction horror show to end all of humanity. I don't think AI will cause humans to stop being humans. I don't think AI will create the end of the world. Yes, I am not 100% sure. There is a 0.01% chance of this all happening. But that doesn't mean that we need to pay attention to it just because multiplying 0.0001 by 1,000,000 yields a big number. I think basing your morality and existence on trolley problems and mental experiments is abhorrent. And no, I can't give you an exact spelled out reason as to why. Do I have to justify basic human morality?
You're an extremely smart person. I love your blog posts, and I love the depth and time you put into them. It's incredible how disappointing it is for me to see you become another "rationalist" obsessed with a fantasy of AI doom.
Edit for your edit: Someone who is wealthy and has nothing to lose quitting his job to join the cult of AI doomerism is, in no way, proof that AI xrisk is meaningful. You're using someone else coming to a conclusion as a reason you are correct. I trust you are smart enough to know why this is awful logic.
I found the last section shocking. Wanting to joust, jealous of this personality type? That's... really strange.