why?
Frankly, I don't feel it's productive or rational to attach the name of a biblical villain to new technology.
That reply, while informative, continues with the loaded religious terminology. It might find a better reception here if couched differently.
> Frankly, I don't feel it's productive or rational to attach the name of a biblical villain to new technology.
Well, frankly, I disagree. Humans have an inherent blind spot when it comes to complex systemic forces. We tend to imagine them as weak and irrelevant. Reframing them as villains seems to be necessary to understand their power and reach.
And, by the way, I had not considered the Moloch article as a direct reframing of a problem until you put it that way. Thinking about it in that light, I find humanizing `complex systemic forces` a rather novel transformation, and quite useful. Even having read the article a few times, I hadn't thought to describe it as such. But morphing a problem from a fairly inscrutable set of phenomena into a villain lets us bring a different set of mental tools to bear on understanding it.
Typically I had thought more restrictively about such transformations; for example, viewing a sound's waveform graphically can be illuminating in a certain sense (transforming audio-temporal into visual-spatial). The biggest issue with the toMoloch transform is that the conversion process is obviously going to be significantly noisier, and gives the author copious wiggle room to steer the reader toward their own conclusions. But just expressing the facets of the problem and making its existence better known has a lot of value. Anyhow, thanks for helping me see an article I have gotten quite a bit of insight out of in another way.
Most things I disagree with SSC on seem to be general rationalist beliefs and may also be found on places like LessWrong. These views are usually expressed less directly, and sometimes in comments.
For example, SSC and rationalists in general attribute very high value to IQ. SSC has some posts relating to ability, genetics, and growth mindset that I find very good:
http://slatestarcodex.com/2015/01/31/the-parable-of-the-tale...
http://slatestarcodex.com/2015/04/08/no-clarity-around-growt...
But, while I mostly agree with both of those posts, the continual claim that IQ is the best thing since sliced bread, that it's everything, correlates with everything, and is necessary for someone to reach certain heights, is something I find more dogmatic than rational. I think the IQ-is-everything model is too simplistic, and rather self-fulfilling, and if you have a lot of patience, you can extract my position on ability development from this old post: https://news.ycombinator.com/item?id=12617007
> And, by the way, I had not considered the Moloch article as a direct re-framing of a problem until you put it as such.
To be fair, I'm not sure if Scott Alexander meant it that way. There was a related post on the Goddess of Cancer, where I think the reframing part was mentioned. But I already believe that Moloch is a manifestation of a wider process, so the issue of explaining to someone how a blind process can have so much power is not new.
> The biggest issue with the toMoloch transform is that the conversion process is obviously going to be significantly noisier, and gives the author copious wiggle room to steer the reader toward their own conclusions.
I don't know that it really introduces significantly more noise than anything else. We're already surrounded by so much noise (much of it, I would argue, from the aforementioned process itself) that we need better means than simply hoping a given transformation was accurate. I.e., can we make predictions from the concept of Moloch? It looks to me like we can.
Generally, information needs to be routed to the right subsystems. Humans have a few subsystems that are really good at identifying an adversary or assigning blame. But they don't have any good subsystems for examining the situation itself unless they're already above it, nor can they assign blame to a situation, since they perceive it as neutral and inert. I would say the extreme informational loss from the inability to process the effects of systems and situations is so much larger than the added noise that the transformation absolutely needs to be done.