Three situations which enforce or foster delusional beliefs
Sophisticated paths to stupidity. Possibly implicitly autobiographical
—Arrested Development
There are actually a lot more than three. But I want to highlight three of them here.
Steelmanning and Goodhart’s Law
Steelmanning is, on its face, meant to test arguments: can you articulate the best reasons against your position? Unfortunately, the purpose to which it is usually applied is to demonstrate strength, or to rationalize. I don’t steelman your position because I respect it; I do it to show off how smart and unassailable I am. It takes a heavily reinforced dam to hold back the cold waters of reality, and steelmanning is often a component of such a dam. For the sake of intellectual honesty, I think it is a necessary practice, but it is increasingly treated as sufficient for credibility, misused in a sort of rhetorical form of Goodhart’s law: a measure that has become a target unto itself. “If I can steelman the opposition, I must be reasonable or correct.”
Having loud and dumb critics and the Fool’s Transitive
At the end of Game of Thrones, would-be savior and liberator Daenerys Targaryen loses her mind and burns down a city. I think one factor that contributed to this is that she is completely convinced of the inherent justice of her cause, because many who support her are worshipping sycophants, but also, importantly, because the vast majority of those who oppose her over the course of her life are genuinely dumb and evil people: her abusive brother, the slave traders of Essos, the paranoid and tyrannical Cersei Lannister, et cetera. This is a delusion-breeding brew of feedback, because it teaches you that positive reinforcement is legitimate and negative reinforcement is illegitimate or easily dismissed. What is subtle here is that you are getting not only feedback, but feedback about feedback: negative feedback can be disregarded because its source is not to be respected. It teaches you, in effect, that your cause is inherently correct, because those who oppose it the loudest are so obviously incorrect. The wounds of a friend may be better than the kisses of an enemy, but if you primarily experience opposition in the form of bad-faith wounding, you learn to disregard any form of criticism. I think that much of what we call “polarization” is driven by this dynamic (both left vs. right and elitism vs. populism), which is itself a variation on the fool’s transitive: “The enemy of my enemy is my friend.” “My enemy is bad or dumb, so I must be good.”
Stepping on the Necks of Idiots and the Dunning-Kruger effect
If your project is a corrective to or rebellion against previous failed or failing projects, you can get effects similar to those of having loud and dumb critics. For example, a lot of people in tech think that because they recognize and “understand” the problems of social media or the “move fast and break things” ethic of early(er) Silicon Valley, they are properly equipped to avoid making the same mistakes; that the ability to diagnose planks in the eyes of others protects you from planks of your own. But the deeper problem with Mark Zuckerberg, for example, was not that he failed to establish a “Facebook Ethics Division” or whatever in 2005. The problem with Zuckerberg is that he was experimenting on humans. That project is, unfortunately, intrinsic to any sort of technological innovation. But people convince themselves they are doing it carefully merely because they are aware of some of the problems of their predecessors. I’m going to call this “stepping on the necks of idiots,” in contrast to “standing on the shoulders of giants” (as in Newton’s “if I have seen further, it is by standing on the shoulders of giants”). Successors of failed projects often gain a sort of pseudo-insight or -foresight by standing on the necks of the idiots who came before them. Of course, necks are a much less stable place to stand than shoulders, and this generation of tech gods is going to create at least as many problems as the previous one. Sometimes, studying history creates enough illusory confidence that it dooms you to repeat it, a sort of cousin of the Dunning-Kruger effect. Another example of this may be the effective altruism (EA) movement, whose initial inspiration was looking at the failures of charity projects that judged their effectiveness merely by how much money they raised and gave away, rather than by actual impact. EA rightly recognized that this was a bad approach, but has since fallen into many problems of its own.
I wonder how many of the ongoing and potential future issues with EA (see the Sam Bankman-Fried debacle) are due to hubris fostered by a justified-but-myopic critique of its predecessor.