(Each chapter of Extropia’s Children can stand alone, but see also Chapter 1, “The Wunderkind”; Chapter 2, “This Demon-Haunted World”; Chapter 3, “Extropicoin Extrapolated”; Chapter 4, “What You Owe The Future”; and Chapter 5, “Irrationalism.”)
i. Dark self-proclaimed enlightenment
In November of 2013, Klint Finley, then at TechCrunch, published a remarkable and seminal piece entitled “Geeks for Monarchy: the Rise of the Neoreactionaries.” [Disclosure: I wrote a weekly column for TechCrunch through the 2010s.] He describes
a community of bloggers ... Neoreactionaries believe that while technology and capitalism have advanced humanity over the past couple centuries, democracy has actually done more harm than good. They propose a return to old-fashioned gender roles, social order and monarchy. You may have seen them crop-up on tech hangouts like Hacker News and Less Wrong.
Yes, that's right: monarchy. Not all neoreactionaries are openly fascist, but for “an ideology obsessed with both the mechanics of power and autocratic governance, and with aesthetics,” even calling it “fascism-adjacent” seems pretty generous. Finley traces their history back to 2007, which is accurate as far as that particular term goes.
But you can go further. An essay which “discusses the reaction of transhumanists to a small group of neo-Nazis who have attempted to attach themselves to the transhumanist movement” was dissected at length on the extropians mailing list in 2002. (Yudkowsky said, correctly, “[This was] a lone idiot who tried to join a webring that shut down and reincorporated with a new charter rather than accept him as a member ... And just to clarify: if there were more than one idiot, it would be a larger problem, but it still wouldn't be a part of transhumanism.”)
Finley goes on:
Neoreactionaries believe "The Cathedral" is a meta-institution that consists largely of Harvard and other Ivy League schools, The New York Times and various civil servants [...] the Cathedral regulates our discussions and enforces a set of norms as to what sorts of ideas are acceptable [...] So what exactly is the Cathedral stopping neoreactionaries from talking about? Well, the merits of monarchy for starters. But mostly, as far as I can tell, they want to be able to say stuff like "Asians, Jews and whites are smarter than blacks and Hispanics because genetics" without being called racist. Or at least be able to express such views without the negative consequences of being labeled racist."
There's nothing inherently monarchist about accusing “The Cathedral” of existing. (And if it does exist, that means there already is a kind of loose monarchy, so the neoreactionaries' real complaint is just that they're not part of it.) Many progressives, e.g. Noam Chomsky and (former extropians-list member) Charles Stross, have made essentially the same accusation. The semi-reasonableness of the diagnosis of “the Cathedral,” as opposed to the vileness of the neoreactionary prescription, may be what helped them gain a foothold on LessWrong.
ii. Neoreactionarationalist
That LessWrong became home to the neoreactionaries — who accounted, in turn, for only a small fraction of its conversations — seems fairly well accepted. What's more, “A frequent poster to LessWrong was Michael Anissimov, who was MIRI's media director until 2013. Last year, he penned a white nationalist manifesto. Contacted for comment, a media representative at MIRI said, "I'm disturbed to hear that he's become a white nationalist—I don't believe any of the staff who worked with him were aware of this."”
(Yudkowsky himself, to be clear, has been consistently anti-fascist and anti-authoritarian throughout his life, although he tends to avoid any ethical stance that could be interpreted as political, saying “politics is the mind-killer.”)
How did a custom-built online home for rational thinking attract a tiny but thriving coterie of, well, far-right authoritarian racists? The obvious objection is that there's nothing irrational about wanting to be oppressive king of the world; that's just evil, and the two are orthogonal axes. But that's too simplistic. Ethics have been important to rationalism since day one.
Their focus on existential risk is a more obvious parallel. As a commenter puts it:
I think I have grasped the link between [LessWrong] and [Neoreactionaries]. It's a mixture of having something to protect and extrapolating trends. Whereas singulatarians look at exponential trends in computing, extrapolate and see a future where some form of superintelligence will surely come to dominate, worrying that human values could be destroyed, the neoreactionaries look at the trends of memes and genes, extrapolate the exponential growth, and see a future where their ingroup and values are massively outnumbered, which can be a death sentence in democracy.
This does sound plausible. But another, far simpler explanation comes to mind: that the neoreactionaries made LessWrong their home mostly because LessWrong opened its metaphorical doors to them and made them welcome. This was equally true of the site that became the new home to most rationalist discussion after the great diaspora: Scott Alexander's Slate Star Codex.
iii. We must tolerate everything, especially intolerance
Scott Alexander is a Bay Area psychiatrist and a writer capable of absolutely magnificent, incisive, soul-wrenching work ... with whom I often strongly disagree. Some of his arguments are truly illuminating; some betray the intellectual side-stepping of a very smart person engaged in rationalization and/or unwillingness to accept that the rest of the world will not adopt his worldview. (Many of his critics, unfortunately, are inferior writers who misunderstand his work, and furthermore suggest it’s written in bad faith, which I think is wholly incorrect.) But in fairness 90+% of humanity engages in such rationalization without even worrying about it. Alexander does worry, and challenges his own beliefs more than most.
Yudkowsky's Sequences brought a small but passionate coterie of true believers to LessWrong, along with a larger nimbus of interested people. Starting in 2013, Slate Star Codex, Alexander's blog — a bit like LessWrong, but a) better written and b) focused on “science, medicine (especially psychiatry), philosophy, politics, and futurism,” a far more generally appealing set of subjects — brought many of those people to his own discussion forum, and then, over time, a larger and more influential audience than LessWrong had ever had. Oh yes. And the neoreactionaries too.
Alexander is so not one of them that he wrote an immensely long anti-neoreactionary FAQ. But he generally tolerated their presence, despite their manifest desire for authoritarian intolerance. The sympathetic interpretation is that he believed in the virtues of debate, that good arguments would drive out bad ones, that the correction for bad speech is better speech. The unsympathetic one is willful denial of the Paradox of Tolerance — “we should tolerate everything except intolerance” — making him “a dangerous sucker.”
In mid-2020, Alexander deleted Slate Star Codex in its entirety upon learning that the New York Times planned a long and bad article that would reveal his real name; the article itself ran in early 2021. (The blog has since returned as Astral Codex Ten.) This kerfuffle did, however, provoke a superb and nuanced New Yorker piece, highlights of which are worth quoting:
[Alexander] more than anyone has defined and attempted to enforce the social norms of the subculture, insisting that they distinguish themselves not only on the basis of data-driven argument and logical clarity but through an almost fastidious commitment to civil discourse [...] the British journalist Tom Chivers notes with an avuncular warmth that most rationalists seem constitutionally incapable of ordinary small talk
[His supporters include] the cognitive psychologist Steven Pinker, the economist Tyler Cowen, the social psychologist Jonathan Haidt, the cryptocurrency oracle Vitalik Buterin, the quantum physicist David Deutsch, the philosopher Peter Singer, and the OpenAI C.E.O. Sam Altman. Much of the support Alexander received was motivated simply by a love for his writing.
Alexander began as a contributor to LessWrong, and the center of rationalist gravity followed him, in 2013, to Slate Star Codex [...] Alexander strives to set an example as a sensitive, respectful, and humane interlocutor, and even in its prolixity his work is never boring; the fiction is delightfully weird and the arguments are often counterintuitive and brilliant. He has frequently allowed that a previous position he's taken is wrong [...]
Still, the rationalists' general willingness to pursue orderly exchanges on objectionable topics, often with monstrous people, remains not only a point of pride but a constitutive part of the subculture's self-understanding. They have given safe harbor to some genuinely egregious ideas, and controversial opinions have not been limited to the comments.
LessWrong, Slate Star Codex / Astral Codex Ten, and rationalism in general are notable for their openness to all forms of innovative and unusual thinking, and their willingness to casually reject the status quo. As is usually the case — see also the classic punk scene, which soon and inevitably attracted Nazi punks — such openness and iconoclasm is mostly admirable, but not without its dark side. Consider e.g. the cultlike activity described in Chapter 2, and other weird outcomes such as outright ecofascism. If you further suggest that the central problem your community is built upon is an x-risk so dire that it may justify any solution, such that any kind of thinking is open to serious consideration ... your community will inevitably become flypaper for fascists.
Going back to the punk analogy, there's a famous online anecdote:
I was at a shitty crustpunk bar once getting an after-work beer … So the bartender and I were ignoring one another when someone sits next to me and [the bartender] immediately says, "no. get out." And the dude says, "hey i'm not doing anything, i'm a paying customer." and the bartender reaches under the counter for a bat and says, "out. now." and the dude leaves, kind of yelling. I asked what that was about and the bartender was like, "you didn't see his vest but it was all nazi shit. Iron crosses and stuff. You get to recognize them. you have to nip it in the bud immediately. It's always a nice, polite one. And you serve them because you don't want to cause a scene. And then they become a regular and after awhile they bring a friend. And that dude is cool too. And then THEY bring friends and the friends bring friends and they stop being cool and then you realize, oh shit, this is a Nazi bar now. So you have to shut them down and ignore their reasonable arguments because their end goal is to be terrible, awful people.
The uncharitable interpretation of rationalism’s attitude towards neoreactionaries, which seems not unreasonable, is that its de facto leaders were metaphorical bartenders whose openness to unusual arguments and beliefs led them to accept “terrible, awful people” whose justifications were only ever a bad-faith cover for their terrible awfulness. As my Twitter friend Jonathan Korman says: “do not ask what they ‘really’ believe; ask what they would justify with their ‘belief’.” In other words, it's morally important to judge (self-described!) oppressive authoritarians based on what (they openly say!) they would do if empowered, rather than engage with their theoretical rationalizations. The punk bartender in the story understood that the general theory “serve every polite customer” applied almost always, but that under a very specific and unusual set of circumstances, putting it into action would lead to a highly undesirable outcome.
I realize I'm belaboring a fairly basic point here, but that's because this kind of Pascal's Wager consequentialism seems pretty alien to rationalists. They rarely seem to preemptively think, “what if I’m wrong?”, much less hedge against such an outcome. In fairness, this is true of the overwhelming majority of humanity! …but relatively few people portray their philosophy as The Optimal Way Of Thinking.
It's certainly not fair to say the neoreactionaries ever “took over” LessWrong or Slate Star Codex. They were only ever a small minority of either. But they certainly made a lot of noise, and tainted both; and as Anil Dash once put it, “If your web site is full of assholes, it's your fault.”
Of late Alexander has been writing as well and prolifically as ever, including about existential AI risk, about which he too now seems extremely concerned. In our next chapter we will end this series, inevitably, with an attempt to evaluate answers to the question: “how bad is this threat of human extermination that everyone keeps talking about? What is the fabled AI risk?”