How did Rationalism lead to the creation of a murder cult?

I apologize in advance for this post, which may not interest many of you, or may be too meandering and/or inconclusive for others. Some pieces I write for The Torment Nexus involve a topic where I have a clear point of view that has been established over time — copyright and artificial intelligence, the absurdity of the TikTok ban, etc. This is not one of those times. In this case, I'm trying to think through a particular subject, and writing about things helps me think through them in a more logical (one might even say rational) way. You are welcome to join me on this journey, but obviously you don't have to!
The subject or subjects of this newsletter have been sort of brewing in the back of my mind for some time, but they started to crystallize when I read about what is known as the "Zizian cult," a group of what appear to be mostly trans women who have adopted an extreme version of Rationalist philosophy, following the writings of someone who goes by the nickname Ziz (for reasons that are unclear). The person with that name was born Jack Amadeus LaSota, and some of her followers — she uses female pronouns — have been implicated in four murders: in one, an elderly man had his throat slit, after previously being stabbed with a samurai sword; in another case, a young woman allegedly killed her elderly parents in Pennsylvania; and most recently, a US border patrol agent was killed during a shootout in Vermont near the border with Quebec.
You may be wondering what any of this has to do with technology, which is allegedly the domain of this newsletter. What's interesting to me about the Zizians is that they seem to have taken a philosophical approach that one often finds among programmers and other Silicon Valley types, i.e. Rationalism, and pushed it to its logical — and possibly absurd — extreme. Rationalism has been around since Plato's time, but Rationalism with a capital R is a relatively recent invention, popularized by Eliezer Yudkowsky, who runs a site called LessWrong and co-founded the Machine Intelligence Research Institute. Scott Alexander Siskind, a practicing psychiatrist who writes a blog called Astral Codex Ten, is a prominent member of the community, and Elon Musk, Peter Thiel and others have been or are aligned with it.
(Note: I didn't mention it in the original version of this post, but there is some evidence to suggest that Luigi Mangione, who shot and killed the CEO of UnitedHealthcare, was possibly a member of what Siskind has called "the grey tribe.")
Rationalists often talk about how human beings are trapped by what are called "Bayesian priors," a reference to the work of the 18th-century statistician Thomas Bayes. Non-Rationalists might call Bayesian priors beliefs or superstitions. The implication is that smart people can get beyond these and move forward powered solely by rational thought and reasoning from "first principles." Alex Gendler described it as "the result of people with STEM educations attempting to tackle questions that had long been the purview of the humanities, guided by a stubbornly autodidactic conviction that definitive answers could be reached through a rigorous application of logic." Some Rationalists also believe that AI is an existential risk to humanity, an argument that was popularized by Yudkowsky. Members of this group are often called AI "doomers."
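For readers who don't know the jargon: a "prior" is simply the probability you assign to a hypothesis before seeing any evidence, and Bayes' rule prescribes how to revise that probability when evidence arrives. Here's a minimal sketch in Python (the numbers are invented purely for illustration) of why a strong prior can survive contrary evidence, which is roughly what Rationalists mean when they say people are trapped by their priors:

```python
# Toy Bayesian update: P(H|E) = P(E|H) * P(H) / P(E)
# where P(E) = P(E|H) * P(H) + P(E|not H) * P(not H)

def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return the posterior probability of hypothesis H after observing evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Start out 90% confident in a belief (a strong prior)...
belief = 0.90
# ...then observe evidence that is five times more likely if the belief is false.
belief = update(belief, p_e_given_h=0.1, p_e_given_not_h=0.5)
print(f"posterior: {belief:.2f}")  # ~0.64
```

Even after seeing evidence five times more likely under the alternative, confidence only drops from 90 percent to about 64 percent; it takes many such updates before the evidence overwhelms the prior, which is why Rationalists put so much emphasis on examining the priors themselves.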
Note: In case you are a first-time reader, or you forgot that you signed up for this newsletter, this is The Torment Nexus. You can find out more about me and this newsletter in this post. This newsletter survives solely on your contributions, so please sign up for a paying subscription or visit my Patreon, which you can find here. I also publish a daily email newsletter of odd or interesting links called When The Going Gets Weird, which is here.
Transhumanism and longtermism

There's an alternative group that believes that even if AI is a danger to humanity, we should help make it happen anyway, and then things will somehow get better (I am paraphrasing in possibly unfair ways). This is sometimes known as "effective accelerationism" or e/acc. Some or all of the beliefs associated with these groups have been referred to as TESCREAL, a term coined by former Google AI researcher Timnit Gebru and philosopher Émile Torres that stands for "transhumanism, extropianism, singularitarianism, cosmism, rationalism, effective altruism, and longtermism." Longtermism has been described as "a radical form of utilitarianism which argues that we have moral responsibilities towards the people who are yet to exist, even at the expense of those who currently do."
As mentioned above, Ziz was born Jack Amadeus LaSota. Her father is an academic who specializes in artificial intelligence. She has a degree in computer science and worked for a time as a programmer, and then started writing a blog called Sinceriously (archived here). Around the same time, she became involved with a group connected to the Center for Applied Rationality (CFAR), which is associated with Yudkowsky, and wound up living with another trans woman named Gwen, who would later be involved in at least one alleged murder. Ziz and at least a few other members of the group are also radical vegans, who believe that "flesh eaters" in some cases don't deserve to live.
Reading the Ziz blog is tough going, even if you know some of the background of the Rationalist movement, because it uses a lot of shorthand (the term Basilisk, for example, is a reference to Roko's Basilisk, the idea that a superintelligent AI might torture those who didn't work to make superintelligent AI a reality). In addition to reading the Ziz blog, there's a site called Zizians.info, set up by an anonymous person after CFAR kicked Ziz and her friends out for being dangerous. There's some good background in this Chronicle article, an anonymous Medium post warning about the group's involvement in the alleged murders, and a podcast from TrueAnon called "Evil Gods Must Be Fought." I also found a recent edition of Max Read's newsletter very helpful. Here's Read:
“Why does Rationalism have a tendency to produce cults, or at least cult-like clusters? The basic answer, I think, is that the whole Rationalist program effectively generates potential cult subjects. Here's an excerpted list of personal qualities for Rationalists: "Values curiosity, learning, self-improvement, figuring out what's actually true; will change their mind or admit they're wrong in response to compelling evidence or argument; wants to work collaboratively with others to figure out what's true; is nerdy and interested in all questions of how the world works and who is not afraid to reach weird conclusions." What you have is a person convinced of their own inadequacy, eager for social connection and personal development, and endlessly persuadable.”
Being a sociopath is an ethical necessity

At some point, as far as I can tell, Ziz decided that the best course of action she could take in order to prevent a hypothetical apocalypse was to deliberately become a sociopath; in other words, to stop considering what is right or wrong according to conventional morality (believe it or not, this is somehow connected to Ricky Gervais's character on the TV show The Office). This decision and everything leading up to it is described in almost hyper-rational terms. I should also note that a big part of the Ziz philosophy is the idea that, through training (including the use of both psychedelics and sleep deprivation), certain people can separate the two hemispheres of their brain into distinct entities with different personalities and even different genders, and then access each one separately.
In a comment on LessWrong, Siskind described the dynamics of a related Rationalist splinter group, the Vassarites, this way:
The combination of drugs and paranoia caused a lot of borderline psychosis, which the Vassarites mostly interpreted as success ("these people have been jailbroken out of the complacent/conformist world, and are now correctly paranoid and weird"). Occasionally it would also cause full-blown psychosis, which they would discourage people from seeking treatment for, because they thought psychiatrists were especially evil and corrupt and traumatizing and unable to understand that psychosis is just breaking mental shackles.
So why should we care about some fringe group of weirdos who convinced themselves that by not sleeping for days and taking LSD they were somehow altering their brain chemistry in ways that would help them prevent the AI apocalypse? Because, as Read describes it, those thought processes aren't really that far away from some of the thinking in other parts of the e/acc or Rationalist communities. In a sense, everything required to become a cult is already present in the Rationalist community (and plenty of other communities as well, I should point out). Yudkowsky wrote in 2007 that "every group of people with an unusual goal—good, bad, or silly—will trend toward the cult attractor unless they make a constant effort to resist it." David Z. Morris wrote in a post that:
"Apocalyptic predictions, asserted with a confidence buttressed by weaponized misuse of ‘Bayesian’ logic, is driving young people insane by convincing them that they must take extreme steps to halt an imagined doom. The source of the conflict is that these bad actors took Yudkowsky’s basic ideas, above all ideas about the imminent destruction of humanity by AI, and played them out to a logical conclusion - or, at least, a Rationalist conclusion. This wave of murder is just the most extreme manifestation of cultish elements that have bubbled up from the Rationalist movement proper for going on a decade now, including MKUltra-like conditioning both at Leverage Research - another splinter group seemingly pushed out of Rationalism proper following certain revelations - and within the Center for Effective Altruism itself."
Magical rationalism

I came across a couple of other blog posts while I was thinking about all this, and they did a good job of putting the Zizians and their version of Rationalism into a broader context. Fredrik deBoer describes it as part of what he calls:
A certain treacherous animal spirit stalking around, particularly among the young, a yearning for deliverance… and if they have to, they’ll take deliverance through violence. Our culture has erased transcendent meaning and left in its place short-form internet video, frothy pop music, limitless pornography, Adderall for the educated and fentanyl for the not, a ceaseless parade of minor amusements that distract but never satisfy. And people want to be satisfied; they want something durable. They want something to hold on to. They want to transcend the ordinary. We are approaching a new era of cults, of terrorist cells, of mass shooters pouring out of our society like ants from an anthill in pursuit of a discarded candy bar. We are bound to find some folk worship of Shiva, people kneeling in reverence before the bomb like they do on the Planet of the Apes.
Meanwhile, Mark O'Connell wrote in 2017 about the rise of something that he described as "magical rationalism":
"Magical rationalism arises out of a quasi-religious worldview, in which reason takes the place of the godhead, and whereby all of our human problems are soluble by means of its application. The power of rationalism, manifested in the form of technology — the word made silicon — has the potential to deliver us from all evils, up to and including death itself. This spiritual dimension is most clearly visible in the techno-millenarianism of the Singularity: the point on the near horizon of our future at which human beings will finally and irrevocably merge with technology, to become uploaded minds, disembodied beings of pure and immutable thought."
And finally, Nicholas Carr wrote in a recent edition of his newsletter New Cartographies about the resurgence of belief in the supernatural, something that in his case was triggered by Tucker Carlson's claim that he was attacked by a literal demon in his sleep:
"Contemporary man’s faith in reality, in an objective world defined and limited by its material qualities, is breaking down. We’re seeing a rebellion against the constraints of materiality, one manifestation of which is an increasing openness to supernatural visions. People have always, of course, been open to such visions. What’s different now is that the supernatural is moving from the personal sphere, where it had been sequestered by science, back into the public sphere. We viewed the information age as an age of reason, governed by the scientist and the mathematician. We’re now coming to realize it’s altogether different — an enchanted age, governed by the myth-maker and the mystic."
What does all this mean? I confess that I don't know. Does Rationalism appeal to people who are willing to pursue an intellectual goal all the way into insanity? Perhaps. Of course, plenty of people — whether they call themselves rationalists or not — have pursued unusual or even laughable goals into insanity, like the guy who shot up Comet Ping Pong because he thought Hillary Clinton was running a child-abuse ring from the basement. Otherwise normal Christians have done unspeakable things to their fellow human beings because someone misunderstood the text in an old book. As Neal Stephenson put it in Snow Crash: “We are all susceptible to the pull of viral ideas. Like mass hysteria. Urban legends. Crackpot religions. No matter how smart we get, there is always this deep irrational part that makes us potential hosts for self-replicating information.”
Got any thoughts or comments? Feel free to either leave them here, or post them on Substack or on my website, or you can also reach me on Twitter, Threads, BlueSky or Mastodon. And thanks for being a reader.