The Ruffian

Rationalist Cults of Silicon Valley

Innovators and Cranks

Ian Leslie
Aug 16, 2025
Some Zizians. Ziz LaSota is top left. Via NYT/AP.

Catch-up service:
The Data Says We’re Doing Fine So Why Are We Furious?
How Humans Got Language
The Diderot Effect
27 Notes on Growing Old(er)
Donald Trump and the End of Power
5 Reasons There Won’t Be an AI Jobs Apocalypse

Foremost among San Francisco’s tech subcultures is a group devoted to improving the human capacity to think clearly. The rationalist community is interested in how to compensate for the systematic cognitive errors to which the human brain is prey, with the aim of improving society. It valorises intellectual fearlessness: rationalists seek to overcome the laziness and cowardice which bedevil conventional minds. Every chain of reasoning must be followed remorselessly to its logical conclusion, even when the conclusion is uncomfortable.

The tech companies are full of rationalists. Peter Thiel, Elon Musk, Sam Altman and others have all been influenced by or adopted ideas from rationalism and its sibling, effective altruism. A central concern of the group is the existential risk posed by superintelligence, or AGI. Many rationalists believe we may be on the brink of a species-ending apocalypse and see themselves as working from the inside to head it off.

Rationalism is an admirable project in many ways. Some of the most acute and exciting thinkers of the last decade have been rationalist or rationalist-adjacent. Thinking about AI safety is good. But push too hard on rationality and you encounter its opposites: fantasy, mysticism, madness. It’s not accidental that a few splinter groups within the rationalist community have devolved into dysfunctional cults (some prefer the more polite term “high-demand group”).

The darkest of these is a group of vegan anarchist transhumanists called the Zizians, which is now connected to six violent deaths. The Zizians are headed by Jack ‘Ziz’ LaSota, currently held in custody in Maryland. LaSota is a male who identifies as a woman.1 She studied computer science at university, and spent a lot of time on the rationalist website LessWrong debating AI safety and logical paradoxes. After college she moved to the Bay Area with the stated intention of saving the world. She didn’t hold down a job, but made a name for herself as an ostentatiously weird member of the rationalist community, wearing a black cape and claiming her religion was Sith. She became obsessed with a rationalist theory that one should never back down from a confrontation.

A charismatic figure, Lasota attracted a circle of acolytes, mostly transgender women or nonbinary people, all highly educated, many with jobs in tech and quantitative finance. The group developed its own set of idiosyncratic beliefs and practices, like the idea that people can be trained to sleep in one brain hemisphere at a time, or that transgender women have a distinct cognitive profile suitable for AI research.

The Zizians came to despise mainstream rationalists, and aggressively disrupted their events (the police had to be called to one such protest; officers reported that when interviewed, the protesters emitted strange sounds, as if speaking in tongues). Anna Salamon, a sensible rationalist, told the New York Times she had the impression LaSota wanted to feel special but found herself outshone by others in the tech and AI safety scene. LaSota and friends “came here hoping to become one of the main characters in the story, and then found out that they didn’t get to be one of the main characters. And then I think they were like, ‘To heck with that — we’re going to be the main characters anyway.’”

Then the serious violence started. Zizians lived together on boats on the ocean (the “Rationalist Fleet”) and later in trucks on a plot of land owned by an 80-year-old landlord called Curtis Lind. During a dispute over rent, one of the group stabbed Lind, who pulled a gun and shot two of his attackers, one of whom died. Lind survived that attack, but in January this year he was stabbed to death.

Maximilian Snyder, a computer programmer and former Oxford University student, is alleged to have killed Lind to keep him from testifying against the Zizians. Snyder never met LaSota, but read her blog. In Vermont, a couple of Zizians got into a gun battle with a Border Patrol agent; one Zizian and an agent were killed. The parents of another Zizian have been found dead, shot in the head, in their Pennsylvania home.

Most rationalists are appalled by the Zizians. But as Ozy Brennan - a writer and member of the rationalist community - observes in a fascinating essay, their case is not wholly anomalous. Brennan knows of about half a dozen groups from the community who have gone off the deep end (including two separate groups who communicate with demons). He mentions Leverage Research, which engages in aggressive and intense ‘debugging’ of the brains of employees, akin to Scientology. This group aims to literally take over the world (you can read more about them here). Then there is Black Lotus, a group founded at Burning Man by an allegedly abusive man. This group “developed a metaphysical system based on the tabletop roleplaying game Mage the Ascension.”2

Brennan describes some rationalists as anxious, damaged people in thrall to perfectibility - the idea that they can eliminate mental flaws and become superior human beings. Some also have a yearning to be heroic.3 The more extreme groups attract these neurotic and aspirational over-thinkers, particularly those who lack the technical expertise to land jobs as AI safety researchers. Combine that with an obsessive disposition and a vulnerability to charismatic personalities, and you have a potential cult member.

After that, the ideas do the rest. A rationalist principle is that one should take “ideas seriously” - once you are convinced by a belief, you must act on it. Another is that you should have “high agency”: pursue rational goals even if it means breaking convention and being penalised for it by society. A third is consequentialism - that means are justified by ends. There’s nothing inherently wrong with these principles but when attached to harmful ideas they form a dangerous and volatile mixture, especially when shared within a tight-knit and isolated group.

In Conflicted I wrote about how one of the reasons for the breakdown of negotiations over the Waco siege was that the FBI couldn’t quite believe that the Branch Davidians believed what they said they believed about the imminent end of the world.4 While some cultists are undoubtedly manipulative psychopaths, we tend to underestimate the sincerity with which people can hold wacky beliefs. As Brennan puts it, “When the Zizians stabbed their landlord…or got in a shootout with the cops, this was really actually because they believed an obscure version of decision theory that meant that you should always escalate when threatened.”

If you’re convinced that you are the only people who can save humanity from extinction then your concept of justifiable means can become rather capacious. (One rationalist tells Brennan, “Something, something dead babies justifies an awful lot”). Rationalist cults are petri dishes for high-IQ irrationality. Intelligence and idiocy are not mutually exclusive ingredients of thinking; they intermingle and compound. Very clever people are better than most at the rational justification of stupid beliefs.

Silicon Valley is full of magical thinking. Those drawn to technological and scientific innovation tend to also be drawn to the fantastical, and it’s not always easy for them to tell which is which. After all, to be a truly great innovator, you must vividly imagine things which don’t exist and be certain that the world will be improved by them. You must be impervious to the rest of the world when it tells you your ideas are ridiculous. It helps to see yourself as the main character of an unfolding historical story. These characteristics are true of the start-up founder and the cult leader.

Tech executives are prone to grandiose, quasi-mystical rhetoric, and, as has often been observed, the mental model that many of them have of AGI is essentially a religious or supernatural one: something is coming, we know not what exactly, that will either save humanity or destroy it. Ilya Sutskever, the engineering visionary behind OpenAI who left the company last year to work on AI safety, told his staff in 2023 that the company needed to build a bunker to save its scientists and engineers - the elect - from the very technology they were working on. A former colleague said, “There is a group of people—Ilya being one of them—who believe that building AGI will bring about a rapture. Literally, a rapture.”

In this context, summoning actual demons does not seem so far-fetched. Indeed, Peter Thiel, who is both a rationalist and a Christian, talks in vague terms about the coming of the Antichrist. As for Elon Musk - well, let’s not even go there. But this is not a new phenomenon. One of Musk’s most illustrious predecessors in rocketry, Jack Parsons, offers a dramatic example of how tech innovation and occult fantasy have long been intertwined. Parsons, who was from Pasadena, California, was an engineering genius who played a central role in the invention of space rockets. He co-founded the Jet Propulsion Laboratory, now part of NASA.

When Parsons and a group of misfits known as ‘the Suicide Squad’ began experimenting with rockets in the Pasadena desert, in the late 1930s, nobody in the scientific or military establishment believed it was even possible to send a rocket into space. To suggest otherwise was to be marked as a crank with a mind addled by science fiction. Only in 1943 did the US military start thinking there might be something in it and begin sponsoring Parsons’s group. (The Jet Propulsion Laboratory is so called because “rocket” still carried a whiff of Buck Rogers, whereas “jet” sounded modern and scientific.) Eighteen years later, President Kennedy announced a mission to the moon.

Parsons didn’t get to see that: he blew himself up at home in an accident with rocket fuel in 1952, aged 37 (a crater on the moon is named after him). He was a great innovator; he was also an epic crank. During his early work with rockets he was taking a lot of drugs and participating in a polyamorous devil-worshipping cult. He later became a devotee of the English occultist Aleister Crowley, and a key member of Crowley’s group in the US. After the war he became close to a former naval officer called L. Ron Hubbard, with whom he took part in a series of magical rituals aimed at manifesting the goddess Babalon. (Hubbard borrowed all his money, and his wife, and ran off with both, before founding Scientology.)

Most innovators are not cranks and most cranks are not innovators, but the line between the two mindsets is blurry. Similarly, rationalism is mostly, well, rational, but inside it is a germ of madness waiting to infect a vulnerable mind, group, or entire population.



This post is free to read, so feel free to share it if you enjoyed it, and sign up to The Ruffian if you haven’t already by hitting the button above. In this cult, paid subscribers are the elect. I literally couldn’t do this without you. You will get your reward in heaven and also after the jump.

Speaking of which, after the jump: a Rattle Bag of things interesting and beautiful…
