A far-out cognitive model and moral philosophy
March 2, 2025 3:54 AM

Our alleged shooters--it has become clear over the past week--weren’t the thuggish murderers and rapists of fevered reactionary imagination, but Rationalists: Members of a large and loose community of autodidact and amateur philosophers and activists focused on self-described rationalist inquiry and self-improvement that’s become influential in A.I. research and philanthropy. from The Zizians and the Rationalist death cults by Max Read posted by chavenet (52 comments total) 24 users marked this as a favorite
 
these people are arguably less respectable than thuggish murderers
posted by AlbertCalavicci at 4:43 AM on March 2 [4 favorites]


What group doesn't have cults associated with it? There were actual warnings about Ziz, not "warnings" as stated in the article.

High ambitions for improvement do make people vulnerable to cults, but that might apply to progressivism, too. I don't think there are mainstream political cults, but I might be missing something.
posted by Nancy Lebovitz at 4:56 AM on March 2 [3 favorites]


I'm fascinated by the phrase "vegan Sith."
posted by doctornemo at 5:56 AM on March 2 [9 favorites]


I have been so worried that this would be used as yet another pretext to target trans people and so far, despite the current anti-trans politics, that hasn't happened and the focus has been on the Rationalist/cult aspects. Knock on wood for that to continue.
posted by kokaku at 6:04 AM on March 2 [20 favorites]


My reaction to such stories now is 'how do I raise my boys so that they are inoculated against this sadness'. Right now all my money is on inculcating a cheerful sense of absurdity. Open to hearing any other ideas!
posted by SnowRottie at 6:18 AM on March 2 [19 favorites]


I can't think of a more polite way to say it, and I'm not sure why I should bother looking for one: the reason this story will not become part of the anti-trans dialogue is that this story is too long and complicated for anti-trans rhetoricians to process. After a few minutes, they'll just start to look for stories about transwomen using ladies' restrooms again. This is too much effort.
posted by kittens for breakfast at 6:18 AM on March 2 [16 favorites]


Nancy Lebovitz, above, is correct.

Since at least April 2020 according to the Internet Archive, someone going under the pseudonym Apollo Mojave* has been maintaining a website about the Zizians' dangerousness.

Here's the Internet Archive's capture from back then, but the site is still up if you want to look at the current version.

*at least I'm assuming it's a pseudonym given that Apollo Mojave was the name of one of the characters in a quadrilogy by Ada Palmer. If you are interested in the question "What if there were nations, each founded by a different TESCREAL** ideology?" then... well, the quadrilogy is about that.

**TESCREAL is an acronym referring to a group of loosely related ideologies: Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective altruism, and longtermism.
posted by pickles_have_souls at 6:44 AM on March 2 [12 favorites]


this story is too long and complicated for anti-trans rhetoricians to process

On the contrary, TERFs have been banging on about connections between transness and transhumanism for some time. It'll be the same old grist for the same sad mill.
posted by mittens at 6:55 AM on March 2 [3 favorites]


I mean the violence part doesn't seem to me to be out there or complicated. It's a group of people unwilling to accept coercion and willing to escalate indefinitely encountering a power structure willing to escalate violent coercion indefinitely.

It's just that without some zealous belief, most people are going to accept the coercion at some point and get on with their lives.

Open to hearing any other ideas!

Honestly, i think the way to avoid this sort of thinking is to know effective means to oppose and resist systems you think are oppressing you rather than giving up and going out in a blaze of glory such as it is.
posted by Zalzidrax at 7:12 AM on March 2 [6 favorites]


I don't think there are mainstream political cults

I'm sure I misunderstand you, but MAGA?
posted by ginger.beef at 7:17 AM on March 2 [18 favorites]


...assuming it's a pseudonym given that Apollo Mojave was the name of one of the characters in a quadrilogy by Ada Palmer

I would not be entirely surprised if that is not the only connection, because she was up to some shady "running a compound" shit in the 2000s.
posted by Weighted Companion Cube at 7:32 AM on March 2 [4 favorites]


I got a bit annoyed with the conflation of the real-world challenges operating AI systems safely and securely (eg: alignment, which is not a problem that needs air quotes) and the batshit crazy thinking that seems to permeate Singularity Worshippers in the tech exec class.
posted by simra at 7:33 AM on March 2 [5 favorites]


how to raise good kids? have them map everything through a lens of "is this kind to everyone, not just one group or people with power"
posted by seanmpuckett at 7:37 AM on March 2 [16 favorites]


I think humans form cults pretty often, including militaries, political parties, many active religious groups, and some companies, so alone "cults" seem neither avoidable nor descriptive enough.

I agree with kittens that this story sounds too complicated for right-wing rhetoric. Also, vegans are like 3.2% of the population in Europe now, and transgender people are like 0.1% - 0.6% of the population, so both are really large, diverse groups of people. Zizians#Members lists three people.

As always, your real question should be: Where does the magical thinking and money/power show up? Answer: ".. that’s become influential in A.I. research and philanthropy."

AI pumpers and AI x-riskers would not usually promote random killings, but now they do transform the vast wealth from AI fundraising into delusional frantic media that facilitates further AI fundraising and power grabs by billionaires. We do not care too much that this delusional frantic media causes a few random murders, but more how AI bullshit impacts overall political alignment and resource allocation.

Examples:
- Europe could suffer a nuclear meltdown in part because of aging nuclear power plants being kept online to fuel AI training. That future seems unavoidable in the US now.
- European leaders could inadvertently leak tactical details in the Ukraine war through collaboration with American AI companies, especially via Chat Control.

I suppose the Zizians might help discredit AI bullshit, but really I'd think that work lies elsewhere.
posted by jeffburdges at 7:44 AM on March 2 [4 favorites]


how to raise good kids? have them map everything through a lens of "is this kind to everyone, not just one group or people with power"

And crucially, "everyone" must mean "everyone who is alive right now, not some mythical 'everyone' comprised of fictional people we believe will exist in the future, and whose well-being supersedes that of the real people who are really alive today."
posted by kittens for breakfast at 7:55 AM on March 2 [18 favorites]


ginger.beef Yes, you did misunderstand me, and I wasn't ideally clear. I was thinking of moderate Democrats and Republicans. MAGA is radical, not mainstream.

As for raising kids who aren't vulnerable to cults: cults recruit people who are desperate and/or unmoored. Poor people, people who've lost a spouse, people who've just moved, which includes college students.

This means that the good will and/or good sense needs to be stable under pressure.
posted by Nancy Lebovitz at 8:12 AM on March 2 [3 favorites]


>MAGA is radical, not mainstream

MAGA is the movement that supports the literal president who won the popular vote and for some reason has an above 30% approval rating
posted by fomhar at 8:27 AM on March 2 [12 favorites]


I would not be entirely surprised if that is not the only connection, because she was up to some shady `running a compound` shit in the 2000`s.

Wait, Palmer or Mojave?
posted by GenjiandProust at 8:29 AM on March 2 [8 favorites]


On the contrary, TERFs have been banging on about connections between transness and transhumanism for some time.

Periodic reminder that to be a TERF you have to be from the Radical Feminism region of Ideology. If you're not from there and hate trans people, you're just a sparkling bigot.
posted by GCU Sweet and Full of Grace at 9:11 AM on March 2 [30 favorites]


Humans evolved in personality cults and most cultures contain junkyards of their failed prophecies (which is how they persist and why they have extensive pedigrees). Though cult leaders are more studied than followers they both go hand in hand, or lock and key. Cult authority is any narcissist finding validation through a following, while the followers are seeking acceptance, with no particular background having primed them for this. The follower's problem is never solved but actually compounded after recruitment. The nightmare of seeking acceptance by threat of removal has only begun for them. They can murder because they take on a gang persona, often having never had a sense of individuality. One is in a cult when words are used to teach fanatical responsibility and never again to describe reality.

Found this gem in the internet archive link above from pickles_have_souls:

In the mind of a Zizian, liberalism means the same thing it does for a Communist or Nazi: an opportunity for decadence, vice, and corruption. Because we "already know" the correct moral policy is a ruthlessly enforced altruism towards all or almost all living creatures; supporting liberalism is to participate in crime on a mass scale. They believe that now is the time to [betray our corrupt society] and establish a vegan hegemony.
posted by Brian B. at 9:25 AM on March 2 [5 favorites]


Why So Slow? discusses research that having been responsible for taking care of younger children is a strong marker for not being vulnerable to — I think it’s impulsive violence the studies were about, but despair in the large.

Might be correlation obvs.

Now I’m curious about how care experience affects vulnerability to TESCREAL, etc.
posted by clew at 10:11 AM on March 2 [2 favorites]


Been following this one lately and it's easily one of the most cyberpunk stories I've heard in years. Unfortunately their crimes are just garden-variety stupid and sad, and really make it seem these folks aren't smart in any sort of way at all.
posted by bannana at 10:18 AM on March 2 [5 favorites]


I don't think there are mainstream political cults

The main religious driver of the MAGA operation is the New Apostolic Reformation (plus Catholic integralism and some Orthodox elements - hence all the love for Russia); see the Texas Observer piece The New Apostolic Reformation Wants God's Government Back. Very mainstream, and the main reason Trump got in.
posted by unearthed at 10:56 AM on March 2 [8 favorites]


But when you poke around a little bit--when you read about how “Rationalism” plays out in its communities in practice--the “movement” starts to feel a little less STEM and a lot more New Age: A suspect program for self-improvement, with existential stakes, a strong emphasis on self-experimentation, and a deleterious commitment to openness. In this sense its predecessors are not really the original enlightenment rationalists, but the dubious touchstones of ‘60s-hangover California--Scientology and Dianetics, Werner Erhard and est, the Symbionese Liberation Army and the Manson Family.

I had been trying for years to figure out why the "Rationalists" annoy me so much, and this made me finally realize that they remind me of a former coworker who was raised in est and who could go from being a normal person, if a bit sheltered and self-centered, to saying the most batshit stuff so fast.
posted by hydropsyche at 11:01 AM on March 2 [10 favorites]


From the Wired piece, the one moment I laughed aloud:
He gave off the vibe of a private investigator, and when I introduced myself he confirmed as much, declining to give his name. “You can call me … Cliff,” he said, unconvincingly.
chavenet, thank you for posting this; edifying and sad.
posted by brainwane at 12:07 PM on March 2 [3 favorites]


"Why So Slow? discusses research that having been responsible for taking care of younger children is a strong marker for not being vulnerable to — I think it’s impulsive violence the studies were about, but despair in the large. "

Has anyone done any similar research on owning pets? They've saved a lot of people.
posted by aleph at 12:24 PM on March 2 [2 favorites]


Every generation gets the sci-fi cult they deserve... scientology, est, rationalism...
posted by kaibutsu at 12:25 PM on March 2 [2 favorites]


If only the older ones wouldn't keep coming back as well!
posted by aleph at 1:11 PM on March 2 [4 favorites]


One of members was a graduate of a local university (to me) and made the news here.
posted by Ashwagandha at 2:34 PM on March 2 [2 favorites]


A few thoughts:

1. Rationalists seem like a group of people who are a) fairly smart, b) mostly trained in the applied sciences, and c) usually not trained in the sort of thinking one gets in a Humanities curriculum. When they take on Philosophy, it's a bit like Sovereign Citizens taking on law or Alternative Medicine people taking on medicine -- you get a distorted form of the thing with no real understanding of the structure that underlies the practice. This is one reason why their manifestos are full of impenetrable jargon -- that's the way that a lot of Philosophy appears to the casual reader. As a bonus, it makes it almost impossible to argue with, because you can always retreat to a position of "you don't understand my words." Also, if your credo requires frequent references to popular culture, you are likely not that rigorous a thinker.

2. This is an environment that attracts a lot of fairly bright people, many of whom have caught the brass ring of a "good job in tech" but in an environment where out of control rent, degrading infrastructure, and precarious employment undercuts their sense of self worth and stability. By the odds, some of them will be dealing with any number of other issues -- mental illness, childhood trauma, gender issues, neurodiversity, etc -- which can make them even more vulnerable, especially if they lack a social safety net.

3. In this environment, development of cult thinking seems almost certain, once you have a large enough population and enough time. Grifters looking for donations to "stop rogue AI" or other grand disasters sow seeds that grow all sorts of unintended damage. There is also a strong strain of Libertarian delusion in the group, which further destabilizes people. "The strong individual" is a great idea if you are well-supported in your life; if you aren't, it's just another failure to experience.

4. I knew a guy who had lived in a bunch of communes across the US in the 60s, and he said that everyone in the scene had a vague sense that it was very likely that some group living situation or other was going to get really toxic, and that no one in his circle was surprised (although they were shocked) when the Manson Family was exposed. Given that, in the 60s, you could make a subsistence living for a fairly large group with the part-time labor of a few members, and these days there's a lot more stress just to maintain housing and food, the heat is turned up further.

5. So, on one hand, the specific ideology is garish and titillatingly outré, but the broad outlines of the group are neither unusual nor unlikely.

6. As for the trans angle, the Wired story makes it clear that the Rationalist scene has a fairly high number of trans people in it, so it's hardly surprising that any given group might have more than one trans member. I agree that it's probably for the best that the story is too complicated to explain for Right Wing forces to get much traction with it.
posted by GenjiandProust at 2:49 PM on March 2 [13 favorites]


CW on Ashwagandha's Waterloo Region Record link: it's really bad on names and pronouns
posted by scruss at 2:51 PM on March 2 [3 favorites]


Sorry about that - I clearly missed that.
posted by Ashwagandha at 3:42 PM on March 2 [2 favorites]


Oh, also:

7. Messing around with drugs, sleep deprivation, and trying to induce altered neurological states seems like a risky idea, even with excellent support and an otherwise stable life. There is a reason why even something as seemingly simple as meditation should be done in a group with supervision and regular check ins, and what these groups are doing is not that. Deliberately trying to induce paranoia in your friends is… not a great life choice.
posted by GenjiandProust at 4:13 PM on March 2 [7 favorites]


I was writing a comment to the effect that the only reason conservatives hadn't pounced on this story was because they hadn't heard about it yet, when I learned that Ziz has been apprehended and Fox News is spinning this as a story about the leader of a "transgender cult" who is demanding vegan food in jail. (Like so. Warning: Fox News.) So, here we fucking go.
posted by Ursula Hitler at 4:16 PM on March 2 [3 favorites]


What group doesn't have cults associated with it?

I wish more of the reporting on Ziz would dig into the history of the Rationalist movement. It’s like covering a murder-suicide by a splinter group from a radical church and quoting the church leader as a neutral scholar of theology. Eliezer Yudkowsky and Scott Alexander are damaged people, and in turn they have damaged so many others.

LessWrong, MIRI/CFAR, SSC/ACX, and all the communities that have sprung from them are not the same as bowling clubs or political action committees. Personality worship, narcissism, and abuse of power are common in human groups. Messianic fervor, repeated proclamations that all humans will die, delusions of grandeur, quasi-religious scriptures, arcane in-group jargon, persistent devaluing of out-groups and “normies”, persistent fascination with eugenics and IQ, persistent anger directed at feminists and social-justice movements, using your fan base for sex, explicitly recruiting youth with HP fan fiction and then graduating to doomsday erotica, funding the fanfic and doomsday erotica with nonprofit money, multiple suicides, multiple psychotic breaks, and multiple former members calling your movement a cult are not.
posted by the socks of enchantment at 4:34 PM on March 2 [22 favorites]


TESCREAL is a fucking cancer running through both the Valley and certain AI circles in particular. I’ve called out Yudkowsky as being among the worst of the cranks within the latter on a couple occasions and it was no surprise to read about him in a quasi-cult leader role, here.

And while most of what GenjiandProust wrote for #2 - and only that point - applies to me, I think the reason I found it so comparatively easy to bail out of early aughts transhumanism circa 2003 was because I grew up within an evangelical fundie framework so intense and all-consuming that it bordered on cultlike. I’d already seen this show dressed up with scripture, and I was absolutely done with losing years of my life to anything similar by the time I turned 19. I wouldn’t call my upbringing “lucky” by any stretch, but… I think it helped here.

I also think that Genji’s first point is spot-on, but I haven’t found a way to write truthfully about it that isn’t uncomfortably stigmatizing to people with both spectrum disorder and borderline personality disorder. As someone who is only in the first of those two groups, I need to leave writing about it to others. But: you’re not wrong.
posted by Ryvar at 6:53 PM on March 2 [9 favorites]


If you're worried about being infinitely tortured by an AI version of a psychopathic CEO/Old Testament god, maybe, step one, (and mind you, I'm no rationalist), don't create it?

I was brought up religious. It didn't stick. But, like Ryvar, it's proven a good inoculation against this particular flavor of insidious brainworm*.

The step from pursuing rational thought (generally: in favor), to almost literally recreating Abrahamic religion, complete with wrathful god (generally: not in favor), from first principles appears to be frightfully small.

I can't say it's super surprising; we are a superstitious lot, but come on team Rational Thought, I expected more.

* I don't want to paint brain worms with a broad brush. Some of them are fighting the good fight.
posted by chromecow at 8:37 PM on March 2 [3 favorites]


I feel like I should clarify a point that I think should be self evident, but all this.

I don't believe in super AI, any more than I believe in Nanotechnology (of the breathless grey goo variety), or personal jet packs. Any time we make technological breakthroughs, we envision the future they create in apocalyptic, messianic, god-like, magical ways. It is our nature. We envision Nanotechnology as microscopic 19th century factories. Atomic Energy will be too cheap to meter. AI will become gods that can resurrect us and torture us forever.

Some technologies change us forever in ways we can't even imagine. Some are apocalyptic. Internal combustion may yet destroy us as a species (but hey, don't worry about that one, techbros).

The one thing you can pretty much bank on is that whatever change is immanentized, it won't be in the way we predict.

Like, who expected the internet to metastasize global death cults?

posted by chromecow at 10:33 PM on March 2 [6 favorites]


I think John Brunner may have called that back in the 70s. Seriously, that guy was uncanny like Octavia Butler was uncanny. I wonder if they ever met at a Cassandras Convention.
posted by kokaku at 11:19 PM on March 2 [7 favorites]


"If you're worried about being infinitely tortured by an AI version of a psychopathic CEO/Old Testament god, maybe, step one, (and mind you, I'm no rationalist), don't create it? "

Some people were evidently permanently traumatised by I Have No Mouth And I Must Scream.

(If you don't get the reference, and maybe search, I suggest NOT checking out the actual story, Harlan Ellison's work is misogynist even by the standards of his time).
posted by i_am_joe's_spleen at 1:07 AM on March 3 [2 favorites]


research that having been responsible for taking care of younger children is a strong marker for not being vulnerable to — I think it’s impulsive violence the studies were about, but despair in the large.

Having been the primary caregiver for my kids when they were little, and now as teens, I want to contradict this. Knowing that they're going to have to come of age in this shit soaks me with deep despair, all the way to the bone. I don't have a basis for comparison, since I can't talk to the me that didn't have kids, but I often think that if I didn't have the girls, I could just walk into the sea once things get worse, as they surely will—CNN's main page right now is blaming Zelensky for the debacle in the White House—but I have to be around for them, and it's painful.
posted by outgrown_hobnail at 4:24 AM on March 3 [2 favorites]


"If you're worried about being infinitely tortured by an AI version of a psychopathic CEO/Old Testament god, maybe, step one, (and mind you, I'm no rationalist), don't create it? "

Some people were evidently permanently traumatised by I Have No Mouth And I Must Scream.


If you would like to grasp the substance of this formative text, the recent horror movie Skinamarink is pretty much the same thing without science fiction trappings.
posted by kittens for breakfast at 7:06 AM on March 3


this story is too long and complicated for anti-trans rhetoricians to process

But that's the thing, and the thing with "Rationalists" in general: it's not complicated at all.

There's just an illusion of complexity because of the smokescreen of jargon and seemingly-complex concepts. These Rationalists get so far up their own asses that they lose the simple ability to just call something bullshit, which makes them stupid and vulnerable. Ziz was telling people that some people have half-good brains, some people have double bad brains, but they are the rare person who has a double good brain. This is simply insane and doesn't require deep investigation to counteract; you just walk away immediately upon hearing this. Anyone who doesn't is themselves deeply broken.

To actually survive in the real world you need to be able to look at something and say "nah, this is insane" without having to reconstruct the world from first principles.
posted by star gentle uterus at 7:45 AM on March 3 [11 favorites]


I can't tell you how disappointed I was to hear that the Zizian murderers were a cult that formed around someone named Ziz instead of Slavoj Žižek.
posted by a faded photo of their beloved at 7:46 AM on March 3 [8 favorites]


I worked with two different Yudkowsky followers between 2006 and 2015. They tried to recruit me, maybe because I like to get into deep and long discussions based on silly premises that lead to completely rational but insane conclusions. Something I learned from the Jesuits and sci fi.

Having broken out of Catholicism and having had brushes with other cults when I was traveling lots and doing lots of drugs, I was never able to take the whole thing seriously.

To be completely honest, I find myself in a situation all these years later where I wish I had listened to them and followed their advice. My life would be a lot better now. They were telling me to buy bitcoin when it was worth less than a dollar.

Kind of joking in the last paragraph, in reality I was a bit worried what these guys were going to do with all that money. One is still in the Silicon Valley AI sphere, the other one is doing cool artsy things and funding a lot of open source projects and creative spaces.

And about children. I did not plan to have any. But I did, and it is great, and I cannot imagine a different life. It has made me kinder in some ways, more selfish in others, but it has completely removed the possibility of making radical life changes. The same way it makes it very hard to decide to just quit a job and move to another country to see what happens, which I did a few times before, it makes it unthinkable to join any radical organization. I’d need to know what school district the collective dorms are in, whether the cult's vacations align with the school calendar, etc…
posted by Dr. Curare at 7:50 AM on March 3 [2 favorites]


GenjiandProust, Mojave is the name of a fictional character who lives in like the 2400s, so I figure Weighted Companion Cube is talking about Palmer.

Weighted Companion Cube, I know Palmer a little bit, and I'm somewhat skeptical of what you're saying. I know that Palmer does live with a group of friends and that her housemates are a big part of her life, and in particular, a big part of how she copes with her disability; it feels potentially really hurtful to be accusing a person of "shady 'running a compound' shit", especially in those circumstances and without backing it up with some evidence, and especially when you're implying that she's in any way connected to the cult discussed in the main part of the thread.
posted by brainwane at 10:05 AM on March 3 [1 favorite]


brainwane, the Apollo Mojave in this case is a blogger (named after the Palmer character, likely) referenced in pickles_have_souls's comment towards the start of the thread. I’m honestly not sure who Weighted Companion Cube was referring to, since it’s not impossible that they know more about the blogger. Thus my confusion.
posted by GenjiandProust at 10:42 AM on March 3 [1 favorite]


GenjiandProust: AHHHHHHHH, okay, thank you for that - I'm sorry for my presumption and confusion and misunderstanding there!
posted by brainwane at 12:35 PM on March 3 [2 favorites]


Trueanon 👀👀
posted by youthenrage at 1:25 PM on March 3


I don't think there are mainstream political cults, but I might be missing something.

I heard there's one in the US that recently won a presidential election.
posted by ZenMasterThis at 4:07 AM on March 4


There's a longread in the Guardian that goes over the whole story so far: They wanted to save us from a dark AI future. Then six people were killed
(archive.is / archive.org)
posted by scruss at 8:29 AM on March 5 [3 favorites]


One of the intriguing features of the Terra Ignota novels is the way they reflect on certain preoccupations of the rationalist community -- should we go to Mars? should we destroy this world to create a better one? -- and the cult-like pathologies that emerge when a small group of highly intelligent people live together as housemates (or bashmates, to use the novels' own terminology).

As far as I'm aware, Palmer isn't directly involved with the Bay Area rationalist scene, but I think it's pretty clear from the novels that she knows people who are, or who are at least adjacent to that scene. It's just a pity that in our accursed timeline, we got Elon Musk rather than Cato Weeksbooth and the Chicago Museum of Science & Industry Junior Scientists Squad.
posted by verstegan at 12:31 AM on March 6 [2 favorites]




This thread has been archived and is closed to new comments