r/SneerClub archives
In which trans people are penis stealing witches (https://astralcodexten.substack.com/p/book-review-the-geography-of-madness)

> Now everybody is Westernized and has Western fears like vaccine injury or structural racism.

Oh for fuck’s sake, Scott.

does this not imply sissy porn would work

We should Clockwork Orange him with some, y'know, just to verify.
... Was there some discourse I missed about how sissy porn is bad because it doesn't work? Link?
Have you seen The Stone Tape? Like that but RW gender metaphysics instead of ghosts.

Weirdly he seems to be suggesting that GC anti-trans types are the penis stealing witches in the final paragraph.

That concluding paragraph seems like a really awkward attempt to tie in the thing he wanted to talk about (is gender dysphoria / transness culturally determined instead of being “real”, whatever “real” means in this context) with where he began, like you’re “supposed” to do when you write these kinds of essays. It’s all a bit tenuous, though, and doesn’t really work, so the essay ends up being something of a having-your-cake-and-eating-it hack job, with a mealy-mouthed conclusion at the end.

I preferred the Scott who wrote “The Categories Were Made for Man Not Man for the Categories” personally.

He requires a truly enormous number of words to explain to his audience that he thinks gender dysphoria is complicated and that he doesn't know how to accurately account for its causes.
> He requires a truly enormous number of words to explain ... that he doesn't know ...

This is every SSC essay ever?
The beigeness!
See I read the thing he *wrote* but I think I understood what he wanted to *say*.

This seems like a really suspect work of psychiatry. SA says all of the following things about culture-bound illness:

> THESE PEOPLE ARE NOT MAKING IT UP.

> People have died from these conditions

> their condition only afflicts them because they believe in it, much like with koro.

> Some people have the condition for a normal biological or psychiatric reason

> If your culture believes in science, they come up with a whole theory about how the Lyme disease spirochete can persist even after apparently successful treatment and cause chronic Lyme disease. If your culture believes in feminism, they talk about how patriarchal beauty standards cause women to have an uncontrollable urge to diet themselves to death in order to look sexy for men.

> I think you could probably have a culture where 99% of people were transgender

Culture-bound illnesses are real conditions with serious adverse health consequences. They are also entirely in your mind and exactly equivalent to pseudo-scientific conspiracy theories.

He notes that koro - the vanishing penis thing - is listed in the DSM-5. The DSM-5 categorizes koro as being related to OCD, and defines it as

> an episode of sudden and intense anxiety that the penis (or the vulva and nipples in females) will recede into the body, possibly leading to death.

The DSM-5 is explicit about some of the relevant distinguishing features of koro vs. other conditions:

> Koro […] is often accompanied by a belief that death will result. Koro differs from body dysmorphic disorder in several ways, including a focus on death rather than preoccupation with perceived ugliness.

There’s a pretty big difference between “haha they think witches stole their penis” and “they experience serious anxiety and fear of death while also being preoccupied by the state of their penis”. SA never once engages with the possibility that the anxiety and fear of death might really be the core issues at hand.

People conventionally use the word “belief” to imply the availability of choice and the possibility of change: you can readily believe different things after getting new information. A mental illness, in this sense, can never be the product of mere belief, and yet that’s the entire framework that SA uses for presenting it here.

Bizarrely, he writes this entire blog post without ever discussing treatment options for these conditions. How is it possible for a professional psychiatrist to write at excessive length about mental disorders without talking about their diagnosis or treatment?

Speaking of pseudoscience, he veers into some pretty weird and speculative stuff towards the end. For example, he says this:

> Chronic pain is unfortunately a bog-standard sensitization problem plus trapped prior; panic disorder is probably something similar.

I am not a medical doctor, but I feel pretty confident that “trapped priors” are not a part of the usual description of chronic pain or panic disorders.

Wait does a "trapped prior" mean it's trapped in the body?!
I *think* that by 'trapped prior' he means that 'if you have chronic pain or panic disorder you subconsciously expect to experience pain or panic, which leads to more pain or panic' but he can't talk like a normal person.
The lingo is very useful for some things and very very bad for others.
Thetans yo.
I was thinking how we do hold pain and emotions in the body more than thetans but sure.
...that wouldn't explain my back tension would it?
It very well might! I would suggest asking your back and any emotions you feel might be hiding out there, see what happens. (seriously.)
I will be 100% not surprised if they eventually come out with expensive seminars (only available to people high enough in the hierarchy) that promise to reprogram you by updating all your bad priors.
CFAR
Rationalist Conversion Therapy
hey baby you wanna get together and exchange some mutual information *waggles eyes*
Wtf lmao
So in other words, he's doing a rationalist variant on "chronic pain is all in your head, you hysterical crybabies?" As someone with chronic pain, I now want to megaton-punch this man through a wall more than ever.
> Bizarrely, he writes this entire blog post without ever discussing treatment options for these conditions. How is it possible for a professional psychiatrist to write at excessive length about mental disorders without talking about their diagnosis or treatment?

That would require having empathy and an actual desire to help others.
Another word for “prior” would be “schema”, which is definitely part of how a lot of academics think about chronic pain ([example](https://scholar.google.com/scholar?hl=sv&as_sdt=0%2C5&q=pain+schema&btnG=#d=gs_qabs&t=1677739275533&u=%23p%3D-c7hCGT6pA4J)). A trapped prior would be a self-reinforcing schema (I think), which is a common idea in psychological treatments of chronic pain.
It's possible that that's what scott alexander means when he says "prior", but that would still be bad for two reasons:

1. it would mean that he's pointlessly using his own neologism for an idea that already has accepted terminology, meaning that he either isn't familiar with the field of study that he's talking about or he's deliberately obfuscating the matter for his audience, and

2. it would still be pseudoscience.

A colloquial definition of "prior" might be "stuff that you believe is true before gathering more data", which is good enough if you're explaining it in 5 minutes to someone at a bar, but it's not the real definition of a prior. A prior is a probability distribution. And calling something a "prior" implies that you're going to use Bayes' theorem at some point.

The human brain almost certainly does not represent states of knowledge as probability distributions, and it also almost certainly does not use Bayes' theorem in any meaningful sense. You can find some papers claiming otherwise, but they're probably wrong; papers like that are usually written by biology-types who don't know the math, or math-types who don't know the biology, and who (either way) are always trying to make a splashy claim in order to get their papers published and cited.

I'll also note that there is, in general, no such thing as a "trapped prior"; that's a scott alexander neologism that isn't in common use because there's always a preexisting terminology for whatever he's using it to refer to. If someone needs a general term for "a thing that doesn't change because it reinforces itself", there are already many different terms for that. The one that mathematicians use is "fixed point", and it's very well-known.
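Since the thread keeps circling the word: the machinery that "prior" and "Bayes' theorem" actually refer to fits in a few lines. A toy sketch of my own (the dog-fear hypothesis and all the numbers are made up; nothing here comes from the linked post):

```python
# Toy sketch, not Scott's model or anyone's published one. A "prior" in
# the technical sense is a probability distribution, and Bayes' theorem
# says how evidence updates it. Hypothetical hypothesis H: "dogs are
# dangerous"; `prior` is P(H).

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H | evidence) via Bayes' theorem."""
    numerator = prior * p_evidence_given_h
    marginal = numerator + (1 - prior) * p_evidence_given_not_h
    return numerator / marginal

# A moderate prior moves quickly under repeated disconfirming evidence
# (each encounter is a friendly dog, far likelier under not-H):
p = 0.5
for _ in range(5):
    p = bayes_update(p, 0.1, 0.9)
print(p)  # tiny: the fear washes out

# A prior of exactly 1 never moves, whatever the likelihoods are:
print(bayes_update(1.0, 0.1, 0.9))  # prints 1.0
```

Roughly speaking, the only way a genuinely Bayesian prior can "never update" is if it is already 0 or 1, which is part of why "trapped prior" isn't standard terminology: in the actual formalism that's a degenerate edge case, not a mechanism.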
> it would mean that he's pointlessly using his own neologism for an idea that already has accepted terminology, meaning that he either isn't familiar with the field of study that he's talking about or he's deliberately obfuscating the matter for his audience,

I remember I read [this paper](https://www.nature.com/articles/nrn2536) in Nature Reviews Neuroscience a couple of years back. It uses a Bayesian approach to explaining hallucinations in schizophrenia. It has been cited 1400 times. They write "It has been proposed that a hierarchical Bayesian system might be a basic principle for brain function" and cite 4 different papers. So I don't think using language borrowed from Bayesian statistics is in any way fringe when it comes to perception and neuroscience.
I didn't say it was fringe science, I said it was pseudoscience :P

That paper is a good example of what I'm talking about. What it describes is not Bayesian reasoning. It's describing something like a residual model, but it's hard to say more than that because there aren't any equations in the entire paper. The same is true of [the most recent paper](https://www.cell.com/neuron/fulltext/S0896-6273(08)00456-X) that they cite for the Bayesian hierarchical stuff.

This is basically a careless and inaccurate use of terminology by people who don't understand it. They seem to think that any system that updates an internal state of knowledge based on new information counts as "Bayesian", but that's silly.

I do sympathize, though. Brain science is incredibly hard and people need to publish papers. It feels awkward to write things like "Here are some interesting experimental results. We cannot properly contextualize their significance because the brain is too poorly understood". And nobody is going to call you out on attributing your results to a "Bayesian hierarchy", especially when they're your colleagues and they also don't know what Bayesian reasoning is.

Credit to the authors of the paper that I linked, though, because they basically do say exactly that right at the beginning of their abstract:

> Perceptual inference is biased by foreknowledge about what is probable or possible. How prior expectations are neurally represented during visual perception, however, remains unknown.
It’s not trying to describe Bayesian reasoning, it’s about perception/hallucinations. Pseudoscience seems like a pretty harsh term for this. It seems like you want scientists to mean something very specific about how brains or neurons represent information through probability distributions, but they apparently don’t use it that way (because they’re not delusional about what we do and don’t know about the brain). That they mean something else by these terms doesn’t make it pseudoscience. You yourself earlier suggested using “fixed point” from mathematics, which would be a similarly loose and metaphorical use.

I think “rationalists” use “prior”, “updating”, and “Bayesian” in silly ways to sound smart, but surely you can see that it’s not some Scott-specific neologism? In the context of perception and interpretation of stimuli it ties into an active research programme in neuroscience. Maybe he does that badly, but it looks to me like you’re doubling down too hard here in saying that the scientists the rationalists are imitating are themselves terrible.
I personally do regard the use of terms like "Bayesian hierarchy" to describe the functioning of the brain as a terminological veneer that serves to obfuscate the reality of our ignorance rather than to accurately describe what we actually know. I don't think it's delusional, I think it's mostly just marketing, and in that respect it is symptomatic of some of the unproductive incentives that exist in academic science. The Rationalists are worse, of course, because they don't do it as marketing - they take it all quite literally. But I don't know that real science is served by indulging impulses towards marketing, because I think use of these terms has convinced some real scientists that there is something quite literally Bayesian about how the brain works, despite the fact that there is no evidence (that I know of?) to support that hypothesis. Hence my use of the word pseudoscience. And to be clear I object most strenuously to the term "trapped prior", which is certainly made up. Regular "priors" are a real thing of course.
> I personally do regard the use of terms like "Bayesian hierarchy" to describe the functioning of the brain as a terminological veneer that serves to obfuscate the reality of our ignorance rather than to accurately describe what we actually know.

That's a bit too inside baseball for me. I'm a psychologist, not a neuroscientist, but there are some researchers doing neuroscience in my research group who are keen on the (related) predictive processing framework, so if you can link to some good write-up of that criticism we'd find it interesting.

Regarding the "trapped prior", he links [the post where he defines the term](https://astralcodexten.substack.com/p/trapped-priors-as-a-basic-problem), and I don't think there's anything wrong with how he defines it or links it to perception:

> This is the trapped prior. It's trapped because it can never update, no matter what evidence you get. You can have a million good experiences with dogs in a row, and each one will just etch your fear of dogs deeper into your system. Your prior fear of dogs determines your present experience, which in turn becomes the deranged prior for future encounters.

It's rewording ordinary cognitive behavioural psychology from 40 years ago in the language of this "Bayesian" perception stuff, which may be unnecessary, but also fine if you view it as pedagogy for a crowd used to thinking in those words. Following those definitions (or translating them to corresponding terms), I don't think the original quote about pain and panic comes off as pseudoscientific.
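To make the "self-reinforcing schema" / "fixed point" translation concrete: here is a toy iteration of my own invention (not Scott's model, not a real treatment model; the sharpening function and weights are arbitrary). Belief distorts perceived evidence, and the resulting map has two attracting fixed points, so a strong enough starting fear stays stuck even though every encounter is benign:

```python
# Toy model (my own construction): belief-driven distortion of evidence.

def sharpen(b):
    """Confident beliefs exaggerate themselves (arbitrary distortion)."""
    return b**2 / (b**2 + (1 - b)**2)

def encounter(belief, evidence=0.0, evidence_weight=0.1):
    """One benign dog encounter: mostly distortion, a little real evidence."""
    return (1 - evidence_weight) * sharpen(belief) + evidence_weight * evidence

high, low = 0.95, 0.30  # strong vs. mild starting fear of dogs
for _ in range(200):
    high = encounter(high)
    low = encounter(low)

# Mild fear decays to zero; strong fear converges to a nonzero fixed
# point (~0.885, the larger root of 2b^2 - 2.9b + 1 = 0) despite 200
# consecutive benign encounters.
print(round(high, 3), round(low, 3))  # prints 0.885 0.0
```

The point of the toy is just that "a belief which reinforces itself faster than evidence erodes it" is ordinary fixed-point behaviour; no Bayesian vocabulary is needed to describe it.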
I'm not sure if anyone writes up criticism of this stuff. The criticisms are kind of obvious if you know the math and modeling well, but neuroscientists tend to focus on other things than that. The best advice I'd have for your colleagues is to hit the math really hard, otherwise they'll just end up (at best) reinventing stuff or making a big deal out of trivialities and thus accomplishing nothing. And I don't agree that it is appropriate for Scott Alexander to be making up words to "educate" his audience. His audience doesn't even understand math for the most part, so I don't see how making up words to appeal to their fetishism of Bayesian probability could help. A good educator, at the very least, gives their audience the right google search terms.
Ok so let me get this right: The pressure to publish leads researchers to use Bayesian terms when talking about the brain (and then get published in one of the most prestigious journals in neuroscience and cited thousands of times). There’s a critique of that that renders the whole thing Pseudoscience! However, you also think no one writes and publishes that, because it’s “kind of obvious if you know the math”. I’m starting to think you don’t know what you’re talking about. Sorry.

If there’s a good faith reading of this I’m missing somebody please let me know. Good old SS never ceases to amaze.

I can't believe I just read more-or-less all of that, but I think the idea is more that trans people are "penis stealing witches" in the sense that they are the focus of irrational cultural hysteria, which isn't wrong. In other words, this is Scooter's version of pushing back against the current transphobic moral panic. It's badly done of course. Which isn't to defend the piece, because it's pretty much shit.
Scott appears to have seen a cool documentary about culture-bound health issues, found the phrase "penis-stealing witches" entertaining (understandably), and applied his trademark move of connecting it to a culture-war flashpoint and writing in such a roundabout and long-winded way that, regardless of his actual point, you can plausibly argue that he's anywhere in the neoliberal -> libertarian -> cryptofascist pipeline.
[I'm playing both sides, so that I always come out on top](https://i.kym-cdn.com/entries/icons/original/000/036/386/I'm_Playing_Both_Sides_Banner.jpg)
[deleted]
They always seem to come back, though. What I find interesting about this whole thing is that I don't think koro meaningfully connects to gender dysphoria at all. It might, but there are a hundred reasons I don't think it does. So the sudden swerve at the end toward "we could have a comptrans culture, huehue, but we don't because ew" and other things was like *ohhhhh*, right, I'm dumb haha. GCs are *obsessed* with transmascs. I don't think it's that he forgot them so much as that blaming groomers for stealing the pool of available fertile teenagers kinda gives the game away.

I like that he jumps through all those hoops to explain why he felt gender dysphoria was cultural and not biological, just to say it doesn’t matter. Then what was the point of the article, Scott?

It certainly wasn't what you assume it was~ that would be Bad Faith~

Wait, he has a wife?

a beard imo but yes
Why do you say a beard? Who the heck is it anyway?
Because he’s apparently having sex with men at rationalist meetups while pretending to be a straight dude looking forward to having a son. It’s really weird. But I’ve noticed the weird biphobia or whatever before.
Is this real, or just some kind of rumor that's tempting because it fits our bias? Plus, like, having an open marriage (as opposed to a beard) isn't exactly improbable given how much the rationalists love polyamory. Though my impression is that usually it's exclusively straight guys with straight/bi women.
It was making the rounds among Twitter GCs, and I got shown a picture of the other guy. British public school toff. I didn’t go around looking for this one - it came to me. I honestly found these guys to be hostile to queerness that wasn’t poly.
Also I just want to point out that if these kinds of rumors are reaching me, in the middle of nowhere, they're getting to other people.
Ooooooo 😯
> Because he’s apparently having sex with men at rationalist meetups while pretending to be a straight dude looking forward to having a son. Its really weird.

Didn't Scott date a transman (Ozy)?

Edit: Yes, I checked and this person identifies as AFAB, transman.

Edit2: Scott refers to them as ex-girlfriend in this [post](https://astralcodexten.substack.com/p/theres-a-time-for-everyone) so I have no clue, but other sourcing says the person themselves is AFAB, transman, they/them.