r/SneerClub archives
"Stop dissing Bayes, it's a valuable tool if wielded properly. Just look at the difference between P(you're a blowhard) and P(you're a blowhard | you'd readily nuke your blog sooner than discuss it with the ethics board)" (https://twitter.com/grkraml/status/1276911740526178305)

IMO the issue with most people using Bayes is that they use it to draw attention away from the assumptions they make.


I want to give you the probability that you are an idiot, given that you are a rationalist, P(A|B).

This probability equals P(B|A) * P(A) / P(B), the probability that you are a rationalist given that you are an idiot times the probability that you are an idiot divided by the probability that you are a rationalist.

I’ve now turned a rhetorical question (the usually unserious probability that you are an idiot | rationalist) into a “scientific” looking problem which relies on three additional unknowns (P(B|A), P(B), and P(A)) that I’ll probably fill in with guesswork (or uncertain/inaccurate data from a pdf I found) if I’m your average Twitter intellectual. But since we get a nice “scientific” looking result, we can ignore that it was based on bunk to begin with.
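The three-unknowns point is easy to make concrete. A minimal sketch, with all inputs deliberately made up, of how the "scientific" output just inherits the guesswork that went in:

```python
# Bayes' rule: P(A|B) = P(B|A) * P(A) / P(B).
# Every input below is pure guesswork, which is the point:
# the output is exactly as bunk as the guesses.

def posterior(p_b_given_a, p_a, p_b):
    return p_b_given_a * p_a / p_b

# One Twitter intellectual's guesses...
print(posterior(0.5, 0.1, 0.02))   # well above 1 -- the guesses weren't even consistent
# ...and a slightly different, equally defensible set of guesses.
print(posterior(0.05, 0.01, 0.02)) # a tiny "probability" instead
```

Note the first set of guesses produces a "probability" above 1, because nothing stops hand-picked values of P(B|A), P(A), and P(B) from being mutually inconsistent.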

Also, god forbid you ask them about the confidence intervals of their data and their conclusion.

I've always viewed it as cool math, weird prior percentages.

It’s brought up in Bostrom’s book, and it made a good case in that whole AI threat area.

But there is the Bo Winegard tweet:

“Would not a Bayesian assume, with caution and obviously without hate, that a particular stereotype is true unless contradicted by evidence? Should not our prior be that stereotypes are likely accurate?”

yeah, this is what Elizabeth Sandifer calls "literary Bayesianism" in *Neoreaction a Basilisk*. Their entire understanding of statistics is "Bayes good, frequentist *evil*." And they can't do the numbers for Bayes either. Because the numbers for Bayes are *hard*. Literary Bayesianism is so much easier than the kind with actual numbers in.

When I was still reading LessWrong, I went "OK, this Bayes thing sounds way cool, I'll take the first obvious step and *read a book* or something, this is a recommended starter textbook ..." *(starts book)* "holy Jesus fuck what the" (these people have, of course, never done the reading either, beyond skimming the available PDF fragments of Jaynes)

Like, *of course* priors are distributions, not single numbers. And not nice well-behaved distributions you have an *equation* for - no, they're weird lumpy irregular shit with normal bits in maybe. You're not talking about running a number through an equation - you're talking about running a couple of matrices through a transformation.

Once you realise priors are lumpy distributions, literary "Bayesianism" promptly falls apart, as a word game that none of its perpetrators are using anything resembling numbers for when they say "I updated on that." Like most Yudkowskian advances, it's just a verbal sketch of what their claimed solution might look like. (Maybe I'm being unfair here, but even the [SEP description](https://plato.stanford.edu/entries/epistemology-bayesian/) of Bayesian epistemology reads like a verbal sketch of how it might work in a world where humans could do that, rather than a thing you could actually do.)

I have had rationalists tell me that pulling a number out of their arse and running it through Bayes makes it somehow more reliable than just pulling it out of their arse. Handmade bespoke artisanal bias laundering. Garbage in, LessWrong out.
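For what "priors are distributions, not single numbers" looks like with actual numbers in: a minimal grid-approximation sketch, with toy data, where the prior and posterior are arrays over parameter values and updating is an array transformation, not one number through one equation:

```python
# Grid-approximation Bayes: the prior is an array over candidate parameter
# values (here: a coin's bias p), and updating is elementwise
# multiply-by-likelihood then renormalize.
n = 101
grid = [i / (n - 1) for i in range(n)]   # candidate values of p: 0.00 .. 1.00
prior = [1.0 / n] * n                    # flat here, but it could be any lumpy shape

# Toy data: observe 7 heads in 10 flips; binomial likelihood up to a constant.
heads, flips = 7, 10
likelihood = [p**heads * (1 - p)**(flips - heads) for p in grid]

unnorm = [pr * lk for pr, lk in zip(prior, likelihood)]
total = sum(unnorm)
posterior = [u / total for u in unnorm]  # also an array, summing to 1

# The posterior mode sits at the observed rate.
mode = grid[posterior.index(max(posterior))]
print(mode)  # 0.7
```

Even this toy version is a loop over a hundred grid points, which is the cheapest possible stand-in for the integrals real Bayesian inference requires.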
How Bayesian epistemology works in practice:

* I have Bayesian priors
* YOU have cognitive biases
* THEY are toxoplasmotic SJW filth

I dunno. Tell me I'm being unfair, and what I missed. I understand CFAR teach how to do "Bayes" in real life, in real time. Is there a good link available to what the fuck they actually perpetrate under this name?
I don't get the hype about Bayes. Sure, it is useful. For example, it tells you that Covid antibody tests with a single digit percentage of false positives (edit: out of a given number of negative samples) will give lousy results when only a small percentage of the population actually had the disease. But isn't this obvious to anyone with a basic training in statistics? What's so "magical" about it, why do some people who understand it consider themselves geniuses, and why do they believe they could "fix" things with it?
Correct, yeah, the weird thing about that one is that it's not really "Bayesian statistics", it's just an application of Bayes' rule, which is what a frequentist or whatever would do anyway. It is very annoying to me to see rationalists always trot out things like that as a way to introduce people to "Bayesianism". One problem is that there's a lot of equivocation about what's going on here because of some overlapping ideas. Bayesian methods in statistics are highly useful, but they don't necessitate any kind of "Bayesian epistemology", at least as a model of rationality, and absolutely nobody in statistics takes "fully subjective Bayesianism" seriously.
The legitimate use of priors and Bayesian stats is essentially to provide soft constraints that can regularize otherwise noisy or underspecified inference problems. You can take past data that you trust, come up with a simple prior outlining its overall shape, and then run your analysis on top of it, similar to running an analysis on all the data together. Even very weak priors can sometimes make noticeable improvements. Of course any version of Bayesian statistics that you can actually use won't work very well if you plug in your gut feelings for priors and only use a couple data points to justify the slightly nudged posterior.
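A minimal sketch of the regularizing use described above: a weak Beta prior shrinking a noisy small-sample rate estimate (all numbers invented):

```python
# A weak Beta(2, 2) prior regularizing a rate estimate from a tiny sample:
# 3 successes in 4 trials. The prior acts as a soft constraint toward 0.5.
alpha_prior, beta_prior = 2, 2
successes, trials = 3, 4

raw_estimate = successes / trials  # 0.75, very noisy at n=4
# Beta-binomial conjugate update: posterior mean of the rate.
posterior_mean = (alpha_prior + successes) / (alpha_prior + beta_prior + trials)
print(raw_estimate, posterior_mean)  # 0.75 vs 0.625: shrunk toward the prior
```

The prior behaves like four extra pseudo-observations of trusted past data, which is why even a weak one can noticeably tame a noisy estimate.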
The "magic" is that Bayes makes that seem obvious. It's like the Pythagorean theorem. It seems obvious once you understand it, for the complex problems it's not the solution on its own, and the people who used "Pythagorean" as an identity had beans for brains.
some libertarians apparently thought they could use it to get BS past people or something, i dunno
Reminds me of people who find out that public key cryptography is a thing, and who immediately conclude that blockchains will revolutionize the world.
[I have a post on this shit.](https://davidgerard.co.uk/blockchain/2019/04/29/what-ordinary-people-think-a-blockchain-is-and-the-weasel-term-blockchain-technology/) (Which you will of course be familiar with.) Favourite bit: the Web 3.0 dude insisting to my face that Certificate Transparency was a blockchain, when the guy who invented CT specifically said it wasn't and why. No, I didn't check if Web 3.0 dude was into SSC as well.
From that, they don't seem to! The total "Bayes" content is a Wikipedia summary of Aumann's Agreement Theorem, and definitions of "Bayesian Updating" and "Prior". Not a number to be seen.
oh jesus the 249pp Dragon Army manual


This is Chris Stucchio's entire gimmick. He's the guy who graced the internet with posts such as [Which is more dangerous, guns or gay sex?](https://www.chrisstucchio.com/blog/2016/are_gays_or_guns_more_dangerous.html) and [Why Can't Gay Men Donate Blood? A Bayesian Analysis](https://www.chrisstucchio.com/blog/2016/why_gays_cant_donate_blood.html), and whose takes are too spicy for even HN.
> However, I’ve found the following information which we can use to compute a rough guesstimate

and

> According to the Wikipedia article

clearly form the basis of a sound statistical argument. (Never mind that in the second link, they manage to be off by almost a factor of two in “guesstimating” P(HIV|gay).) Bayes’ theorem is very sensitive, specifically to small or uncertain probabilities, so it’s horrible statistics to feed it rough data, especially when those data have unknown certainty.
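The sensitivity to small probabilities is easy to demonstrate with a toy calculation (illustrative numbers, nothing to do with the linked posts): when the "guesstimated" prior is rare, being off by a factor of two in the prior puts you off by nearly a factor of two in the posterior.

```python
def posterior(prior, sensitivity, false_positive_rate):
    # P(condition | positive signal) for a given prior prevalence.
    num = sensitivity * prior
    return num / (num + false_positive_rate * (1 - prior))

# A guesstimated prior of 0.001 vs an equally plausible 0.002:
low = posterior(0.001, 0.99, 0.01)
high = posterior(0.002, 0.99, 0.01)
print(low, high, high / low)  # the posterior nearly doubles along with the prior
```

So a "rough guesstimate" in one small input propagates almost undamped into the headline result.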
> almost a factor of two

So they count 2 people more as having HIV than actually had HIV? That doesn't sound that bad to me, it is only 2 people and there are millions of gay people, I don't see the objection. :D

Btw, he also seems to count 'gay people who have HIV' and not 'gay people who are unaware that they have HIV', at least going by the text.

It is annoying how often they go 'Bayesian Analysis' when they just mean 'let's calculate the probability'(*). The latter sounds less special of course. Rationalists should do the 'stop saying Bayes 2020' challenge, and introduce a Bayes swear jar (for charity!).

*: the concept of 'this probability expresses the degree to which I think this might be true' is pretty implicit from the way the probability value is used.
I remember reading a nice criticism of how Bayes is used by uwu deep rationalists, and of its limitations, on a probabilist's web page, but I can't find it anymore :(
[The Cult Of Bayes](http://web.archive.org/web/20140221090455/http://plover.net/~bonds/cultofbayes.html)?
Good read but what I was thinking about was rather technical.

Bayes as done by statisticians: good shit if done well

Bayes as done by anybody else: what the fuck this is a statistical thing, are you going to do fucking Gibbs sampling in your head?