r/SneerClub archives
Rationalists take another small, tentative step towards disavowing their prophet (https://www.reddit.com/r/SneerClub/comments/10x17xb/rationalists_take_another_small_tentative_step/)

Original LessWrong post

Transitioning from a cult to a religion involves a challenging stage during which the original messianic prophet is somehow made to retire, to ultimately be replaced with a more enduring and institutional means of organizing the group. This is a tricky process that includes extracting the most useful ideological elements from the otherwise incoherent medley of grandiose nonsense that constitutes the original prophet’s teachings. The prophet’s undesirable teachings must be discredited in a way that preserves, or even amplifies, their useful teachings.

One LessWronger makes a contribution to this effort on behalf of Rationalism by providing compelling evidence that Eliezer Yudkowsky does not know more about economics than all mainstream economists.

The economics under consideration involve monetary policy in Japan, of course. Much as crackpot physicists inevitably turn their attention to Einsteinian relativity or quantum superposition, crackpot economists are inevitably drawn to Japanese monetary policy, and EY is no exception.

The specific claim that EY makes is not particularly important; it essentially boils down to “economists believed X would happen, whereas Yudkowsky believed Y would happen, and in one particular instance Yudkowsky was proven to be right”.

But, as this LessWronger shows by using graphs from the U.S. Federal Reserve, Yudkowsky clearly was not right. They find this to be concerning because

…this error undermines a significant part of Yudkowsky’s thesis. This example was one of two major anecdotes that Yudkowsky presented to show that he can often know better than experts, and he cited it repeatedly throughout the book. Yet, I think he got it wrong.

Questioning the prophet is always a risky activity, though, so this LessWronger leaves the obvious implications unsaid. None of the commenters have attempted to defend Yudkowsky’s thesis as of the time of this writing.

One thing that I think is worth noting is that this is only an incomplete revelation about EY’s failings as a great thinker in this instance. The thing is, even if Yudkowsky’s prediction had been correct, it still would not constitute compelling evidence that EY knows more about economics than economists do. Given a choice between “event X will happen” or “event Y will happen”, someone who knows nothing about the matter at hand can still make a correct prediction 50% of the time. Yudkowsky’s original reasoning about why a single correct prediction demonstrates his genius was always faulty.

I think that this particular revelation will be difficult for Rationalists to arrive at, though. If Rationalism has any ideological elements that are worth salvaging, then its emphasis on mathematical thinking about uncertainty is presumably one of them, but it seems like it will be difficult for them to reconcile that with a prophet who clearly does not understand basic probability.

EDIT: There’s also a Twitter thread. Upon receiving even mild pushback, the author backtracks and reaffirms the strength of his faith:

I don’t think this error comes close to undermining the whole book. I think lots of people would still benefit from reading it, and I still think the main claims of the book are broadly true.

If you spend enough time around that shithole, you do actually start to notice more people beginning to, in so many words, call Yud a dipshit. Paul Christiano, for instance, has recently been getting in weird little jabs, in the only way a rationalist can (obliquely and with far too much text).

No (direct) comment on the transition of Rationalists to religion (aside from hearty agreement that the process is common, and underway, and will likely lead to something like a religion going forward), but if you would like a fictional account of a technoshamanic belief structure focused on massive intelligences beyond human ken, I would recommend The Broken God, by David Zindell, and the subsequent books if it strikes your fancy.

It’s wild to me that the Rationalists, especially this batch of them, are so interconnected with science fiction (Yud is an SF author who skipped the “writing books” stage and went straight to “believes his own premises”) and yet seem so deeply invested in reproducing cautionary plots from science fiction they probably should have read.

I'm convinced that Yud in particular simply has not read as many books as he likes to insinuate. Perhaps he was an avid reader in his teens/early twenties and then, when he started writing, he found less and less time for it. There are areas in the sequences where he describes phenomena covered extensively in literature, oftentimes particularly in science fiction, as though he were discovering them for himself for the first time through his own self reflection. Given the propensity for pop culture reference within his writings he sure does miss a lot of glaringly obvious comparisons to incredibly well known Sci fi.
It's glaring enough that I honestly believe it's intentional to a degree. Yud does a good job of the cult leader's "approved writings" bit, and he put out so much shit that it's easy to never have any "outs" from your knowledge base to compare against.

From what I've read of his fiction (a few short stories), he actually has a good sense for premises. Baby Eating Aliens, for example, has some moderately novel stuff going on in how he sets up the aliens. But, like Kilgore Trout, he can't execute with anything good because he's kind of a shit writer on the nuts and bolts. My suspicion is that he's familiar with a lot of the literature up to a certain date, because he wanted to be a science fiction writer until he found an easier grift. There are too many resonances with fiction that was overwhelmingly popular in-genre for him to have missed it. At the same time, I can see him genuinely believing he's come up with this shit while having been unconsciously influenced by it. The combination means he's recreating a lot of stuff, slapping a "rationalist fiction" label on it, and calling it a day.

I mean, look at this (likely self-authored) introduction to his work:

> Eliezer Yudkowsky helped kickstart the genre of rationalist fiction, which is about characters who solve the problems in their world by thinking, in a way where the reader could figure it out too.

Like, this is mystery fiction, it's a century old! Arthur Conan Doyle famously had the same idea!
You seem to be talking specifically about the fair-play whodunnit, where the reader gets enough necessary details that we could reasonably reach the correct conclusion, which to my best recollection is not a characteristic of Sherlock Holmes stories. Yud's definitely not the first to do it, and I don't think hpmor is an excellent execution of the trope, but I'd argue it is still, in general, a good quality for writing to have: that the plot advances mostly based on details that were presented to the reader, or that the reader could possibly anticipate.
Oh I agree with it being a good thing. I've been a fan of the harder end of the SF spectrum since I was young, and I believe it's a really valuable genre in many ways, but suggesting that Yudkowsky is responsible for "kickstarting" a genre that's over a hundred years old and had many proponents over that time (which, to be fair, only Yud is doing) seems representative of a lot of the way LW works.
> Given the propensity for pop culture reference within his writings he sure does miss a lot of glaringly obvious comparisons to incredibly well known Sci fi.

we're pretty sure he's never consumed any media aimed higher than middle school
If he read the original Fate/Stay Night visual novel like he claims, then he’s “technically” read something for a mature audience. Very, very technically.
Imagine looking at the career of L. Ron Hubbard and saying "You know what this guy's biggest mistake was? Writing books before he started to believe his own premises."

Given a choice between “event X will happen” or “event Y will happen”, someone who knows nothing about the matter at hand can still make a correct prediction 50% of the time. Yudkowsky’s original reasoning about why a single correct prediction demonstrates his genius was always faulty.

surely you’re not proposing that Rationalists should start thinking in terms of Bayesian statistics

It is genuinely incredible to me that there is a religious movement that is based in part on an incorrect understanding of basic math and, moreover, that some of its adherents even have a formal education in said math.
If someone's going to go to the trouble of making a math cult, they ought to do it as hard as possible. Like, go full *Laundry Files* and start a cult where people believe that summoning horrific things from beyond space-time is an NP-hard problem, and that P=NP if only you have enough faith.
What is the incorrect understanding you're referring to?
The TLDR is that they all believe that Bayes' theorem, in and of itself, is a magic formula that always provides the best possible way of drawing inferences based on available information. This is not true. The precise details of their misunderstandings vary based on the person, but there are two mistakes that are really common.

Maybe the most common mistake that they make is misunderstanding how Bayesian priors work. They all seem to believe that literally any state of knowledge whatsoever is an acceptable prior, and so they basically just make stuff up based on their feelings. What they don't understand is that a useful prior takes one of two forms: it is actual data derived from clearly defined, repeatable experiments, or it bears a mathematical relationship to the expected form of the posterior such that it makes computations easy to perform.

The second most common mistake is that they never actually do any computations. Bayes' theorem can be useful, but only if you actually do the math. Most of the Rationalists don't know how to do the math, though, so again they just default to making stuff up based on their feelings. And because they don't know how to do the math, they also don't realize that Bayes' theorem is often inappropriate to use in actual practice because it makes computation too difficult without any added benefit over alternatives.

Edit: actually, I'll give you a bonus third mistake, 'cause it's a big one. They also have no idea how to contextualize the results of Bayesian inference. They assume that using Bayes' theorem always yields actionable insights, when instead the correct conclusion often takes the form of "we clearly don't know enough to draw any useful conclusions about the matter at hand".
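To make "actually do the math" concrete, here is a minimal sketch of what doing the computation actually looks like: the textbook diagnostic-test update, with numbers invented purely for illustration.

```python
# Textbook Bayes update: P(disease | positive test).
# All numbers below are invented for illustration.
prior = 0.01           # P(disease): base rate in the population
sensitivity = 0.95     # P(positive | disease)
false_positive = 0.05  # P(positive | no disease)

# Law of total probability: overall chance of a positive test.
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Bayes' theorem: posterior = likelihood * prior / evidence.
posterior = sensitivity * prior / p_positive
print(f"P(disease | positive) = {posterior:.3f}")  # ~0.161
```

The posterior (about 16%) is nowhere near the 95% that gut intuition suggests, and surprises like that are the whole payoff of the theorem; you only get them by running the numbers.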
This is weird, maybe I just never got enough into their writings to see these issues, but as someone who learned probability theory several times at different levels, I remember finding the explanation of Bayes theorem at LW to be really clarifying and accessible. What's the point of liking it so much if you're not going to actually use it? Do you mean they say that data points a certain direction for some specific issue, then never actually show a real calculation? I only read part of the sequences, some of them are well written and concise, but at a certain point they become a bit repetitive or bland, and I lost interest.
It's even less sophisticated than that; their appeals to Bayes' theorem often take the form of saying things like "my priors are X, and so I conclude Y", or things like "I'm updating my priors based on this new information". You can see them using this kind of language very frequently in online spaces. They claim to be using Bayes' theorem as a cognitive tool in order to make decisions in everyday life, because they think that this is the most rational approach for making decisions. In actuality they're just reframing ordinary conversation in terms of misappropriated technical jargon.

It's a good question: what's the point of learning math if you're not going to do it right? There's a lot going on here, but I think that this is all fundamentally driven by the same emotional impulses that motivate any other religious movement. What they're seeking, above all else, is a feeling of certainty and control over their lives. A mathematical formula for understanding anything in the universe would obviously be very appealing for someone who feels that way, and through the power of motivated reasoning the Rationalists have convinced themselves that Bayes' theorem is just such a formula.

Last year Yud posted a big rant about how he had failed to stop AI and nobody else was smart enough to take over, and there were a lot of comments talking about how they had a lot less respect for him these days. Maybe people are looking for more reasons to doubt.

these are the ones who label themselves "post-rationalists", i.e. they realise the Sequences are embarrassing but still want to read EY's Harry Potter fanfic and do wordy racism with likeminded intellects

Much as crackpot physicists inevitably turn their attention to Einsteinian relativity or quantum superposition, crackpot economists are inevitably drawn to Japanese monetary policy

This is the first I’ve heard of that, anyone know why it is?

I might be exaggerating a bit there, but Japan is fertile ground for, ahem, contrarian economics takes because its economy is kind of eccentric. People especially talk a lot about their [lost decade](https://en.wikipedia.org/wiki/Lost_Decades) in the 90s. Uninformed physics discussions often involve someone piping in with "ah, but in quantum mechanics...", and it's never surprising when uninformed economics discussions veer into "ah, but what about Japan...". In both cases it's probably better to just leave things at "it's complicated and I don't really understand it".
> Uninformed physics discussions often involve someone piping in with "ah, but in quantum mechanics..."

See also EY on quantum mechanics.
lots of crackpots *are* weebs...

I hadn’t heard of Inadequate Equilibria until today, but seeing as how it is endorsed by noted economist Bryan Caplan, a.k.a. the man who thinks sexism is ok so long as it is good business, I can’t wait to pick it up.

Imagine the Sequences without the interesting bits

Eliezer is driving an intellectual racecar when many are driving intellectual horse-and-buggies. Still needs to be vacuumed out from time to time though.

(From a comment on the LW post)

Does this come from genuine admiration, or from a desire to display obedience/adherence to the organization?

I think it makes these individuals feel smart and they desire the appearance of intelligence above anything of real substance. I see this across the rationalist and IDW spaces. They don’t have to actually understand, their allegiance is enough to establish their superhuman intelligence.

[deleted]

~~That's right, but I didn't bother to go into those details because it honestly looks even worse for Yudkowsky when outcomes are not equidistributed. 50/50 odds give the *minimum* mutual information between an uninformed predictor and actual outcomes. If events are unevenly distributed then Yudkowsky, our uninformed predictor in this case, should be right or wrong more often than 50% of the time, which means being correct in a single instance becomes even less impressive than it already is.~~

N.B.: for those who aren't familiar with the information-theoretic perspective, being wrong about a binary prediction 100% of the time is just as impressive as being right about it 100% of the time, because you necessarily need to know the right answer in order to always give the wrong one (for certain specific technical definitions of the word "know").

EDIT: the struck-out text above is incorrect. I leave it there as a warning for posterity: don't talk about math without actually doing the math! Otherwise you might end up like a rationalist, muttering nonsense about Bayes' theorem while confidently making easily-avoided mistakes. Or you might end up like me, posting embarrassingly wrong things about basic math on the internet.
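For anyone who wants to check the N.B. rather than take it on faith, here is a quick illustrative sketch (mine, not the commenter's) computing the mutual information between a binary predictor and the actual outcome; an always-wrong predictor carries exactly as much information as an always-right one.

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits, for a joint distribution {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# Outcome is 1 with probability 0.7. Joint distributions over
# (prediction, outcome) for two deterministic predictors:
always_right = {(1, 1): 0.7, (0, 0): 0.3}  # predicts the outcome
always_wrong = {(0, 1): 0.7, (1, 0): 0.3}  # predicts its negation

print(mutual_information(always_right))  # ~0.881 bits
print(mutual_information(always_wrong))  # ~0.881 bits, identical
```

Flipping every prediction just relabels the predictor's outputs, which leaves the mutual information unchanged; that is the precise sense in which a perfectly wrong predictor "knows" the answer.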
Suppose it's A with probability 5% and B with the other 95%. You choose one with probability 50% and then roll those loaded dice. Simple exercise: what is the probability you picked the right one?
I don't see why? No matter what the weather is, if you guess uniformly randomly then you will get it right 50% of the time
If there's only two options, sure
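For what it's worth, the reply above has it right, and a short simulation (using the hypothetical 5/95 split from the exercise) shows why the base rate doesn't matter:

```python
import random

trials = 100_000
hits = 0
for _ in range(trials):
    guess_is_a = random.random() < 0.5     # uniform random guess between A and B
    outcome_is_a = random.random() < 0.05  # A actually happens 5% of the time
    hits += guess_is_a == outcome_is_a     # correct iff guess matches outcome
print(hits / trials)  # ~0.5
```

Exactly as claimed: P(correct) = 0.5 × 0.05 + 0.5 × 0.95 = 0.5, so a uniform guesser is right half the time no matter how lopsided the two outcomes are.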

Upon receiving even mild pushback, the author backtracks and reaffirms the strength of his faith

I’m not backtracking. I think the flaw was significant and noteworthy, and undermined part of his thesis. However, it’s also important to recognize that this one error does not render the entire book meritless.

Thanks for pointing out a "significant and noteworthy flaw," at any rate.
It seems like this kind of error should be pretty damning. Isn't the book fundamentally about how to reason and interpret evidence? How can someone who has an authoritative understanding of reasoning and evidence interpretation make such an obvious mistake when doing reasoning and evidence interpretation? Maybe the entire rest of the book consists of flawless intellectual gems, but, given Yudkowsky's penchant for overconfidently making incorrect assertions about subjects that he doesn't understand, that doesn't seem like it's the most plausible theory of the case.