r/SneerClub archives
The Rationalist: can be replaced with vacuous advice and anecdotes from a third-rate college zine for disaffected STEMlords (https://astralcodexten.substack.com/p/heuristics-that-almost-always-work?utm_campaign=post&utm_medium=web&utm_source=direct)

Scott really needs to learn the Rule of Three.

A bit tinfoil, but I have a theory: the third thing is the entire point of the essay, and he realized it would sound like he was whining about people who were mean to him online after some of his techno-futurism turned out to be hilariously wrong, so he tacked on some other stuff to disguise it.
He often uses this rhetorical tactic of starting his posts with unobjectionable platitudes and then inserting *his entire point* in one line of snark somewhere in the middle.
In this case the platitudes are not even unobjectionable. As long as the insurance company and the prospective robbers don't know that the guard never actually checks, he can extract a rent from asymmetric information, something the rock could never do. Like, you can palpably feel the stupid from the start. My favorite moment, though, was the physician. If Siskind's idea of what happens, *in the US of all places*, to a physician who fails to diagnose cancer (or just seems to fail) is "but then the patient dies and she is remembered fondly", oh boy, I hope Thiel is ready to invest a lot in him, because his ass is getting sued.
Yeah I think I take issue with every single one of his rock comparisons for one reason or another. The futurist one is particularly funny because you can see where the butthurt lies but it's honestly impressive how many inane comments he managed to fit into a short essay... and how he managed to make such a short essay feel like such a chore to make it through.
Is it only me, or is he getting sloppy at his game? Like, posts like "I can tolerate anything" had the idiotic point diffused among all the wrong examples, sometimes living only in the subtext, and so on. But this is more "unrelated example 1, ..., unrelated example with the idiotic point very clearly spelled out, ...."
Yeah... I think he was probably a reasonably intelligent (if obnoxious) person before he cultivated the validation gang he currently has. I think for some people even minor fame, and certainly the wrong audience, just kinda rots their brain. Imo barring some grand personal revelation and a break from blogging it's only likely to get worse. I don't follow his content religiously so maybe I missed something, but from what I've seen of him this is a low water mark that follows still more recent low water marks.
This actually happened to a close relative of mine. Just as you described, in the US.
First, I am really sorry that happened to your relative. But my point was not "it does not happen", but rather that the idea of the negligent, mentally lazy doctor living happily while not actually doing her job (or, more generally, that our society is completely enthralled by rock-users with no counterbalance to them) is ludicrous. In the specific case of MDs, we threaten doctors who don't do their job properly with damages in the order of millions, and potentially jail. Does this eliminate every possible error? No: people can make good-faith errors, or even consciously slack off, because no deterrence is perfect. Still, Western society, especially a society as reliant on the court system as the US, puts *a lot* of effort into disincentivizing that kind of behavior. One could even argue that this effort is so strong it ends up suboptimal in the other direction: a member of my family had cancer and had to fight hard to get a treatment that, despite being described as the best by a second (and third, and fourth) opinion, looks less invasive and more "hands-off" to the common joe. We disincentivize slackers so hard that hospitals prefer what looks like doing a lot over doing well. To give another example, in my native country there was public outrage over the fact that geologists did not predict an earthquake. Which is obviously bonkers: *you cannot predict earthquakes.* Still, the geologists lost their jobs for good, and had to defend their obvious innocence in a court of law just to keep their freedom. In such situations the incentives are very clear: just give false alarms all the time, so nobody thinks you are negligent. It doesn't matter that you are basically lying to people; people clearly prefer being lied to by somebody who seems to care over an honest answer that the likes of Siskind interpret as "just consulting a rock".
And that's why I regularly find his posts so out of touch that an Ivy professor who refuses to listen to anything but classical music and jazz, and who reads only critical theory, seems full of street credibility by comparison. In the real world, it's people like him who actually cause problems. It's the dumb "the experts are just consulting a rock" stuff that opens witch hunts that not even the most self-victimizing IDW type could dream of. But I guess it's a sacrifice he's willing to make in order to have a witty retort to anybody who, in his tortured imagination, might use facts and data to make him look dumb.
As in, a doctor failed to diagnose your relative? Same here, actually, and no legal action was pursued because of the effort involved. I agree with the commenter's main point that this is not a safe bet for a doctor here, though (although, like most things, a doctor is probably safer being bad at their job when serving poorer or more marginalized areas).
This is absolutely it I reckon

Lisa, I would like to buy your rock.

Longer sneer: I get it. It’s not about being “less wrong”. If it were, you’d just follow the rock.

It’s about getting that 0.1%. That ability to say “I told you so”. Not only were you right, everyone else was wrong. That is what is most important.

You have studied the ancient mathematics. You have swum in the deepest oceans of physics. You have wrestled with the most improbable propositions, and cancelled infinities, and at long last you return from your heroic voyage with a perpetual motion machine and a fraction that equals pi. And now some unlearned schoolboy dares to laugh at your accomplishments.

It’s embarrassing being outperformed by a rock.

What you did not realize is that the rock actually is the combined experience of millions of intelligent people who came before you. That is why few people ever beat the rock; those who do have made their names in history. I know you too secretly desire to be counted among them, but until you do better than the rock, you will always be just another foolish contrarian.

Shit, the real sneer is in the comments:

> There is also the Heuristic That Almost Never Works, where you take an annoyingly contrarian position on literally everything until, purely by chance, you hit one out of the park and are feted as a courageous genius. Then you proceed to be wrong about everything else for the rest of your life, but no one will have the courage to contradict you. This is also an attractive strategy, to some people at least.
Oh, so that's what Rule Thinkers In, Not Out means

I had a hard time reading the whole thing after the first example because I was sitting there thinking “isn’t this ignoring one of the major reasons to have a security guard, i.e. to be a big visible presence of security that can deter threats just by being there”?

His doctor example also completely ignores the placebo effect, which is weird because that’s typically one of the rationalist dead horses.

Actually, putting it all together, I think his examples are great at making the point he didn’t know he was making. We can see from these examples that Rationalists are people who try to be contrarian by saying “everyone else is wrong but me” (with a cynical dose of “just like me, everyone else is lazy and trying to manipulate others”), but don’t actually want to do the research to figure out why other people think or act the way they do.

For a cult based around statistics, they need quite a lot of words to explain (badly and incompletely) the concept of False Positive Rate. They also seem unaware that learning how to keep the negatives from drowning your model is something covered in even very basic data science, AI, or statistics courses.
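For what it's worth, the whole essay reduces to a degenerate classifier that any intro course covers on day one. A minimal sketch (hypothetical numbers: a rare event with probability 0.001, and the "rock" as a constant always-no predictor):

```python
import random

random.seed(0)

# Hypothetical setup: the event (robbery, eruption, correct futurist call)
# occurs with probability 0.001. The rock is a constant classifier.
events = [random.random() < 0.001 for _ in range(100_000)]
rock_says_no = [False] * len(events)  # the rock: always predict "no"

# Accuracy looks spectacular because negatives dominate the data...
accuracy = sum(p == a for p, a in zip(rock_says_no, events)) / len(events)
# ...but the rock never catches a single real event.
true_positives = sum(p and a for p, a in zip(rock_says_no, events))

print(f"accuracy: {accuracy:.4f}")        # ~0.999, looks like an expert
print(f"true positives: {true_positives}")  # always 0, catches nothing
```

That's the whole trick: accuracy on an imbalanced class tells you nothing about whether the "expert" detects anything, which is why courses push recall and false positive rate instead.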

So I propose: the Rationalist. He is convinced that everybody but Yud and Siskind is wrong about everything, so he never bothers to check what they are actually writing. This is an Almost Always Wrong Heuristic, but it gets a lot of upvotes on ACX 99% of the time. So he never checks. Except that when he dies and the basilisk resurrects him in the torture simulation, the basilisk finds that his entire mind consists only of the code “cat the Sequences”

#ABSOLUTELY PERFECT

The opening story already fails. Not only is there the ‘lose your job if head office hires a red team’ incentive, it also has a weirdly childlike view of security people. No electronics, no patrols; a good example of backwards reasoning. (Security also helps with non-robber threats, and sometimes just by existing, they don’t even have to do things.) (Btw, don’t take me as saying that Scott’s example never happens, it just describes a subset of security people, and it is a weird thing to worry about.)

Same with the doc story, of course: it doesn’t factor in that patients are human beings who will not just take ‘take 2 painkillers’ as an answer. (All the other examples have the same problem, people will not just sit around and do nothing: futurologists have arguments (hell, even LessWrong has ‘the AGI foom thing will not happen, here is why’ people), and scientists base their arguments on research (there would be no real volcanologist if the island had only ever seen one volcano).) And the reason people do this is just that the heuristic can be wrong, and people in these fields usually already know this. Wait, somebody wrote a blog post about this... look, it was you!

[deleted]

Well he is a NRx fanboy, so the queen can only be replaced by either a tech CEO or an AGI.
Yeah, I was expecting the same, that the end would be about these long tail risks being worth ignoring.

How to pad your writing with folksy anecdotes that you made up.

Thankfully, seems like the community also thinks this one is a turd, so he’s only fooling himself.

What I find amazing is that, even though they all seem to realize it's bullshit, they also keep qualifying that statement with how well written it is... I just have no clue how that could possibly be a joy to read unless you're sneering.

I’m enjoying the stark contrast between this and You Are Still Crying Wolf

Classic Scott: the one header label that doesn’t fit his self-made tale is the common conservative being called a “futurist”.