r/SneerClub archives

https://web.archive.org/web/20220202193959/https://astralcodexten.substack.com/p/why-do-i-suck

I think it passed by without remark a few months ago, but Scott says out loud something that, though obvious, really does put the pin in the bubble of his “rationality”.

> Second: I hate conforming. Hate hate hate it. As Mencken said, “it’s not worth an intelligent person’s time to be in the majority, by definition there are already enough people to do that.” Expressing a majority viewpoint feels like punching down, or like kicking an underdog. I’ll do it if I have to, because you should still defend the truth even when it’s popular, but I don’t enjoy it.

Now, Yogi Berra famously (and apocryphally) summed up the iconoclastic misanthropic streak in all of us with the oft-quoted “Nobody goes there anymore. It’s too crowded.” It’s hardly unique to Mr. Codex to have this streak, though it tends to be quite a bit stronger in anyone who, like myself, is neurodiverse in a way that makes navigating the social landscape a mystifying chore.

But to celebrate it as a valid reason to form an opinion, or to avoid arguing in favor of something, is quite the failure for someone who professes to want to believe more true things and fewer false ones. You’d think that he might rethink whether he actually believes what he thinks he does, but …

> I realize it’s self-serving to write a post on why you suck and transition to “maybe I’m just too good for everyone”.

Ah Scott, you’ve probably always deeply wanted to believe this. You may have spent your whole adult life trying to convince yourself of it. One both despises the crowd and craves its adulation.

one weird trick: All scott has to do is convince himself that his position is marginalized and beleaguered, even if it isn’t. Then he gets the benefit of the underdog morale buff, even while arguing on behalf of entrenched power structures. next level power gaming strat

> next level power gaming strat

I mean, I guess it was next-level power gaming in the time of Nixon. Now it is something more akin to "do you seriously expect people to fall for this? ... Wait, it is working?! Again?!"
As Jon Bois once said, every Goliath wants to be David
This would be way better if David Boies said it

This is one of the best sneers I’ve seen on here recently. I know the tendency and purpose of the sub is to, well, sneer but IMO the best critiques of online rationalism here are pointing out when rationalists brazenly fail by their own self-proclaimed standards. Scott’s transparent straw-manning during the recent post on “Justice” was a classic case of not noticing “skulls” too.

> pointing out when rationalists brazenly fail by their own self-proclaimed standards

As /u/finfinfin once pointed out, the highest form of the sneer is a simple quote.
> The pure examples of Sneer are unintentional; they are dead serious. The Rationalist blogger who debates an imaginary feminist with bad arguments is not kidding, nor is he trying to be charming. He is saying, in all earnestness: Voilà! the Steelman!
>
> ...
>
> In naïve, or pure, Sneer, the essential element is seriousness, a seriousness that fails. Of course, not all seriousness that fails can be redeemed as Sneer. Only that which has the proper mixture of the exaggerated, the fantastic, the passionate, and the naïve.

\- Sontag, Notes on Sneer

> First: in basically every other way, I am an extremely unfashionable person. But in this case, somehow I ended up near the top of the barberpole model of fashion. I felt like all my friends were social justice warriors, back when other people described barely knowing one or two. So I got annoyed with them early and rebelled against them early.

Lol wtf. What kind of grown-ass man acts all rebellious and contrarian towards his friends? It never occurred to you to extend some actual charity and empathy to your friends to find out why they think the way they do? They’re your friends. This is straight-up teenager bullshit at best.

> And that means my natural I-hate-saying-whatever-the-majority-says kicks in whenever I’m tempted to criticize wokeness. I could write about something something critical race theory in school.

You are literally a Republican. Only braindead Republicans think the creeping conspiracy of critical race theory in schools is even a thing. Did it even occur to you to actually check whether this was true? Because it isn’t. If you extended the concept of rationality to actually looking up facts, you’d know that.

Who’s woke anymore? Are there really still woke people? Other than all corporations, every government agency, and all media properties, I mean.

excuse me what

It's like watching someone's brain atrophy from conservative propaganda in real time.

[deleted]

Can I quote you on that?
You can quote me all you like, just don’t tell anybody
90% of misquoting is half-mental.
It is fun to see y'all are Yogi Berra fans.
"I never said an apocryphal word in my damn life" - Yogi Berra
“I really didn’t say everything I said” - Yogi Berra (actually did say that)

Can anybody recommend any good skeptical writing on the real undercurrents of rationalism, like this Aeon essay about “longtermism”? https://aeon.co/essays/why-longtermism-is-the-worlds-most-dangerous-secular-credo

I was kind of shocked when I read that, because I don’t think it even mentions Scott, but he has capped off multiple essays with that exact crazy hypothetical of “humanity living for a trillion zillion years as an intergalactic transhuman species.” I thought it was just a slightly zany exercise in following a line of logic to its trippiest conclusions, but realizing that’s a foundational belief of “Rationalists” is really something else.

I can't give you examples of longer-form, more serious, skeptical writings on capital-R Rationalism, but the specific undercurrents you mention basically come straight from Less Wrong in general, Eliezer Yudkowsky's Sequences more specifically, and the emphasis on Bayes' theorem as a guidepost for every decision. Once you go down the path of attempting to put a number on every decision, using estimates of probabilities as well as estimates of benefits, then very, very large benefit numbers immediately pop out as the "best" outcomes, no matter how unlikely or how far away. Yudkowsky (I believe) even came up with his own name for, and thought experiment about, this problem: "Pascal's Mugging." And yet, despite understanding this issue, they remain undaunted in favoring any possible outcome that can be described with an essentially infinitely large number representing "good."

As to *why* this appeals so much, see that previous point about misanthropic iconoclasm. I'd put my money on that. They love the idea of people writ large, but not so much the actual ones.
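The failure mode described above is easy to see in a few lines. This is only a toy sketch (the function name and numbers are made up for illustration, not anyone's actual decision procedure): under naive expected-value reasoning, probability times payoff, an astronomically large payoff dominates no matter how tiny its probability.

```python
# Toy illustration of the "Pascal's Mugging" failure mode: naive expected
# value is just probability * payoff, so a fantastical payoff "wins" even
# at a vanishingly small probability. Numbers here are invented.

def expected_value(probability: float, payoff: float) -> float:
    """Naive expected value: no discounting, no bounded utility."""
    return probability * payoff

# A mundane, near-certain good outcome...
mundane = expected_value(0.99, 1_000)

# ...versus a wildly implausible "trillion zillion years of utopia" outcome.
mugging = expected_value(1e-12, 1e30)

# The implausible outcome dominates by many orders of magnitude.
assert mugging > mundane
```

Any fix (bounded utilities, leverage penalties, just refusing to multiply) has to be bolted on afterward; the bare arithmetic always favors the mugger.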
Thanks for the insight. I guess I just wish there were some place that clearly aggregated and plainly explained the controversies about the more insidious aspects of Less Wrong / "Rationalism" for a layperson who hasn't been steeped in these communities for years. All of the critical perspectives seem buried in old comment threads and posts, and it's presented in such an esoteric, inside-baseball way. Even the New York Times article was pretty vague and didn't even touch on the creepiest stuff. I literally would have no idea on a basic level what this sub was even about if I hadn't been casually reading SSC for a few years. I'm just now discovering the Kathy Forth stuff and it's just been disorienting.
RationalWiki does a fair bit of this. Still some jargon but aimed at a more lay audience.
*Neoreaction a Basilisk: Essays on and around the alt-right* by Elizabeth Sandifer is quite good. It's only partly about rationalism proper (discussions of Yudkowsky's early works, mostly), spending most of its time on the more radical descendants of the movement (Nick Land and Moldbug in particular), but it's still the best long-form academic critique of this mindset I'm aware of.
Thanks, exactly the kind of thing I was looking for.

> Someone less into machine-learning metaphors and more into leftism than I am (20-year-old me could easily have gone down that road!) might say-

Ah yes, machine learning metaphors. Such a deep well of wisdom. Computers have so much to teach us

Paul Cockshott tried to solve everything with computers and linear algebra before it was cool.
matrix multiplication and socialism are incompatible actually

As a ‘I’m contrarian so I can feel smugly superior’ contrarian, I have to say that he is doing contrarianism in a contrarian way.

(I joke, but I did have a sort of phase like this, which, together with an interest in transhumanism etc., eventually led me to Scott, who made me see the error of my ways.)

“I hate conforming. Hate hate hate it. All you people who have agreed on the meaning of English words can go frzzlebump jha ullr weq-lurprot.”

Just wanted to non-sneeringly say that “despise the crowd but crave its adulation” is a very pretty sentence.

This bit has been coming to mind a lot recently:

https://youtu.be/o2AMojXb1lc

HL Mencken

JF Christ