r/SneerClub archives
A new recruit to the phyg trepidatiously posts the same essay that's run occasionally on LessWrong for the last ten years: why are all you people so bad at things in the real world? (https://www.greaterwrong.com/posts/bRGbdG58cJ8RGjS5G/no-really-why-aren-t-rationalists-winning)

His answer is to imitate actually-successful people’s self-improvement programmes and gain actual success. This has historically led to LWers realising LW is pants and leaving. Perhaps he will leave a goodbye message when he does so.

Comments are amusing handwringing over the community’s collective ineptitude.

and oh god:

The winningest rationalist I know of is Dominic Cummings, who was the lead strategist behind the Brexit pro-leave movement. While the majority of LWers may not agree with his goals, he did seem to be effective, and he frequently makes references to rationalist concepts (including IIRC some references to the work of Eliezer Yudkowsky) on his blog: https://dominiccummings.com/

[deleted]
[Also a Scott fan](https://dominiccummings.com/2018/09/11/29-on-the-referendum-4c-on-expertise-on-the-arpa-parc-dream-machine-science-funding-high-performance-and-uk-national-strategy/), you'll be utterly unshocked to find.
> self-publishing over a hundred pages of self-invented jargon and poorly understood stats

Clearly a rationalist, checks out
A master of instrumental, as opposed to epistemic, rationality. He not only *did* something, he abandoned the truth to do it. He figured out that people would vote for Brexit if it was for a "nice" reason, and so he came up with the lie about the £350 million for the NHS.
that and a few million quid overspend and
> He not only did something, he abandoned the truth to do it.

My issue with the bloody Kantians and the bloody virtue people is that they're always wrong for the right reasons. Just using instrumental rationality is often bad because it's a failure of reason, and reason is an interesting phenomenon in that it often seems to somehow get things right in spite of itself, but instrumental rationality is interesting because it often seems to be the thing that ends up getting things most right most of all in spite of itself (cf. Feyerabend). Embrace the paradox! Just stop using instrumental rationality to tell obvious lies in order to get your way. (not a personal attack, of course, since I know nothing about you or your views on Kant or virtue)
You cannot deny that helping to jeopardize the economic future of an entire nation for the sake of political power using lies and misinformation is, in fact, *something*.
> is, in fact, something.

More impactful than their AI research... for now... [laughs forever in Roko's basilisk's tongue]

Idk the vibe I get from a lot of these types is like the movie Nightcrawler; on paper their positions are clear and articulated and “correct” but there is a coldness to much of it that hints to a lack of empathy beneath the surface.

In my experience it isn’t hard to talk a rationalist into arguing for organized slaughter so long as you play their language game, at no point does some empathy shine through and they go “wait no I don’t want a lot of people to die just because it’s ‘correct’” let alone “hmm maybe I’m wrong if this is the conclusion I reached?” As long as it’s argued correctly, most users (at least that I’ve dealt with) don’t actually care about the bodies that would have to be piled up.

This low empathy is most visible when users talk about their approach to dating and gender relations, which for normal people is just an empathy game, which is why they seem completely lost at basic dating that isn’t purely transactional. Imo that’s why there is a hyper-prevalence of both contract-style polygamous and hyper-traditional daters, but few who can handle more common, more ambiguous forms of dating like hook-up culture or “seeing” somebody. Basically if the terms of the trade aren’t laid out clearly at launch, they struggle, because they have difficulty feeling out their partner’s attitude toward them.

> there is a coldness to much of it that hints to a lack of empathy beneath the surface.

Have you read any of Scott Aaronson's blog? It's hard not to take away the impression he views non-nerds as incomprehensible threats, something to be placated but never understood. I get the same vibe from him as I get from the Alt-Right "NPC meme" pics, the message of which always seems to be: "people who aren't like me aren't *real* people, they're just automatons following a dialogue tree. So they don't have any moral value and I don't have to extend them any empathy".
Idk I liked the NPC meme, it's a shame the right got ahold of it first because in the right hands it could have been a powerful critique of capitalist monoculture. Instead now we're stuck defending the fact that our culture is only allowed to sustain 3 different movies a year
The Right *invented* the NPC meme. It came straight off of the right wing fever swamps of /pol/ and /r9k/.
No they were the first to package the idea in a memeable format but you can chart the general argument back to Nietzsche's *Last Man* at least
I like you, but *man*, do I hate that fucking NPC meme. That sort of smug, misanthropic narcissism really only works for teenagers (and George Carlin, *sometimes*). Most people are doing the best they can under difficult circumstances. They need compassion, solidarity, and probably a glass of water. They don't need some dorks telling them they're subhumans.
And they also don't need the same type of dork telling them their felt alienation is just totally natural and it's a-ok that we live in a culture where people *are* functionally treated like NPCs. The delivery mechanism is shit because it came from the right, but the idea is not: alienation and mass culture are real, and together they're one of the poisons destroying the world
Sure, but if your punchline is "some/most people are soulless drones," you're not fighting that; you're joining in.
So? It amuses me greatly that people live their whole lives like that, so I'm gonna laugh when I see it.
Social Justice NPC shirts on their way, from Arkadian Dreams!!
> In my experience it isn't hard to talk a rationalist into arguing for organized slaughter so long as you play their language game, at no point does some empathy shine through and they go "wait no I don't want a lot of people to die just because it's 'correct'" let alone "hmm maybe I'm wrong if this is the conclusion I reached?" As long as it's argued correctly, most users (at least that I've dealt with) don't actually care about the bodies that would have to be piled up.

Pretty much all philosophical generalisations can be stretched until they either break or just run out of useful things to say. Most people will go "yes, obviously it breaks if you stretch it to ridiculousness," but rationalists go "THIS REMARKABLE RESULT IS IMPORTANT!!" It's like nobody ever told rationalists "don't fall in love with your model."
When they say "shut up and calculate", all I hear is "Don't check that your model makes sense in edge cases"
Do rationalists say “shut up and calculate”, though? I’ve only heard that attributed, within the rationalist community, to subscribers to the Copenhagen interpretation. Rationalists, keep in mind, tend to disagree vehemently with the Copenhagen interpretation. It’s always possible that somebody’s adopted a term from their outgroup, but I think it’s more likely that you misinterpreted a reference as an endorsement. Edit: Wait, never mind. Retracted. They do say it often in the context of morality.
yeah, the actual quote is "[shut up and multiply](https://wiki.lesswrong.com/wiki/Shut_up_and_multiply)" but it's the same idea.
> which for normal people is just an empathy game

Oh wow, what a nice definition. Just takes me back to "[males don't connect, females don't connect, and back and forth](https://thelastpsychiatrist.com/2009/10/you_want_to_be_don_draper_you.html#comment-6024)"
> Basically if the terms of the trade aren't laid out clearly at launch, they struggle, because they have difficulty feeling out their partner's attitude toward them.

In a directly opposite sense, imagine there was a whole subculture of people who couldn't rely on hundreds of years of cultural expectations to tell them what an 'acceptable' relationship looks like. And yet they succeeded through pure human-human contact. To quote a regular sneerer: ["there's a decent argument to be made that cishets lack the moral character to sucessfully do polyamory."](https://www.reddit.com/r/SneerClub/comments/9wmwfg/alicorn_is_mostly_known_for_being_responsible_for/e9mwtml/)
I really don't have anything against polygamists per se; people can date however, it's not my business. That said, I also think some people are attracted to polygamy because it allows them to avoid liability for low emotional intelligence or selfish behaviour
I can assure you that polyamorist shitheadry is very like cishet shitheadry except, somehow, just that bit worse for its failure to be better
Yeah it turns out people are shit regardless of their sexuality
As the rationalist proverb goes, "name three examples." Or even one! Or even just describe *what* argument is needed to convince most rationalists that genocide is good. This just reminds me of Yudkowsky's "trust me I definitely convinced people to let my AI out of the box, but you can't see how because reasons". (Also, ethicality, mirroring-emotions, and ability-to-read-emotions are all called empathy and I think that's lumping many different things together)
Well here's the thread that drove me from the board: https://www.reddit.com/r/slatestarcodex/comments/9sabky/culture_war_roundup_for_the_week_of_october_29/e8rl7su/?context=3

Wherein a user told me that *somebody* has to die, so the argument between fascism and not-fascism just boils down to articulating why your side shouldn't be the one to die, and that arguing for policy where we actually try not to kill people is utopian. So there's an example.
negative utilitarianism
Eh. That just weights pain *more*, not exclusively. Additionally, "my friend just died" is usually stronger than "I'm a person going about my somewhat sad life," so the genocide is likely to cause more pain than it prevents.

What I mean is - yes, some hypothetical negative utilitarians would support some form of genocide. It's just a belated anti-natalist, in a sense. But that doesn't mean that any *real* rationalists could be *persuaded* to support genocide with a negative-utilitarian argument. (Consider that most rationalists consider the most likely outcome of an Un-Friendly Artificial General Intelligence to be "we all die quickly", and additionally consider that to be *bad.*)

(And you have to think about ratios. Could some hypothetical non-(negative utilitarians) be in favor of genocide? Certainly. Is a random hypothetical negative utilitarian *much more likely than average* to support genocide? Maybe, but far from obvious.)

TL;DR: "in my experience it isn't hard to talk a rationalist into arguing for organized slaughter" implies examples of *persuading rationalists*, not examples of *hypothetical rationalists*; your example is too weak.
> But that doesn't mean that any real rationalists could be persuaded to support genocide with a negative-utilitarian argument.

Have you talked to like 40% of the LessWronger-EAs in continental Europe?
That's a separate argument. My point is that "a hypothetical negative utilitarian supports some form of genocide" doesn't logically imply "a significant number of rationalists, starting from a non-genocide-supporting position, can be persuaded to support a realistic form of genocide, only by a human presenting logical arguments to them."

This comment is the rationalist version of “my girlfriend who lives in Canada.” And yet the most successful person you can name is someone who invested in bitcoin early and had the foresight not to sell it too soon because … checks notes … they died. But they would have been successful, definitely, and maybe a senior government official too. They would have gone to Canada in the summers to hang out at their bitcoin mansion.