r/SneerClub archives
A rationalist wonders: Where are all the successful rationalists? (https://applieddivinitystudies.com/2020/09/05/rationality-winning/)

Talking to rationalists is not much better, since it feels less like a free exchange of ideas, and more like an exchange of “have you read post?”

Because rationalism has taught y’all such amazing independent cognitive skills.

Yeah, when the second sentence is “you HAVE to read the sequences so we can talk properly” and it turns out it’s tens of thousands, maybe hundreds of thousands, of words which are obtuse and boring... Yeah. That’s quite the shared language there.
After some time it dawned on me that anyone applying the skills discussed in the sequences should put down the sequences well before they're done reading them.

Before reading A Human’s Guide to Words and The Categories Were Made For Man, I went around thinking “oh god, no one is using language coherently, and I seem to be the only one seeing it, but I cannot even express my horror in a comprehensible way.” This felt like a hellish combination of being trapped in an illusion, questioning my own sanity, and simultaneously being unable to scream. For years, I wondered if I was just uniquely broken, and living in a reality that no one else seemed to see or understand.

The mind boggles at the fact that this guy was driven to question his own sanity by the idea of “the way most people prescriptively talk about language doesn’t really have any scientific or logical backing” yet somehow never did enough research on it to encounter any linguistics writing, where this is pretty much the mainstream viewpoint. And then he thinks the rationalists are the only ones who talk about this idea? Another example of “the new stuff is not good / the good stuff is not new,” I guess.

“We’re the first to discover and talk about X so long as you completely ignore that centuries old academic tradition that is completely dedicated to researching and discussing it” is very on brand for rationalists.
Also note the telegraphed allusion to Harlan Ellison, a great sci-fi writer no doubt, but just such a stereotypical reference for someone in rationalism to make without knowing anything about linguistics.
Language games and beetle in a box would probably make their heads explode.

[deleted]

> Why waste time say lot word when few word do trick? [Hear!]^2
H E A R E A A E R A E H
Heareaareah is no joke. Kids, if you or someone you know is listening to the Sam Harris podcast on the shitter, make sure you tell a trusted adult, write to your local representative, or throw a brick through a plate glass window.
Not very empathetic of you!
[deleted]
Ok then!
[deleted]
And you, my friend!

Whatever criticism one makes of rationalists, they have likely made of themselves many times over.

Lol

It’s solely because weird blogs on the internet make me feel less alone.

:(

E: Perhaps you would feel less weirdly alone if you didn't personally identify with tech billionaires and introduced some class consciousness into your life. (It is interesting just how blind these analyses are to class.)

As if LWers were capable of criticizing themselves.

Hey, what about Dominic Cummings? He was very successful at getting tens of thousands of people killed!

Hey, I was trying to drink my coffee, not spit it all over my keyboard
Your mistake for drinking and reading this sub.
What?? I thought rationalists invented the idea of wearing masks and saved millions!

it was a spelling error - rationality is systematised whining

the author also posted another take in which they wonder whether effective altruism and rationalism never became popular because the inclination toward them is innate:

One last piece of anecdotal evidence: Despite repeated attempts, I have never been able to “convert” anyone to effective altruism. Not even close. I’ve gotten friends to agree with me on every subpoint, but still fail to sell them on the concept as a whole. These are precisely the kinds of nerdy and compassionate people you might expect to be interested, but they just aren’t. [5]

In comparison, I remember my own experience taking to effective altruism the way a fish takes to water. When I first read Peter Singer, I thought “yes, obviously we should save the drowning child.” When I heard about existential risk, I thought “yes, obviously we should be concerned about the far future”. This didn’t take slogging through hours of blog posts or books, it just made sense. [6]

Some people don’t seem to have that reaction at all, and I don’t think it’s a failure of empathy or cognitive ability. Somehow it just doesn’t take.

While there does seem to be something missing, I can’t express what it is. When I say “innate”, I don’t mean it’s true from birth. It could be the result of a specific formative moment, or an eclectic series of life experiences. Or some combination of all of the above.

Fortunately, we can at least start to figure this out through recollection and introspection. If you consider yourself an effective altruist, a rationalist or anything adjacent, please email me about your own experience. Did Yudkowsky convert you? Was reading LessWrong a grand revelation? Was the real rationalism deep inside of you all along? I want to know.

Yes, obviously we should save the drowning child. Yes, obviously we should be concerned about the far future (although we probably won't have a far future if we don't tackle climate change.) Somehow A+B just doesn't add up to "Yes, obviously we should give our money to preventing a dubiously realistic science fiction dystopia, even though it's not at all obvious that any currently existing organization can or will do much to prevent a dubiously realistic science fiction dystopia."
Yeah, I think the missing piece is that... people deep into these causes look like suckers. “Give all my money to x-risk orgs, specifically MIRI, which you work for, all your friends work for, and you live in a group house full of employees.” The cult factor keeps people away. Plus the fact that the deliverables are so obtuse... it seems like a poor use of money.
It's the Sea Org, but without boats
And without the 'cool' compound in the hills above LA. Without the celebrities. To quote a certain movie, "say what you will about Scientology, at least it's an ethos" https://www.youtube.com/watch?v=J41iFYO0NQA
Yeah, apart from the fact that a lot of these things are grifts, most people are concerned about raising their families, spending time with their friends, and pursuing decent careers. If what you're saying doesn't connect to that, you're not going to win people's ear. TBH it's a problem I've sometimes faced when trying to talk about left-wing politics, but at least left-wing ideas can be phrased in ways that make it apparent how they relate to people's families and jobs and houses and otherwise everyday lives.
Also the polyamory, LSD, IQ fetishism and general elitism is offputting
I love half of those things tbh, although probably not the way they do them. Can't imagine how someone who values pure rationality over and above anything else would interact with complex relationships or entheogenic drugs.
> Some people don’t seem to have that reaction at all, and I don’t think it’s a failure of empathy or cognitive ability. Somehow it just doesn’t take.

It kinda sounds like this nerd spent so much time on "x is harmful" and "x will not solve itself" that he forgot to address the "my proposal will actually solve/mitigate x" part. I'm not exactly aware of the Rationalist movement having a track record of solving these or similar problems, so no wonder it's difficult to get anybody to open their wallets.

I’ve never really understood how “rationalism” in the Yudkowskian sense is supposed to be all that different from any other ideas in the vicinity of “applying modes of critical thinking loosely inspired by science to your daily life and learning”, like Carl Sagan’s baloney detection kit or the skeptic’s toolbox or this Neil DeGrasse Tyson MasterClass. I haven’t read the full “Sequences” but I’ve seen various posts where Yudkowsky seems to claim there’s something radically novel about his brand of rationalism, like his posts on “Bayescraft” being superior to science here or the similar post here, or his post on rationalism as some kind of highly-trained mental martial art here complete with Zen-like lessons about rationalist virtues here.

Are there any ex-rationalists (or dedicated sneerers) who have actually read the full Sequences and can sum up what Yudkowsky thinks is so new about his approach? Is it really just all the emphasis on Bayes’ theorem (which non-savants can’t really hope to calculate in their heads, so it amounts to little more than an analogical argument for having your credences be in grayscale and trying to be fair-minded about updating them when you get new evidence, as summed up by Julia Galef here) or is there more to it from his perspective?

I read all of the sequences - rather, I read all of the sequences that were called that when I downloaded them in December of 2011. I went back and looked at the [current list](https://www.lesswrong.com/tag/original-sequences) and I don't recognize some of them and seem to remember some that aren't there. I'm pretty sure those fourteen(!) blogposts on his coming of age(!) were not there and I know for certain that [this thing](https://www.lesswrong.com/s/oi873FWi6pHWxswSa) *definitely* was.

The majority of what I remember, what stuck with me, is actually more or less the skeptic's toolbox, or what you can find (as I did) in an intro cogpsych class. There are two questions here: what does EY actually bring to the table, and what does EY think he brings to the table.

First: they pair the Sagan/Dennett/updated Bacon/NDT/James Randi stuff with some understanding of the many-worlds interpretation, whatever is going on with Bayes - which I think you summed up accurately - and what they call [timeless decision theory](https://www.lesswrong.com/tag/timeless-decision-theory), which may be the only thing Yudkowsky has actually originated. Make what you will of [who published that paper and who cites it](https://scholar.google.com/scholar?cites=16667996803055062506). They also talk a lot about what an artificial intelligence might think ethics looks like and/or needs to be taught to be good, which totally isn't the same thing as a fridge logic discussion on an Asimov fanfic's TV Tropes page. (That's what MIRI is for, making sure that future AI is good, while [meanwhile](https://www.technologyreview.com/2020/07/17/1005396/predictive-policing-algorithms-racist-dismantled-machine-learning-bias-criminal-justice/)...)

As for what he *thinks* he brings to the table, it's...well, it sounds stupid when I say it. He thinks he's a messiah and that people who think there's something off with him are pathologically dangerous (that's where our subreddit name comes from) or are evil. He is Godric Gryffindor from his own fanfic (above whom there is nothing), he is the life-giving hero with Prometheus's fire and/or a Phoenix from his own fanfic, etc. He is here to save your life and let you celebrate your x-hundredth birthday with your great-great-grandchildren in the rings of Saturn. So what he thinks he's contributing is...salvation.

[tl;dr](https://cdn.discordapp.com/attachments/342137185824407552/821600144951869450/unknown.png)

This one hurts, because the author gets so close.

We know rationalism is a good meme

Yes, and the null hypothesis is actually that that’s all it is… undergraduate-level philosophy with some alternative terminology that’s highly memetic for certain audiences.

The elephant in the room is that there is no objective evidence against this null hypothesis.

I think he actually accepts this fact in his conclusion, but in such an obtuse way that it doesn't come through: the good of rationalism is that there is a certain kind of person who feels their only outlet for their trauma, their mental illness, or their material situation is to join a cult, and rationalism gives them one. There's a quote from a famous researcher into addiction that goes like "For four years I was addicted to heroin, and I am so grateful for heroin, I thank God for heroin, because if there wasn't heroin in the world I would have just killed myself instead." The meme is enough. The meme sustains them.

Rationalists can’t even succeed at making sure their communities don’t become completely swamped with fascists and neo-nazis. It should not surprise us to see them failing at things that are harder than that.

they don't see this as a failure, but as a roaring success

Fun story: once upon a time, Alyssa Vance, the nominal President and Treasurer of MetaMed, walked into my house, then immediately turned to my friend and offered him 00 to clean it right then. She then later asked my friend if he was so smart, why wasn’t he as rich as she was?

It never crossed her mind that like, generational wealth and class gaps were a thing, because if you’re smart and Awake (not woke, that’s different), then money naturally finds you via Venture Capitalists.

I'm so sorry, what kind of an unmitigated nugget *does that*?
Wtf
This is a bizarre story and she sounds awful but I have to wonder: how dirty was your house?
Pretty sure you didn't have to wonder that, but I'll bite -- at the time we were working class nerds renting an old house, so the best word to describe it was "cluttered".
This is why we appreciate each other
Aw Pops I love you too
Mulla Gjorki is my favourite alias now (look it up), but pop is good too

[deleted]

The psychoanalyst Karen Horney defined a number of neurotic trends, i.e. compulsive coping mechanisms developed in childhood as protection against bad/inadequate parenting. These neurotic trends can eventually take over someone's entire personality structure and are incredibly difficult to get rid of. Here's how she defines type 4a, a subtype of type 4 ("the neurotic need for power"):

"4a. The neurotic need to control self and others through reason and foresight (a variety of 4 in people who are too inhibited to exert power directly and openly):

1) Belief in the omnipotence of intelligence and reason;
2) Denial of the power of emotional forces and contempt for them;
3) Extreme value placed on foresight and prediction;
4) Feelings of superiority over others related to the faculty of foresight;
5) Contempt for everything within self that lags behind the image of intellectual superiority;
6) Dread of recognizing objective limitations of the power of reason; and
7) Dread of "stupidity" and bad judgment."

Thought you might find this interesting!
> Rationalism to me has always come across as being linked to anxiety. The domination of rationality in all domains speaks to a negation of the self. (This is a hot take I’m still working on fleshing out)
>
> As someone who's dealt with a decent share of it I can attest to how anxiety can stunt your success, stop you from pursuing your goals, etc.
>
> Not surprising that people who fetishise their supposed intellectual superiority are in fact irl underachievers.

Hi, I think we have very similar hot takes. In fact I started scribbling together notes for a blog post over the last few years (I rarely get to sit down and write on it, the PhD takes time...). Hit me up in PMs if you want to try and co-write something; we can play critics for one another.
Not only do they fetishize their intellectual superiority, but look who they compare themselves to: tech entrepreneurs who get billion-dollar valuations. (Perhaps not totally strange, as a lot of the more science-fictional Rationalism is based on exponential rates of growth.)
[deleted]
Past a certain baseline, outside of a couple of very small fields, intellectual superiority isn't all that valuable. Also: constantly being told you're special warps your sense of perspective, relatability to peers, and work ethic.
[deleted]
Class, time management, charisma, domain knowledge, communication, empathy, looks, preparation, etiquette, focus, and especially luck.
Network, don't forget network, and access to angel investors, aka family with cash.
I’d mostly lump that under luck, although charm is helpful here.
There are class cues that make it easier for someone to relate to investors with similar backgrounds, and make it easier to network with investors or people with influence on investors. But yeah, being born into a situation that lets you pick up those class cues comes down to luck.
Good breakdown, and I will echo that luck and happenstance are hugely important. For every Peter Thiel and Elon Musk, there are thousands of similar people who don't get close to their level. I am curious though about what underachievement means in the original question. Are we talking about not becoming Mark Zuckerberg or not getting a decent career going and living a comfortable life?
Not who you’re asking, but I’d say things like drive, focus, technical risk-taking, ability to attract talent, and just plain having good ideas are vital to success. I agree with the idea that intelligence is necessary but not sufficient. Oh yeah and good timing and luck too.
> > Not surprising that people who fetishise their supposed intellectual superiority are in fact irl underachievers.
>
> What's your explanation on why this happens? I've met tons of people who are not only supposed to be intellectually superior, but they kind of **really are** intellectually superior, relative to their close circle and the people around them. Without a doubt, they also have a high IQ. But I've noticed the same thing as you said: they are underachievers. I am not yet able to explain why. I'd be glad to hear your opinion on this phenomenon.

Overthinking and not taking stupid risks removes you from the lottery of high risk-high reward, but gives you sure seats in the "struggling with depression" concert.
I'm just a clueless noob here but you all seem pretty cool so here goes. (You are cool, right?)

> [Praising people for their natural ability can be destructive.](http://socialpsychonline.com/2016/07/psychology-success/) Once people start to think that skills and talents are things they either have or don’t have, what happens when they experience failure? They’ll probably be devastated. They’ll think they’re not so great after all!
>
> Instead, though, what if you praised people for their hard work? What if you told your daughter, for example, “you worked really hard for that” or your friend, “you really put the effort in today”? Studies show that focusing on the effort and determination that people showed makes them better at overcoming future obstacles. (But it’s worth noting that empty praise about someone’s “hard work” isn’t what it’s all about, as Dweck herself recently emphasized.)

From what I know this isn't just some blog BS, it's pretty well supported by evidence. I'm not saying that's absolutely the most significant factor, and this may be WAY out of line, but in my view it is an indication of how Freud wasn't completely wrong that emotional development can be stunted even in highly mentally developed people, and that's often a result of the emotional environment in early childhood. Ask me about my narcissistic parents.
> but you all seem pretty cool so here goes. (You are cool, right?)

This is a subreddit of nerds playing at being jocks. We are deeply uncool.
B-b-but that sounds just like all my friends, all 2 of them.
One of us, one of us!
What's up with your narcissistic parents?
Oh the usual. Love and affection provided on a strictly conditional basis, fixation on standard narcissistic attachment rather than encouraging genuine and honest communication...
I’m cool, in a charismatic travelling drunk kind of way; I can’t speak for the others.
Start with material conditions. In western society, where each generation has been richer and longer-lived than the last until recently, high intelligence did correlate somewhat with being more successful than your parents, a bit more strongly than for lower-intelligence people, but at the same time preexisting family wealth was and is a much, much better predictor of "success" than intelligence.
Plus a low-stress environment as a child enables the growth of intellectual abilities much more. The constant stresses of poverty such as having to move a lot, not having enough to eat, not seeing your parents because they work second jobs, lacking adequate school supplies, having to care for the house or siblings when you're young etc. all reduce a child's ability to relax and let their imagination wander, develop their interests and be rewarded for using their brains.
I have a very high IQ - tested by actual professionals, not some online bullshit, on multiple occasions - and I’m kind of a fuck-up.

The base explanation is that IQ isn’t actually that good at predicting success at all. Some people (think Ted Kaczynski AKA the Unabomber) just have issues or different preferences in life.

IQ metrics are pretty good at measuring...well, IQ. Extrapolating that to a sociological account of “success” - which comes with just a ridiculous number of invalid assumptions about the meaning of the word - is just straight up dumb and, more importantly, a boring way of doing sociology.
[deleted]
What you're describing is *epistemic humility*, something that can take grown-ass adults a lifetime to acquire. The ability to work around uncertainty is a sign of intellectual maturity. The world could use more of that and less computer-groping know-it-alls, imho.
hey i just wanna tell you that you are just as valuable a person as your friends. there are utterly mundane high iq people who are boring to talk to, wrong about everything, etc., and there are low iq people who are bright, interesting, funny, right about a lot of things, etc., because *iq is just a test*. you have a unique perspective and approach that your friends apparently find valuable, and you should find it valuable too. and coming to conclusions quickly is not a sign of intelligence imo. it could just as easily be that they're jumping to what they want to believe and figuring out how to back it up later, while you're more cautious and uncertain.
I wouldn’t fuss about it at all. If anything, high IQ is correlated with mental illness rather than achievement. Certainly that’s where I’m at: I’ve got a graduate degree in a highly complex field which I’ve done nothing with. If anything I just use my intelligence to shit-talk my good friends in a good-natured way.

My best friend isn’t as superficially “smart” as I am, but she’s still my best friend, and still very smart - just not necessarily on an IQ scale, being fairly average as far as I remember. The more important thing (when it comes to this kind of stuff) on my end of the friendship is the way I riff on that theme; in fact I just had a call a couple hours ago with her which was just me interrupting to tease her with examples of jazz musicians (she’s a jazz musician) who *weren’t* fucked up, after she made a casual remark about all jazz musicians being fucked up. That’s just me being a vaguely charismatic dick to my best friend, obviously, but the deeper point is that even without my obsessive attitude and learning she’s easily good enough to spar with someone with my IQ as a friend - it basically makes no difference.

When I first got tested for this arena, and repeatedly afterwards, I got high scores on everything but issues associated with dyspraxia. Who gives a shit: I had to learn to be clever as much as anybody else.
I didn't say they are underachievers. I said they compared themselves only to the top. (Personally I call that the Alpha Fallacy, where you only look at the top dog and ignore the rest.) (E: I don't know if they are underachievers at all, depends on who you compare them to. (Also, forgot to mention, but the guy who created Ethereum has a lesswrong account, so there certainly is some level of success (even if that is burning the world, and promoting cryptocurrencies instantly undoes all the good EA is trying to do))). But yeah, I agree with what others said here: just being smart isn't that valuable, as there are a lot of other factors in play. Also, there simply isn't that much room at the top. And an intellectually superior POC will always have it worse than an averagely smart white dude who has a good network and rich upper (middle) class family support.
Raw mental horsepower, the thing that IQ tries to measure, is poorly correlated with quality of life and economic outcomes. Unfortunately it is negatively correlated with anxiety and depression. All in, you’d rather have a higher emotional intelligence than be smart. It’ll both provide better subjective quality of life (e.g. better relationships and a sense of emotional well being) and it’s correlated better with career and financial achievement (although luck really dominates there).
>Unfortunately it is negatively correlated with anxiety and depression. wait i thought it was positively correlated? like intelligence and anxiety/depression go up together?
Derp derp, I typoed that. It’s indeed *positively* correlated with anxiety and depression.
> The domination of rationality in all domains speaks to a negation of the self. I can appreciate people embracing the rational in order to transcend animal appetites but they don't seem interested in that (here's why group sex and dropping acid is rational by my evolutionary imperatives, actually).

What is the definition of success in rationalism? Being listened to?

Yudkowsky has said two things:

- all the low-hanging fruit in physics hasn't been discovered or implemented (HPMOR, Harry talking about how much he can do with magic because he's the first person to have thought of it)
- a comment marveling at how some nerds fantasize about going on an adventure or learning magic in fantasy books but are unwilling to get in on the plot IRL, i.e. do math and science (I think this was in the sequences, but I read them ten years ago and it might have been somewhere else)

In other words, successful rationalism ought to look like scientific advancement. They measure skulls. I'd say that I notice I'm confused, but I'm not actually.
I'd imagine it's to become one of those psychopathic capitalists they seem to look up to. Or maybe to be top toady for one.
In that case I'd say Thiel has been a roaring success. If we're including him in the rationalist sphere.
Perhaps, although I don't know why Thiel was successful. Was it because he applied the Methods of Rationality? Because he found his success before any of that stuff was written. I've never read Thiel's book but from reviews it seems like it's less about rational thinking and more about how to build a monopoly. And at least one reviewer says it's [mostly conventional wisdom repackaged as contrarian insights.](https://www.vox.com/2014/11/30/7300019/how-peter-thiel-repackaged-conventional-wisdom-as-bold-contrarianism)
I think that having billions of dollars probably plays a non-trivial role in his success.
I'm not going to spend 30 sec looking for it, but IIRC one success story that CFAR put forward as evidence of their programme's effectiveness was how good graduates had got at ... doing rationality subculture stuff
If you're familiar with systems theory in ecology or biology (every food web, every biological organism, every [x], has a certain shape to it, and individual species and organs shuffle around inside the shape), the goal is basically that there's a systems explanation for making good decisions. That is to say, there's a list of things that every good decision has in common **and rationalists stipulate that this is a sufficient list**. Rationalism is the quest for the formula of the philosopher's stone, which will make every decision that follows it a good one, no matter how stupid the idea you're trying to pull off is.

I dunno they’re taking a lot of money from Thiel and Patreon/Substack, so they’re at least successful on the monetary front.

Maybe the least hateable thing about Bay Area rationalists is their tendency to live in group houses with their friends and donate the majority of their very high salaries to charity. It's a lot less charming when the friends are abusers and Nazis and the charities are doomsday cults and eugenicists, but I hope at least some of the people involved eventually ~~join Marxbro's antifa army~~ take the lessons of having their charity manipulated and build something better from it.

Very high Joker energy.

What does Julia Galef do for a living?

She’s basically Ted Talks but as a career

as cruel as i like to be to these people, what the hell? siskind was a psychiatrist and now writes anti-SJW shit for big substack cash. yudkowsky didn’t go to college and has a highly paid job smoking weed, sucking peter thiel’s toes and saying “but what if we could make ourselves smarter maaaaaan”. aaronson’s a professor. like…maybe moldbug is who they’re talking about? but he has a huge following in the tech sector.
overall they seem reasonably successful. what even is this person rambling about.

I suspect that the measure of 'success' being used there goes beyond just money. I imagine the author may be referring to some of the stereotypical trappings of the 'good life'. This could be: a happy, committed romantic relationship; children; a home with pleasant neighbors; hobbies and the time to enjoy them; or health. Some of these dudes look and write like they came straight out of a weird laboratory with no girls or sunshine allowed.
they seem pretty satisfied with their situation, at least from the outside. i mean, we all suffer deep anomie as a result of the pain of existence but surely that's not what they're asking about. idk, i sneer in part because they all seem so smug, if they were actually forlorn then i might not be so enthused about it

Might have something to do with how rambling and obtuse your post is

i hope the 'your' is directed at the author of that post and not at me because i didn't write that shit lol
Oh yeah totally, I’m not stupid enough to assume it’s yours

It seems success is too irrational for them.

I actually like Applied Divinity Studies - they had a fantastic takedown of Lambda/Austen Allred a while back

Why don’t these folks openly advocate for sortition and actual technocracy (the plebes are irrational, policy should be run through formulas to Maximize Utility) and the replacement of the USGOV by a Technate?

MENSA syndrome, too. But I like a lot of “rationalists” and MENSAns.

Do you work in Venture Capital? I need a reviewer for a long upcoming post. Email me for details.

Please mister money ghoul sir UwU

Winning? Notice that rationalists cannot even choose a good game in the first place.

coming out of my cave at 34 and becoming a millionaire within a year... i'm moving to Berlin right now

Charlie Munger

charles amongus/amogus
when the imposter is sus!