r/SneerClub archives
The Future Perfect 50: Julia Galef, co-founder, Center for Applied Rationality (https://www.reddit.com/r/SneerClub/comments/y8w601/the_future_perfect_50_julia_galef_cofounder/)

https://www.vox.com/future-perfect/23391917/future-perfect-50-julia-galef-cfar-rationalist

in which a reporter who lEaRneD TO ApplY BaYES’ THeorEM at least as far back as 2012 manages to get a mainstream publication to promote one of our usual suspects as if she’s just an interesting Serious Thinker

[deleted]

I think so, too, although Vox's *Future Perfect* articles are explicitly vehicles for this stuff from what I can tell. They were big fans of that Democratic primary candidate entirely funded by Sam Bankman-Fried, whose platform was based on making Effective Altruism palatable to voters.
it is not just you.
It is 'interesting' to see how the Twitter user xriskology, who has been successfully exposing a lot of the bad stuff from EA and longtermism, is now being attacked and harassed. So in a way stuff seems to be ramping up.
A dear friend recommended Ezra Klein’s column and podcast to me a couple of years ago: to my surprise, he has brought on a huge number of Rationalist-adjacent folks. Julia Galef hosted the podcast while he was on parental leave and asked Philip Tetlock fawning questions about superforecasting. It’s disconcerting.

I know it has overlap with the effective altruism people, but I actually like Future Perfect. The second season does a great breakdown of the problems with billionaire philanthropy.

I mean you don’t need to go as far as links to EA, this article is about Julia Galef herself and *explicitly* draws attention to LessWrong
True. I love snarking on less wrong rationalists but I actually have mixed feelings in some regards. Less wrong did help me rethink some key approaches to thinking. But that said, the actual results of the community at large have mostly been silicon valley pseuds demonstrating their complete lack of education in the humanities.
I think most of us ended up here because we stumbled on some rationalist-adjacent community and liked it at first. Maybe even got a lot out of it, before eventually realizing it was a big-brain circle jerk
[deleted]
Huh?
You also seem to vaguely be under the impression SneerClub is about EAs with a side order of LessWrong? I could be misreading there
Nah. I thought sneerclub was about rationalists, and EA and less wrong both fall under that umbrella.
I don’t think that the actual results of “the community at large” are as mild as that, but you’re stating it as if it’s obvious
I don't think I said it was mild? Maybe I made an understatement but I'm a little confused by the strong reaction I'm getting here from you. Feels like I've accidentally communicated something I did not intend.
[deleted]
I think you might have assumed more than necessary. I made no statements about your beliefs.
some of its stuff is fine. that's why the rationalism laundering is so disturbing

Can’t be a Thought Leader™ without being a Vox reader

[deleted]

[deleted]
[deleted]
Didn’t CFAR have an explicit bent toward funneling people into AI safety research? And the “rationality camps” seemed a lot weirder than the basic Kahneman-and-her-book stuff. First-hand tales from there are extremely odd. She clearly knows how to make things sound reasonable to normie audiences, but she was deep in that culture for a long time.
I don't know whether that was the initial goal, but it has quite explicitly been the goal for the last 5 years.
"wow this gateway drug sure goes down easy"
that's like calling intermittent fasting a gateway drug to anorexia.
Which is a thing that can happen, yes. People have actually warned people about that.
yeah this is accidentally a halfway decent analogy but not in the way they thought lol
Yeah, I was thinking about it and it actually is a good analogy: intermittent fasting can trigger eating disorders if you are already susceptible to them, in the same way that being into LW can open you up to various issues if you are susceptible to them. See how some people go 'lol the basilisk, that is silly' after reading Roko's basilisk story, while it fucks other people up mentally (which also means that in a way you can't 100% fault Yud for taking it as a serious threat; he took it seriously because he believes his own TDT bullshit, which he later denied of course). E: I'm focusing here only on the basilisk story, but it is one of the many memetic threats the cult incubator LessWrong poses to people (like putting families at risk because one partner is convinced they should donate more money to MIRI, and the other disagrees). A reason why I always liked the SCP Foundation's stance of not roleplaying that shit on the forums and treating it as just creative fiction writing.
I'm on y'all's side and am genuinely worried about how many people on this subreddit think biting the bullet on this analogy is the correct response, instead of... you know... pointing out that it's a disanalogy for a gazillion potential reasons. The fact that people with pre-existing mental illness or eating disorders might react to intermittent fasting by becoming anorexic is not an indictment of the health benefits of intermittent fasting... it's an argument that intermittent fasting is perhaps not for everybody. Or it's an argument for why we should be vigilant and concerned when someone appears to be anorexic but insists they're just intermittently fasting (e.g. when the basilisk rears its ugly head). But that's not what is happening in this subreddit. Both the IF'ers and the anorexics are getting sneered at with equal vitriol, and the lack of distinction makes me feel a little uneasy (because I don't think good people with good intentions should be ridiculed).

But if you don't like the IF analogy, here are some other bullets you can bite:

* Vaping is a gateway drug to meth
* Eating meat is a gateway drug to psychopathy
* Saturday morning cartoons are a gateway drug to hentai

---

In any case, I do think it would be an interesting and useful project to see whether the ideas taught by the rationality community are uniquely 'triggering' to people with a pre-existing vulnerability to basilisk-like concerns. I've seen a ton of posts on the LW and SSC subreddits from people having genuine panic attacks about the prospect of X-risk, or having decision paralysis because of a felt need to apply Bayes-style reasoning to every decision (e.g. dating). It's unclear whether these people would just be having panic attacks about something else if it wasn't X-risk, but it would be useful to quantify the gateway-druggedness of rationality ideas in an empirical way. Let me know if you've got any ideas.
The difference between IF and LW is that with IF they warn you against getting an ED, or at least don't actively support it. IF also has slightly fewer crypto-fascists.
genuinely curious to hear what you think "biting the bullet" means
I meant accepting the analogy, despite the obvious differences between the two things
OK, I get the sense that you might not be a native English speaker, but FYI "biting the bullet" means "accepting the painful **truth** of a situation" and thus is probably not the idiom you want when you mean "accepting something that I personally believe to be false".

Anyway, there are several premises here I don't accept:

* the purported benefits of gateway rationalism-lite are analogous to the purported benefits of IF
* things that make you personally uneasy are therefore ipso facto bad
* she is a good person with good intentions

Your entire last paragraph sounds like a navel-gazing waste of time to me, and I doubt you would want to pay my going rate for "consulting on shit I have absolutely no interest in"
If you forget about the analogy, you can sit back and see with remarkable clarity that the rationalist “community”, such as it is, consists of a highly networked group of people who take LessWrong seriously.

That itself is an indictment of the whole space
Confirm
More often orthorexia, I think, but yes, IF is in fact very definitely a gateway to eating disorders.
which it is
Another person recreating object/meta-level-style distinctions is pretty sneerworthy imho. Ribbonfarm already innovated it into 2D grids ages ago.
> the least sneerworthy person in

She wrote *The Scout Mindset*, a binary classification to reify Elon Musk, wow. She taught people how to do Bayes math 'in the head', which is insane. She podcasts in favor of Uber. https://www.reddit.com/r/SneerClub/comments/nmwv6v/comment/gzs7lo6/?utm_source=reddit&utm_medium=web2x&context=3
Wait, all those things are mentioned in the article lmao
She's actually a GOOD cult leader, guys!
Wrong.
She actively promotes and creates the material for the dumb bullshit, "entirely unobjectionable" is nonsense
[deleted]
no, but do feel free not to expect everyone else here to do the reading for you
Agreed. This sub doesn’t reliably differentiate between the absurd, stupid and racist EA, rationality people and the much more respectable thinkers like Galef, who is, at worst, maybe not everyone’s cup of tea.
[deleted]
I hope to one day be so publicly annoying as to have an entire spinoff subreddit.
/r/Sneersneerclub exists
Isn’t she like a TERF or something?
Is she? I see that Richard Dawkins [brought](https://twitter.com/Artistphilb/status/1386372387499876358) her up after being [transphobic](https://twitter.com/RichardDawkins/status/1380812852055973888), and she has one (1) bad [tweet](https://twitter.com/juliagalef/status/904449450797547520), but I don't think that's clear enough to be a smoking gun.