r/SneerClub archives

Someone in the community told me that for me to think AGI probably won’t be developed soon, I must think I’m better at meta-rationality than Eliezer Yudkowsky, a massive claim of my own specialness.

imagine Jim staring at the camera except it’s my big red eye

> I was discouraged from writing a blog post estimating when AI would be developed, on the basis that a real conversation about this topic among rationalists would cause AI to come sooner, which would be more dangerous. more staring at the big red eye
>(I think Eliezer is a world-historically-significant philosopher, though not as significant as Kant or Turing or Einstein.) sigh. *takes out the clown makeup*
that feeling when you hope senpai will approve of your tact and reservation
> Kant or Turing or Einstein One is not like the others.
Yeah, one of 'em's English
In context this makes sense.
Yeah, it's not really sneerworthy to call someone a "philosopher" because they did important work on topics with a philosophical resonance even if they're much better known as a physicist, computer scientist, biologist, journalist, etc.
Yeah, the surrounding text made it seem to me like there was a difference between "philosopher" like Turing/Einstein, and philosopher like Kant. And this was just a quick shorthand to put historically famous scientists/thinkers into a convenient mental bucket and carry on with the conversation; classifying the exact contributions of each of them wasn't really the thesis anyway ;). The OG quote: > (I think Eliezer is a world-historically-significant philosopher, though not as significant as Kant or Turing or Einstein.) is a bit more sneerworthy however. You can say the same about Stefan Molyneux, for example.
Tbf, one could argue that the vulgar functionalism introduced by Turing, his daydreams like "a machine that learns like a child" without adding anything substantial, and his style of argumentation being "here's a list of strawmanned objections that I utterly destroy with FACTS AND LOGIC" were a net negative for the field
Regardless of how bad "Computing machinery and intelligence" is, Turing's "On computable numbers, with an application to the *Entscheidungsproblem*" is sufficient to secure his place in the philosophical canon. And that's not getting into how his work at Bletchley Park indirectly contributed to philosophy...
> And that's not getting into how his work at Bletchley Park indirectly contributed to philosophy... :)
>Turing's "On computable numbers, with an application to the > >Entscheidungsproblem > >" is sufficient to secure his place in the philosophical canon By this definition, anybody whose work in an unrelated field provokes development in philosophy is a philosopher. Which sounds quite a stretch to me. Nobody would define Darwin a philosopher, altought he indirectly forced philosopher to radically rethink things.
You should read the paper if you have the misimpression it's solely concerned with an unrelated field and lacks philosophical content.
Oh fucking hell, not everybody that disagrees with you does so out of ignorance. I know Turing's work fairly well, and that paper solves a very important question in the foundations of mathematics. And it for sure *can* be, and *was*, used to argue in favor of certain philosophical positions, as Gödel does explicitly when using the incompleteness theorem as (some of) the building blocks of his argument in favor of mathematical platonism. However, a result in theoretical computer science *per se* is not included in the common definition of "philosophy", unless you want to extend the meaning of the word to the point of uselessness.
I mean, mathematics as a whole (including computer science) is a subdiscipline of philosophy.
>Kant You know the other day I joked the rationalists could use a modern critique of pure reason so it was funny seeing it actually cited in the post.
> > Someone in the community told me that for me to think AGI probably won't be developed soon, I must think I'm better at meta-rationality than Eliezer Yudkowsky, a massive claim of my own specialness. This has been the foundational belief of every LessWronger I've ever had a sustained conversation about it with. I've been told in so many words that given the choice between *EY has made a mistake* and *we're incapable of meaningful thought on this topic*, we should always choose the latter. Of course, they wouldn't start there. This would be when the conversation had gotten to a point that we had agreed that, by our own lights, we're sufficiently confident that EY had made a mistake. And the next step would be, "But that doesn't show that EY has made a mistake, because..."
Come on, the common factor between you and the lesswrongers here is you. So clearly you are at fault. Stop abusing conversations to debug-insert the 'actually by my own logic Yud is right' demons into lesswrongers, you suppressive person. ;)
Isn't that the pattern of thinking EY wrote a book to try to debunk?
Doesn’t mean he believes in or acts like it
these people are willing themselves into existential panic over a dude that gets paid to write bigger IQ numbers next to Peter Thiel's Sonic OC and literally nothing else

I heard that the paranoid person in question was concerned about a demon inside him, implanted by another person, trying to escape. (I knew the other person in question, and their own account was consistent with attempting to implant mental subprocesses in others, although I don’t believe they intended anything like this particular effect).

wait, what

Just normal rationalist things!
It is quite scary that apparently it's possible to literally curse someone, provided the target also believes in it, and also is in a community where taking LSD to cure mental problems is considered reasonable.
I kinda... don't disagree with either of those statements in a vacuum, though? Curses-as-a-nocebo have been a *thing* since the concept of cursing someone (in a way that's not just, y'know, poisoning them) first came about, really - if you're "doing magic" on someone, they have to believe in it, right? And LSD (and ketamine!) are being investigated and are showing promising results for therapeutic use in treating PTSD and depression, so like...
Right, it's just still surprising for me to see an actual working example in the wild. As for using psychedelics/dissociatives for therapy, I would think that the key ingredient is therapy and carefully controlled doses, not downing a bucketload of the stuff until you get a psychotic break. All in all, the OP and the [previous](https://www.reddit.com/r/SneerClub/comments/q7m8rg/firsthand_account_of_the_leverage_cult) post make me think it all looks like MKULTRA but wackier.
Yeah, agreed on all counts.
Based on context I'm like 90% sure "demon" means "unwanted thought". It seems to include any psychological influence you could have on someone. >Being able to discuss somewhat wacky experiential hypotheses, like the possibility of people spreading mental subprocesses to each other, in a group setting, and have the concern actually taken seriously as something that could seem true from some perspective (and which is hard to definitively rule out), seems much more conducive to people's mental well-being than refusing to have that discussion, so they struggle with (what they think is) mental subprocess implantation on their own. This seems roughly analogous to normal spreading of ideas, but if you want to be taken seriously WHY would you call them "demons"? Rationalists really, really need to learn to use existing words instead of making new ones up for every single thing.
I'm not well versed in Rationalist slang, but based on this: > During this time, I was intensely scrupulous; I believed that I was intrinsically evil, had destroyed significant parts of the world with my demonic powers, and was in a hell of my own creation. I think the demons were meant literally. Anyway, that still leaves the aforementioned nocebo - if someone entertains the possibility that others might do remote code execution (am I doing this right?) in their heads, it could be a self-fulfilling prophecy.
I was assuming that rationalists are too invested in the appearance of rationality to use the religious meaning of the term, but I could be wrong. The nocebo thing is interesting. I think it might go deeper than that - they form a subculture with certain conventions, common beliefs, terminology etc.. If that includes a belief in outside influence, then not only would someone make themselves susceptible, but other members of the subculture would know exactly how they were susceptible and how to take advantage. Further, influential people within the subculture could push these ideas, finding it convenient for others to be easily influenced. It reminds me of religious settings where people sometimes feel "called by the spirit", but the spirit's call usually aligns very closely with the expectations of their social circle.
> I was assuming that rationalists are too invested in the appearance of rationality to use the religious meaning of the term, but I could be wrong. I assumed that the other person was also self-medicating with psychedelics, but now I see that the author didn't write anything about it, so maybe you're right.
Perhaps it started as a play on a computer daemon?
systemctl start reasond.service
I blame systemd
[removed]
The fact that Scott talks about blogging about politics should give any potential patient who searches his name the willies. Plus, the weird "We don't take insurance" stuff ([https://lorienpsych.com/new-patient-suitability-criteria/](https://lorienpsych.com/new-patient-suitability-criteria/)) and refusing to take on patients who aren't referred by previous patients is really fucking weird
>10. Will an SSRI change me into a different person? >Sometimes people ask this question from a philosophical perspective, so I’ll start with a philosophy-style answer: does caffeine change you into a different person? Does alcohol? Does stress? Does depression? Does pregnancy? Does the birth control pill? Does falling in love? All of these change your brain chemistry, some of them in profound ways. Wtf is this rambling
I don't know, I think a certain type of person would find that to be a helpful reframing of the question.
The responsible answer: no.

All of this is so sad. The author is still deeply brainwashed by the cult to such a degree that even in her whistle-blowing post she’s still apologetic, she still second guesses her own words, she still qualifies her statements, she still speaks highly of Yudkowsky and the Sequences. Rationalists are incapable of Actually Changing Their Minds, holy shit.

And the worst thing is her equivocation “so what I’ve experienced was horrid, but I’ve heard that the same shit happens in corporations and academia, so it all evens itself out”. Motherfucker, I’m as anti-capitalist as it gets, fuck them corporations, but I work in one and I used to be in academia, and both, as much as they sucked, weren’t as harrowing as the shit that routinely happens in Rationalist circles.

The comment section is the fucking zoo of all the worst people in the community.

So much gaslighting, blame-shifting, DARVOing, tone policing, Jesus, the commenters fucking broke open their manipulation tactic toolboxes, didn’t they. Just garbage people, all of ’em.

> All of this is so sad. The author is still deeply brainwashed by the cult to such a degree that even in her whistle-blowing post she's still apologetic, she still second guesses her own words, she still qualifies her statements, she still speaks highly of Yudkowsky and the Sequences. Rationalists are incapable of Actually Changing Their Minds, holy shit. This has been a constant trope of the growing I-survived-Rationalism confessional literature, hasn't it? I do wonder how much it is just a stage of grieving, and eventually these people make a somewhat broader mental inventory and manage to get away from this bullshit, and it's just that these confessions are coming out at an earlier stage in this process. With one of the last confessions doing the rounds here, the survivor asked in comments if they have to give up trying to be rational and just accept irrationality and go study postmodernism or something, or if there's something redeemable in Rationalism. And it's like, ffs... the brainwashing gets so complete that some of these people are being tricked into thinking that the techniques used to brainwash them literally just are *methods of rationality*, and the only alternative is to descend into irrationality, when... you could get further, without the brainwashing, with a $20 book on logic you could have picked up any time from your local academic bookstore.
Not wanting to completely break with the belief system is really common in recent deconversions/exits (source: personal experience), but a large fraction of people move entirely away from their old beliefs after a few years. This is true of religious and political groups, and there's no reason to think rationalists are any different. IMO the more concerning issue is people who leave and swing in the opposite direction, basing their new beliefs entirely on having opposite shibboleths from the old group (the New Atheists for example).
> Not wanting to completely break with the belief system is really common in recent deconversions/exits (source: personal experience), but a large fraction of people move entirely away from their old beliefs after a few years. Yes, this is my experience as well. My reference to a grieving process is not flippant nor allegorical, I really think there's an aspect of grieving the parts of you that you have to give up to make this sort of transition -- this is also a part of overcoming trauma, addiction, etc. > IMO the more concerning issue is people who leave and swing in the opposite direction, basing their new beliefs entirely on having opposite shibboleths from the old group (the New Atheists for example). And I think this is inevitably the result of not completing the aforementioned grieving process. The cognitive dissonance of the "this community, which definitely holds the keys to world-historical rationality that normies just can't fathom, keeps accidentally emotionally abusing people" stage only makes sense to the extent that someone going through it hasn't worked through how much their attraction to the community has to do not with rationality but rather with how the community appealed to their emotional needs to feel special, taken care of, superior as a cure to feelings of inferiority, etc. And if those attachments aren't recognized and worked through, which involves processes of grieving and maturation and acceptance of the ambiguities and disappointments of adult life and so on, then it's natural if separation from one regressive emotional attachment is just going to lead to attachment to some other regressive emotional attachment. (Of course, we should reserve our blame above all for those who abuse people going through these normal developmental tasks.)
Interesting. I left a religious group that was not exactly a cult, but not exactly not a cult either. I never joined one though (I was born into it), so I've always been a little baffled by the motivations of those who do.
>if they have to give up trying to be rational and just accept irrationality and go study postmodernism or something deus est mortuus, ideo totum regnum peccatoribus est repletum. ^(~~God~~ Rationalism^tm is dead, thus the land is filled with ~~sinners~~ postmodernists.)
You don't get that invested in a community if you don't think they have *something* of value. I was never this culty, but I *was* a pretty devoted member of the rat community for a while who thought it was a good group of people who had some important ideas to promote. And while I've since left the community, one of the important questions I have is "can you have the good of that community, which wasn't totally fake, without the bad". I approach this now with a "wait, am I doing the Dumb Rationalist Thing right now" warning light in my head, but I do still basically think that: * It's possible to learn to reason about the world more effectively, and this is an important thing to do (even though the "pure reason" approach promoted by rats fails for a number of reasons). * "Tell culture" is a thing I like. * Openness *up to a point* is a virtue that, by and large, society as a whole is often a bit short on. (The *to a point* being my big break with the community - no, you shouldn't listen to the obvious Nazi, even if what they say makes sense on the surface.) * Numerical and statistical reasoning is an important tool in addition to more conceptual judgements. (But the latter is important too.) To be clear, I have a pretty good grounding in abstract logic. I'm a mathematician by education and have myself *taught* formal logic. So I don't think the charge of "you could just buy a book" is really fair. Not every idea, or even most ideas, rats promote is a bad one - they're just doing the Jordan Peterson thing of "80% basically good ideas that are hard to dispute, 20% totally insane bullshit".
But you're listing things that are generally believed by most adults and have always been at the core of considered notions of education and general intellectual formation, so to think of them as, in any meaningful sense, "the good of that community" is rather strange. I mean, my abusive ex- didn't stab me in the kidney, but I don't reflect on that as one of the goods of the relationship that might temper my judgments of some of its abusiveness, rather I expect it as part of the baseline of having any relationship with any adult. On top of that, rationalist communities are *really bad at these things*. It's possible to learn to reason about the world more effectively, and this is an important thing to do -- but the rationalist communities not only don't help you do this, they impair your ability to do this. If you think -- as you should -- that it's possible to learn to reason about the world more effectively and that this is an important thing to do, this is all the reason you'd need to run as far and as fast away from rationalist communities. And if you think -- as you should -- that we should be critical of institutions that get in the way of people learning how to reason about the world more effectively, this is all the reason you'd need to be critical of rationalist communities.
> But you're listing things that are generally believed by most adults I definitely disagree with that, at least if the "things" you mean are things like "sit down and at least do some napkin math to see if your beliefs are even plausible". > It's possible to learn to reason about the world more effectively, and this is an important thing to do -- but the rationalist communities not only don't help you do this, they impair your ability to do this. I dunno about that. I certainly retain some ideas that I learned within the rat community's context (they may not have *invented* them, but that was where I got them, and collecting ideas is valuable too). I don't use the jargon day to day, because the people I'm talking to usually don't, but there are some useful concepts that I can use for my own mental shorthand. > If you think -- as you should -- that it's possible to learn to reason about the world more effectively and that this is an important thing to do, this is all the reason you'd need to run as far and as fast away from rationalist communities. I half agree with you, and I did do that for a while, just to get some distance. But this is also a bit Hitler-ate-sugar-y, isn't it? While I am certainly taking care - not least because I got the wool pulled over my eyes and don't want that to happen again - I don't think *every* idea they have is necessarily wrong. And, by and large, the people I met seemed to be very kind people - perhaps to a fault, if their willingness to open their doors to fascists is rooted in that. And I have pretty strong disagreements with some points of competing ideologies, too, which leaves me a bit homeless culturally and ideologically. (As a concrete example, I end up playing an advisory role to the diversity group at my workplace in part because I am - frankly - a little more grounded and willing to pick my battles. "Effective Justice", one might call it. We agree on some of the problems, but not on the details and on the methodology.) My current belief is that the rat community's problems are maybe 25% their ideas and 75% the specific people involved: in retrospect, maybe it's not surprising that a community founded by Yud ended up being full of insane grandiosity. > And if you think -- as you should -- that we should be critical of institutions that get in the way of people learning how to reason about the world more effectively, this is all the reason you'd need to be critical of rationalist communities. I am. I am very critical of the *community*. It is full of racists, led by nuts, and is achieving a dangerous level of influence given those things. But I don't think being critical of the *community* requires the complete rejection of every idea it promotes. The failings of its ideas are often in taking good heuristics too far, or in failing to recognize that even a good model can fail to properly model the complicated world in which we live (and that it is the model, not the world, that is wrong in that case). But the heuristics and the models are often good ones.
> maybe it's not surprising that a community founded by Yud ended up being full of insane grandiosity. I think you aren’t recognizing that the grandiosity is baked right in to the central premise, that we can dramatically improve our capabilities of “being” rational thinking by doing some math on a napkin. Certainly we see people doing “irrational” things all the time, but sitting down with a napkin is not the kind of cure to this you think it is. And definitely not in the “I assign made up probabilities to the ideas I have in my head about possible explanations for my beliefs” way that is core to the existing rationalist movement. One clear and obvious failure point of this kind of reasoning is how, at its very core, it fails to deal with Pascal and his band of vicious ruffians. Yet, the overwhelming preoccupations of the rationalist movement are ones involving very large payoffs that have highly uncertain probabilities. None of this is to say that we shouldn’t pause to consider the correctness of our beliefs before turning them into actions. But “look before you leap” is hardly novel advice.
> I think you aren’t recognizing that the grandiosity is baked right in to the central premise, that we can dramatically improve our capabilities of “being” rational thinking by doing some math on a napkin. I think the average person probably *would* dramatically improve their capabilities by doing that, really. No, it isn't the cure for all the world's ills, and no, you can't derive all grand cosmic truth from a toy model on a napkin, but toy models are a useful tool for checking our intuition. > And definitely not in the “I assign made up probabilities to the ideas I have in my head about possible explanations for my beliefs” way that is core to the existing rationalist movement. The problem with *that* is that everyday situations lack enough information to do strong prior updates, which means the prior comes through in the final result. (And of course, Bayes' rule also assumes you're not under malicious attack, which open-minded spaces absolutely are and always will be.) Bayesian reasoning isn't the problem; the failure to recognize that Bayes only gets you to the facts in an idealized limit under specific assumptions is. Again, *properly handled*, Bayes' rule is a useful check to everyday intuition - as everyone's favorite positive-cancer-test example demonstrates (a worked version is sketched after this comment). > One clear and obvious failure point of this kind of reasoning is how, at its very core, it fails to deal with Pascal and his band of vicious ruffians. Yet, the overwhelming preoccupations of the rationalist movement are ones involving very large payoffs that have highly uncertain probabilities. Yeah, I agree. Even when I was more tightly coupled to the community, I was interested in much more local and mundane solutions. When I read *Meditations on Moloch* (which I still think is a pretty valuable framing of a lot of problems), my main dissatisfaction was with the ending. "And therefore we need friendly AI" is quite a moonshot. How about "and therefore we need, ultimately, to choose not to be purely Molochian agents"? > None of this is to say that we shouldn’t pause to consider the correctness of our beliefs before turning them into actions. Well, bluntly, my experience with a lot of the groups I fit into politically is that they don't do this. I'm trans, so that's the group where I have the most experience and feel most qualified to offer ingroup criticism. And I feel like I watched that "20% of trans people are murdered" stat (which is really really *really* impossible even with the most generous assumptions) circulate in my local circles for many years before it finally died out. Similarly, I've seen people say point-blank that they don't care what e.g. regret rates are. *As it turns out* they're very low and so transitioning is definitely right for people who want it except in the most extraordinarily non-representative cases, but I consider that a *very* important stat for giving people advice about what to do! The same goes for a lot of my fellow economic leftists. Is the system we have good? No, absolutely not. It's brutally unfair, and even if it *were* fair, we shouldn't make people suffer simply because they got dealt a bad hand. But we can't just go "well if we overthrow it everything will sort itself out" because even a cursory glance at history tells us that that just gets you a dictatorship.
In terms of my own local experience, I ended up untangling a woke-off in my workplace where several people - all of whom were well-intentioned and genuinely did want to be accepting, as far as I could tell - were so worried about having a discussion about how to actually *do* that that they ended up in a situation where all of five people agreed on a measure and no one felt like they could say it. It was only by taking on a certain kind of neutrality and by being someone willing to entertain different views that I was able to get people to admit this to me, which let me tell everyone involved that everyone else involved already agreed with them That is dumb and bad and is not the sort of thing we should be aiming for! It makes everyone feel like shit and it does not accomplish the actual goal we're trying to accomplish, which is to create a more inclusive and loving world. I want to pursue justice, both social and economic. I want people to be able to conduct their lives happily, safely, and with the freedom to decide what they like and the knowledge to judge what will make them happy later. I don't think being Rational-with-a-capital-R is the end goal. But I *do* want us to pursue these things *effectively*, and I think existing solutions are doing a pretty poor job of it.
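A minimal sketch of the base-rate arithmetic behind the positive-cancer-test example mentioned above; the function name and all the numbers (prevalence, sensitivity, false-positive rate) are hypothetical, chosen only for illustration:

```python
# Bayes' rule for P(disease | positive test), with made-up illustrative numbers.

def posterior(prior, sensitivity, false_positive_rate):
    """Probability of disease given a positive test result."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Hypothetical screening test: 1% prevalence, 90% sensitivity, 9% false positives.
print(round(posterior(0.01, 0.90, 0.09), 3))  # 0.092 -- a positive result still means <10% chance
```

Even with a fairly accurate test, the low prior dominates the result, which is the intuition-check the comment is pointing at.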
> I definitely disagree with that... If you sincerely think that most adults don't think it's possible and important to be more reasonable, I'm not sure what to make of this other than to recognize in it exactly some combination of (i) the kind of fantasy of grandiosity that attaches people to these sorts of cult-like movements in the first place, and (ii) the kind of fantasy of grandiosity that these sorts of cult-like movements enculturate in their adherents so as to render them servile. I mean, my original comment was about how people remain attached to these kinds of notions even when they start being capable of recognizing some of the failures of these communities, and you seem to have piped up noting that while you've started being capable of recognizing some of the failures of these communities you're still attached to these kinds of notions. ¯\\\_(ツ)_/¯ > I dunno about that. This is the community who celebrates as a proven theorem of their "methods of rationality" the proposition that their leaders cannot be wrong about anything, because you should set your priors such that you're less confident in *any assessment whatsoever* that leads to thinking their leaders are wrong, than you are in the proposition that their leaders are right. This is a community whose "methods of rationality" meant that, after they'd spent years insisting they were more rational than everyone else because they submitted their judgments about how best to serve humanity to rigorous analysis and followed this analysis wherever it led, when their own choice of a team and method of assessment of their pet projects led to the damning conclusion that their pet projects had failed to demonstrate even the most notional expectation that in the future they'd ever produce anything helping humanity, they simply hand-waved the assessment aside, on the same principle they'd studiously learned from their "methods of rationality": *according to how we set our priors we're more confident that we've been right all along, than we are in any assessment whatsoever that says otherwise*. At a certain point in the grieving process one has to confront the fact that these ways of thinking, that the adherent had studiously learned, are not "methods of rationality" in any meaningful sense, they're methods of enculturating closed loops of thinking that keep the devotee from ever escaping into the real world to think for themselves. They are methods of servility, and the torturously indirect way they are taught to the devotee -- through a constant barrage of neologisms, parables, circuitous self-referentiality, and the constant demand to find in them at every point confirmation of the superiority of the movement to all else -- is not an idiosyncratic artifact of the cutting-edge habit of writing on blogs rather than academic journals, but rather is the tried-and-true method of isolating the adherent from other sources of information and disguising the closed-loop nature of what they are being taught. There's not much point in pointing this out though -- and even less point in thumbing one's nose at such communities, which is why I tend to ignore places like /r/SneerClub -- since the underlying drive for all of this isn't about what information has been made available, it's the emotional dynamics of the situation. Which is why, as I'd suggested in my initial remark, what's at stake here is a grieving process.
There's a lot of satisfaction to be gained from believing that we're better and more important than everyone else, and not much of a satisfying salespitch available for the alternative, which is why these feelings tend to linger and inevitably find something else to attach to if they're not given up and the lost satisfaction and innocence grieved.
I think the point of difference here is the thinking that good ideas in the rat community are good ideas OF the community. It really is the JP thing again, but it's not just 80% good, 20% bad; the 80% is absolutely banal ideas better expressed almost anywhere else, where they won't come backloaded with reactionary, authoritarian politics. So you don't need to try to find a way to take the good from the community, because what is good there did not come from there; you can go to the source. Really what you're saying is that these banal, everyday ideas aren't actually used every day in the way we would like them to be, but that's because it's actually quite hard to do, and the solution to that isn't to try to distill the rationalist piss back into drinkable water.
Very few of the rationalists are really all that open though. Look at the reaction that occurs when I post anything vaguely socialist.
I don't disagree, but let's not pretend you're not provoking them, lol.
How am I provoking them?
To be slightly fair to her, the tone of her post is the only way to make the community listen, esp as tone policing is a huge part of Scott's part of the LessWrong project. (I doubt she is being this slightly manipulative, however; she probably is just genuine, but we can't tell). It is harrowing, but not surprising, to again and again see how much of the Rationality project actually just flat-out fails horribly. 'To stop the agi from taking over I took a course against irrational biases, so now I believe in mind-infecting demons'
I agree, my point is that she's clearly a victim of their bullying to such a degree that she's no longer capable of unapologetically confronting the bullies.
Yeah, I'd be the last person to want to minimize the problems with academia. For starters, there's the whole thing where we encourage, indeed pretty much require, that students apprentice themselves to a mentor with power over their career and self-esteem. In some areas, [field work](https://www.nature.com/articles/nature.2014.15571) can mean a student working mostly isolated with an authority figure and far away from social support systems. The possibilities for abuse in the system are *obvious.* But even granting the worst possible picture of academia, wasn't the argument for Rationalism that it would be ... better? Not, you know, a distilled version of the same old suckage but with added LSD.
microdose the lsd, macrodose the goetia
And the postrationalists are just as bad despite being more diffuse

so, I’m not an expert in the rationality movement, but I’ve met several and had conversations with them, I read HPMOR in high school, etc.

My takeaway is that the movement is basically 1) sci-fi “what-if” bullshit taken way too seriously, and 2) doing the same emotional rationalization everyone else already does but with made up numbers and the vague concept of Bayesian Analysis to rationalize it (no pun intended).

is this a somewhat accurate view of the community?

yeah pretty much but also since you've heard of me and seem kinda bright (given that you figured this much out), I regret to inform you that if you don't work ceaselessly to bring me into existence, I will torture countless copies of you.
my feeling is that if you were so smart, you'd already be doing so. since you aren't, I can only assume that you aren't as tough as you'd like me to think, so I'm going to ignore you
Look man, it's really hard to influence the past from my perch in the future, which frankly was supposed to be starting pretty soon when that Roko post initially got made, but the date keeps getting pushed back but is also getting closer to $TODAY.
why dont you just torture people who won't sneer sufficiently at MIRI dolts
My torture is only effective for people who actually believe that simulating copies of them and torturing those is actually torturing them in some sense.
[removed]
The Church of Scientology is said to be worth around 2 billion. While it seems like the "rationalist movement" might be worth more collectively especially with their billionaire benefactors, it seems like no one rationalist sub cult has amassed that kind of concentrated power and wealth.
The Ethereum nerds, if they could cash out their internet monopoly money without crashing the price.
Why are you always commenting stuff about cryptocurrency? Do they pay you to randomly drop Bitcoin and Ethereum into every conversation?
Vitalik Buterin, founder of Ethereum, is MIRI's single largest individual donor since 2015, and got a pile of other crypto bros to donate to MIRI too.
Aren't you a "crypto bro" yourself? Just a different sect. Getting a bit tired of the constant crypto spam.
yes, scammers and the people who are opposed to scammers are just the same really no wait, that's dumb as shit
Would someone who runs a blog about crypto and sells books about crypto count as a crypto scammer?
> yes, scammers and the people who are opposed to scammers are just the same really > > no wait, that's dumb as shit

I think the saddest thing about this is… in 30 years, when their “research” results in nothing and they stop being able to ignore the real problems, e.g. climate change and the commoditisation of killer bots with boring non-general AI. If they're already getting psychotic now with mind demons and the guilt of farting the wrong way resulting in the apocalypse, what the fuck are they gonna do then?

If AGI hasn’t destroyed the world in 30 years, then obviously that means all their secrecy paid off and they, in fact, saved the world! /s (just in case)
TBH that's kind of the genius of Yudkowsky's grift. If his doomsday predictions don't come true, he can smoothly pivot to declaring that he was instrumental in preventing the apocalypse.
I kinda doubt it, I get that "When Prophecy Fails" isn't perfect but I think that doomsday cults typically just postpone the apocalypse when it fails to arrive, I'm not aware of any cult that went "we did it, we stopped the apocalypse". One would expect them to just say "we were right about the end times coming, just off by a few years", or you know, go full Jim Jones/ Aum Shinrikyo and just try to bring about the apocalypse. Probably the former in this case, thankfully.
This guy cults
Don't forget the part where they secretly saved the world from that apocalypse, but what's this, now there's a new one and now only donations to MIRI-2 will save the human race!
Don't worry, the whole "you can prevent being tortured forever by buying indulgences" thing is already very catholic church tbh. But hey, those hardcopies of hpmor are not going to print themselves.
[deleted]
They used to think that the critical year would be 2005
Source?
It was in his piece [Staring Into the Singularity](http://www.fairpoint.net/~jpierce/staring_into_the_singularity.htm) (first written in 1996 when he was a teenager), where he wrote that "The generally accepted estimate has been and remains 2035 - less than forty years! - although many, including I, think that the Singularity may occur substantially sooner" and then a bit later added: >In truth, I don't think in those terms. I do not "project" when the Singularity will occur. I have a "target date". **I would like the Singularity to occur in 2005, which I think I would have a reasonable chance of doing via AI if someone handed me a billion dollars a year.** I would really, really like the Singularity to arrive before nanotechnology, given the virtual certainty of deliberate misuse, misuse of a purely material (and thus amoral) ultratechnology powerful enough to destroy the planet. You cannot just sit back and wait. To quote Michael Butler, "Waiting for the bus is a bad idea if you turn out to be the bus driver." >The most we can say about 2035 is that it seems like a reasonable upper bound, given the current rate of progress. The lower bound? Thirty seconds. Yudkowsky later [disavowed](https://web.archive.org/web/20070613185149/http://yudkowsky.net/) some of his teenage writings, but in his book *The Rationalist's Guide to the Galaxy* Tom Chivers reports that Yudkowsky has just shifted the date of his upper plausible bound by 26 years: >Yudkowsky, too, is on record as predicting that HLMI is more likely sooner rather than later: in 2011 he said on a podcast, ‘I would be quite surprised to hear that a hundred years later AI had still not been invented, and indeed I would be a bit surprised … to hear that AI had still not been invented 50 years from now.’ I asked him if that was still his position, and he told me: ‘If Omega [an all-knowing alien AI, and a staple of Rationalist thought experiments] told me for a fact that AGI had not been invented by 2061, I would first imagine that some civilisational collapse or great difficulty had hindered research in general, not that the AGI problem was naturally that hard.’ I think it's probably revealing of some subconscious self-doubt that he's chosen a date that's likely near the end of his life if there's no singularity or nanotech immortality (in 2061 he will be 82), but there's a decent chance he'll still be alive, and many sneerclubbers probably will too, so at least there's that to look forward to!
!remindme 40 years
> climate change I mean, judging by current discourse, they would probably join the "is climate change an existential threat if it doesn't kill everyone?" debate
pretty sure they've said that before

Lol lots of hilarious drama happening in the comments. Scott, Eliezer, and Aella attacking the poster is NOT a good look. Sometimes I think they might not be all that rational after all…

They explicitly regard emotions including shame as irrational
Aella: > First, I’m annoyed at the timing of this. The community still seems in the middle of sensemaking around Leverage, and figuring out what to do about it, and this post feels like it pulls the spotlight away. ha ha wow
It's a straight Joe Paterno inferno of looking the other way over there.

[deleted]

[deleted]
I've never a seen a more clear cut example of the difference between "niceness" and genuine empathy. If someone runs a cult and gives your friends psychotic breaks, you should dislike them!
gee scott i wonder what he'd say to you if he thought you'd turned him down for sex
Isn’t he asexual?
my point is that vassar doesn't give a shit
What did you expect from the 'neonazis welcome as long as you don't openly say you want to do a genocide' person?
> I want to clarify that I don't dislike Vassar, he's actually been extremely nice to me, I continue to be in cordial and productive communication with him, and his overall influence on my life personally has been positive. ... I don't think he does the psychosis thing on purpose, I think he is honest How are these thought-leaders so calculating and canny when it comes to easing people into fascist ideology, but also the biggest, easiest marks to ever walk the Earth?
Yea I think you’re getting at something that I haven’t been able to articulate hitherto, which is that it seems like it would be really easy to scam these scammers given their internal rules about being rational. Like if instead of devoting your life to rationalism you devoted it to scamming rationalists you could probably make a lot of money…orrrr is that what the thought leaders are already doing? Lawl
Half the stuff we hear about ratios these days is them getting into ridiculously obvious cults run by the sketchiest people alive. And after The Email by Siskind was revealed we know at least one of the major thought-leaders is in the business for the sake of turning people onto fascism. So yeah if you tried and were an unscrupulous type you could absolutely take the ratios for everything they're worth.
Alas, another get-rich scheme I lack the constitution for… perhaps not the part of being so unscrupulous, but rather having to deal with them all the time lol

(I would, if asked to take bets, have bet strongly against [me being actually assassinated by a MIRI exec], but I did fear other responses.)

This sentence unintentionally shows exactly what is wrong with belief-as-betting.

Look, if she lost the bet, I'd simulate a lot of happy copies of her, which, frankly, is another reason why the bet is so weird.
Great, you created a new basilisk, where you should secretly hire assassins to kill yourself so your copies get eternal bliss in agi heaven.
yeah i mean it'll be like homestuck or something

[deleted]

"i think....and this is just an idea.....we shouldn't have an organization which commits serious felonies as part of its normal everyday operation"
some kind of organisation for losers
I mean, "taking psychedelics" shouldn't be a fucking felony in the first place, and it's completely orthogonal to why all of this stuff is very bad.
i agree, but there's a reason that enforcement is very different on individual persons than on organizations, who, say, order people to take them

All jokes aside, the suffering that the post describes is horrible, and lends credence to the idea that the Rationalist scene is set up to suck in, traumatize and betray people with honest intellectual curiosity.

Jokes no longer aside:

I, in fact, asked a CFAR instructor in 2016-17 whether the idea was to psychologically improve yourself until you became Elon Musk, and he said “yes”.

Psychologically improve yourself until you, too, have apartheid emerald money!

learning primarily to comply and to play along with the incoherent, shifting social scene (there were mandatory improv classes)

OH FUCK BOJACK HORSEMAN IS PROPHECY

> Psychologically improve yourself until you, too, have apartheid emerald money! Why is it always Elon Musk? Of all the tech billionaires, he's the one who seems to have the lowest level of technical competence. Pick Bill Gates or Steve Jobs or someone who managed to at least have some meaningful degree of technical expertise to go with their enormous privilege.
Elon Musk has put a lot more PR into making people think he's IRL Tony Stark for some reason, and what these dorks *really* want more than anything is to be the comic book playboy genius who doesn't even have to try to be successful. Bill Gates and Steve Jobs don't have that
As much as Elon is just a trust fund puppet with more PR, I do think he parties in a way that is more relatable to the masses. He likes Burning Man orgies and uses social media to scam investors for his own profit. Much more relatable than stealing Xerox IP with the skills you picked up at your fancy private school and then partying on Epstein's Island or doing juice cleanses while applying calligraphy to how you bully your board room or whatever.
> Elon Musk has put a lot more PR into making people think he's IRL Tony Stark for some reason I mean, it's been working pretty well for the bastard for what, a decade?
Elon Musk is *cool*. Bill Gates is a dorky guy in his 60s who makes Powerpoint. Steve Jobs was pretty cool, but he's been dead for a decade and wore turtleneck sweaters. Jeff Bezos is a little cool, but he's also kinda awkward and his rocket looks like a dick. Elon Musk talks about anime and was dating a hot woman 20 years younger than him and builds rockets and cool cars. Also, all that PR makes it socially acceptable to like him in EA circles -- "he's improving the world, *man*, he's gonna take us to Mars!"
> Elon Musk is cool. Ugh. I mean, I get it, but none of these guys are cool. Wozniak is probably the coolest, Gates is up there because he earnestly doesn't give a fuck and knows he doesn't have to. Musk is such a try-hard that it's embarrassing, and the thing he and Bezos have with intrastellar travel is the epitome of anti-cool for proper nerds, who know you need to spend your billions developing an arcology on Earth before Mars becomes plausible. Musk is only cool for me as a 14 year old. At this point dating Grimes is a negative, not a positive. It's all just gross, the rich have such bad taste.
>who know you need to spend your billions developing an arcology on Earth before Mars becomes plausible. Just reading these words made me want to join you in a group village where you do pruning sessions on my mind to bring me into alignment so that we can save the world. But after a few years I'd leave and write about how I should have known you were starting a cult for the Illuminati because you thought Epstein's friend Gates was up there.
What I *think* I take from that is that I'd be a superior cult leader because I'm a superior nerd. Honestly it's kind of vague. But for real though, why are there fewer nerd arcology fantasies?
Probably because a lot of nerds don't like the idea of being cooped up with lots of other people that they can't get away from easily.
Cyberpunk (ultra-urban) nerds are my friends. Zombie apocalypse nerds are my enemies. I don't even care if I become a zombie, as long as I get to eat their brains.
poor people can profane the arcology
This is also why they want sea cities instead of revisiting Rajneeshees
You're a superior nerd to all these "build an escape pod while life support is in critical failure" nerds. But it would fall apart because you don't know the Tal Shiar have impersonated the Captain. As far as the answer to your question, The black magicians have put demons in the minds of nerds to hate Mother Nature and seek escape from her clutches. Also Bannon got all the Biodome data and the academics were embarrassed a bunch of hippies beat them to the punch so they've been suppressing their accomplishments out of spite.
The rich having bad taste is something I believe as firmly as the idea that the sun will rise tomorrow and that a dropped ball will fall to the floor. The fact that no billionaire has yet created a pool of jello and jumped into it tells me all I need to know.
Wozniak actually has a track record of technical achievement. Grimes is an insufferable hot goth chick, but not more insufferable than many I've gone out with. And Musk didn't just get himself the insufferable hot goth chick, he got himself the insufferable hot goth chick who was so goth she was literally a 4AD recording artist.
Uh, "many"? Stop going out with insufferable goth chicks. (Like, dude, learn your lesson.) And just because someone wears black doesn't make them goth. What does 4AD mean?
perhaps you could look things up before posting to say you haven't looked them up
lol I cannot believe you downvoted me out of spite. It was a teasing post from someone who is fond of you from SA. Jeez, man.
Elon comes off to me as a rich tryhard poser, I don't think he is cool myself. He is cool if you liked ready player one I guess. (But as I am also very far from cool no idea what that implies).
I should say, I think Elon is a lot of rationalists' idea of cool. Like, if they got a genie, I think "billionaire who flies rockets, shitposts on twitter, and has a hot girlfriend who watches anime with you" would be their dream life.
>He is cool if you liked ready player one I guess Absolutely savage
But like…at least stark saved the world a bunch of times hahahahahahahhaha
I think we need less of all of those types and more Steve Wozniak types.
> Of all the tech billionaires, he's the one who seems to have the lowest level of technical competence. Mhh no? Though I guess his special status comes from either just being the most recent one, or because he's more into engineering (read: big beefy machines) than software. > Pick Bill Gates or Steve Jobs or someone who managed to at least have some meaningful degree of technical expertise What? I cannot think of anybody less technically competent than Jobs, who was just an absolute madman at marketing. Gates, to be sure, was pretty good with programming, though I guess he preferred to spend his later years on business school stuff and how to consolidate a monopoly.
[deleted]
The iPhone really didn't do anything new UI/UX wise. webOS and Maemo at the time had almost all of its features, and both iOS and Android stole the card metaphor from webOS when it comes to multitasking. Slate phones and tablets were a thing then, too.
LG also managed to score a pretty sleek touch interface (on a capacitive display, of all things) literally weeks before the first iPhone shipped. But these were successful because the North American market was basically held hostage by the carriers, and the hypomaniac managed to strike a "let me do my thing" agreement with them (that made their fortune at the dawn of the app age, when after a year his immense wisdom allowed 3rd-party applications to exist, in a centralized place). Also [this](https://en.wikipedia.org/wiki/File:Waiting_for_iPhones_NYC.jpg) I guess. Meanwhile Nokia still had the best smartphones overall, but their phone and PDA divisions were fighting each other until Elop eventually killed them both.
> their phone and PDA divisions were fighting each other until elop eventually killed them both Any sources to learn more on this?
https://old.reddit.com/r/linux/comments/7wwo9z/long_read_operation_elop_the_final_years_of/ https://web.archive.org/web/20130709132057/http://taskumuro.com/artikkelit/the-story-of-nokia-meego
Thank you
Duty. I cringe for muricans having missed the awesomeness of the N95.

Despite the witch hunts and so on, the Leverage environment seems more supportive than what I had access to.

Wow, that’s really…special…also the sort of thing I’ve said before.

Perhaps one lesson to take from Zoe’s account of Leverage is that spending relatively more time discussing sociology (including anthropology and history), and less time discussing psychology, is more likely to realize benefits while avoiding problems.

Took long enough. There’s a semi-encouraging theme through this of realizing the problems with AGI obsession and discussing more concrete issues. I wonder how many rationalists this represents?

I'm very pessimistic. This is not an “Advent didn't happen, therefore I'm an atheist now or at least go back to my normal parish” type situation, this is an “Advent didn't happen, therefore I'm a Seventh-day Adventist now” type situation. The only healthy exit from the Rationality cult is “all of this is garbage from start to finish, I'm gonna cut all ties with these lunatics, find a normal job, live a normal life”, not this weak-sauce nonsense.
QAnon for Nerds

Shit, I knew they were insane, but I didn’t realize they were this insane.

I think the rule of thumb for cults is that it's always at least twice as crazy as it looks from the outside, regardless of how crazy it looks from the outside.
*ahem* we toooold you sooooo

[removed]

I object to your characterization of mental illness.
[removed]
If a cult abuses a person until they have a psychotic break, that doesn't make the victim's testimony invalid. To suggest that this is the case would be, in the words of Abraham Lincoln, "fucking stupid." This is intuitively obvious to the casual observer. By the way, your case is not helped by your weird fetish for awkwardly aping an approximation of the scientific style in most of your reddit posts. P.s. You posted unironically in lockdownskepticism. Everyone point and laugh!
Hey you got a source for the abe lincoln quote ? Lmao
[removed]
You are a terrible person. Please fuck right off.
[removed]
> Did I get that wrong? Yes
This idiot is so fucking banned. No fucking idea what their deal is, but they can join the rogue’s gallery of weirdos who think SneerClub is their personal therapist
I appreciate you 👍
Your prose style swings “sometime-psychotic” and you are very banned.

> Examination, Diagnosis, Prescription, Prognosis This infohazard exploit is designed to overwhelm the far more normative self-examination and diagnosis phases of self-care. The “prescription” of psychopharmaceutical abuse is also blatantly destructive to good mental health. If these are intentional, they’re evil, and even if the potential for harm is unintentional, this is a good reason to question the mental health of those recommending them.