r/SneerClub archives

epistemic status: long-time sneerer, first-time poster

Apologies for effortposting in a place that’s clearly meant for shitposting (not that there’s anything wrong with that), but this seems to be the place to talk about these things and I’m curious what y’all think.

I’m mostly an outsider to the Rationalist community (capitalized for reasons below), having come to it through a rare portal to the outside world: someone posted a Slate Star Codex link in an unaffiliated place, so I started following the blog, and then (regrettably) the subreddit. It sounds like what I went through is a pretty typical life cycle:

  1. Wow, these are really informed people who hold really interesting discussions, despite the occasional weird jargon and strange selection of recurring topics.
  2. Jeez, I’ve never seen someone say such a horrible thing with no apology or qualification; interesting that the rest of them don’t seem to mind.
  3. I never thought I was the kind of person to do it, but this seems like a nice congenial place to hold a discussion so I guess I’ll take a polite stand for basic human decency that almost everyone will agree with, since no one else has done it yet.
  4. Hm, there’s a lot of pushback, but maybe it’ll build character to have dialogues with these people and learn how they got to be like this.
  5. Well, at least it’ll sharpen my debating skills and help me realize what I believe in.
  6. Well, somebody has to die on this hill so at least we can say the good guys went down fighting.
  7. Ugh, this is turning me into a hostile crank and I’m sure I’m not changing any minds, so maybe I’ll just let the snowflakes have their safe space and quietly lurk for the less toxic topics.
  8. I don’t actually care what these people think about any topic or put any trust in the information that they seemed to possess, so I’m not learning anything here except that some people are too far gone and talking to them is fruitless, which is the opposite of what I expected.

I still follow the SSC blog because some of the apolitical posts are very good, taken with the right dose of salt, but there’s no entropy in guessing whose side the author will take any time he gets triggered by a hurtful slur like “r-cist” or “s-xist” or such. Early in the above lifecycle my friends were sick of hearing me recommend SSC to them; in the middle I stopped mentioning it to anyone because I was afraid someone who doesn’t know me well might make the wrong inferences about me; by the end my friends were sick of hearing me bitch about the white nationalists I argued with online. (It’s thanks to the SSC subreddit that I learned the difference between white nationalists and white supremacists.)

But aside from NEWS FLASH: BIGOTS ON REDDIT, my experience with this community has actually started to change my mind in some areas where I used to agree with them. Now I’m not so sure that liberalism is the perfect solver of every problem, that every controversy can be fairly and efficiently decided if we just enforce free speech, that if we respond to bad ideas with better ideas the latter will win and the truth will out, that thoughtful discussion among reasonable people will tend toward mutual understanding. Here we see the steelmen are running the asylum. Does every bad idea deserve to be discussed? Should Nazis be debated or punched? I used to take the debate bait but now I worry about how, if we make it out of this thing alive, those of us who didn’t punch the Nazis will live with ourselves. I always knew the openly hateful ones were monsters, but now I’ve gained a new disdain, as Dr. King warned us, for the well-meaning moderates who tolerate and enable and normalize them.

So this is my little personal crisis of belief: Does a philosophy like Rationalism/liberalism have inherent flaws and tendencies toward making us more certain of our preconceptions? Or did this community just get how it is because of which people happened to be in it?

Here are some traits that stand out the most:

American white-nationalism, sexism, LGBT-phobia, etc.

Thanks to SSC’s reader survey (which I participated in and semi-analyzed), we know the audience is mostly American, almost entirely white, and almost entirely male. Maybe eugenics and GamerGate and the Fourteen Words are just what white American men naturally talk about when they’re alone? I think this was my first really controversial comment in that subreddit (which caught me by surprise because I didn’t even mean it as a scold! …yet). If all of my friends were white American men, I might not have any empathy for other people’s experiences either. Or any idea of what it means to belong to a particular culture - even in movies and on TV they rarely have to go outside their own Default Culture. Or any notion that there might be other ways of having a government/society/country that are different from theirs and in some cases much better. To a normie it might seem like nothing worthwhile could possibly come from a bunch of white American men discussing the big issues of race and gender and international affairs among themselves (so why did I try for so long?). But maybe if you’re really dedicated to pure Rationality, you close your mind to all outside information from less enlightened beings because you know you can solve everything from first principles yourself - and untethered from reality, your first principles gravitate on average toward exactly the kinds of prejudices someone from your part of the world would have anyway. (You might think rationality means overcoming bias, including bias against groups of people, but, apparently, nah.)

The IQ dick-measuring contest

This one I don’t really get. Why should a community that’s all about building skills in something fetishize an inborn talent for that thing instead? Does it simply follow from the aforementioned focus on “race realism” and how IQ became the focal point of that in the 1990s? I’m not sure that explains all of it. One redditor was horribly dismayed by getting a perfectly normal result on an IQ test and (a long time later) Scott Alexander wrote a whole thing downplaying those worries, sort of. I didn’t really get a vibe of race panic. But in that blog post and the original thread, there’s plenty of comforting speculation that the test must have lowballed it because OP is a programmer who uses correct grammar and reads, so a rigorous disinterested trust in the validity of psychometrics clearly isn’t involved either.

Artificial intelligence as existential risk

What does AI even have to do with Rationalism? The only strong connection I see is that AI is both very hot among programmers and very lucrative, which is good news for someone trying to run a donor-supported nonprofit organization whose fans are programmers. I guess the fans are programmers because the community dates back to when the WWW was for early adopters (I first e-met Eliezer Yudkowsky on a webforum ca. 2000 when his shtick was transhumanism). If they had been mainly from some other field, would they all be losing sleep over the “climate change hypothesis” instead? Founding research institutes to prepare us for contact with malevolent spacefaring extraterrestrials? Donating to build an anti-meteor gun? The problem with infinitely bad risks is that there are so many competing ones.

Incidentally, amid all this futurism I don’t see much concern about the present, in which sketchy tech giants like Farcebook are already killing the world to feed their fledgling AI. Maybe because so many readers work there. (The SSC survey’s “Profession” question has three different categories for Computers, though almost all the respondents are the plumbers rather than the architects; for a good time, compare that list of categories with the BLS’s, but then read on.)

Anti-intellectualism

This encompasses not just the typical American skepticism of scientific authority when it doesn’t seem to get the answers you want (e.g. climate change, evolution, gender spectra are just cultural-Marxist and/or postmodernist brainwashing) but also a more fundamental philosophical view that academia does not have a valid way of understanding the world. Every once in a while, someone with real credentials in a trendy domain like genetics or economics will drop in to mention how jarring it is to see so many people talking so enthusiastically about their academic discipline with such esoteric vocabulary when the vast majority of it is utter horseshit that wouldn’t pass a 101-level class. One response I got when I did this was that someone, with no apparent irony, accused me of “LARPing” because the scientific establishment is clearly just pretending to epistemological “prestige” that can only be truly earned by studying the Sequences. PhD < Bayes’s theorem (< IQ).

This is, of course, the perfect description of what the Rational community is up to. Instead of labs they do their research in armchairs; instead of peer-reviewed journals they publish their findings in blogs (whose posts still get actual citations years later). But they’re creating a parallel world of alt-academia in fields that are already well trod by the genuine article, like philosophy and economics and quantum mechanics and oh-so-much genetics. They do happily admit real-world publications into their body of knowledge, but then they also admit pseudoscientists like that Kirkegaard guy or the crackpot virologist whom Peter Thiel paid to give people herpes in order to prove we don’t need the FDA. I think this is where Rationalists are the most cultlike and earn their capital R: not the abundance of unnecessary jargon/shibboleths, nor the alarming tendency to handle everything in their daily lives (even their health) through the community, but the whole ecosystem they have of theories and thought-leaders that are constantly discussed inside the community yet that no one outside has ever heard of.

Maybe this comes back to the evasion of empathy, the reluctance to give any weight to other people’s experience - a doctor’s opinions about health are just as irrelevant as an African American’s opinions about racism. In that sense it could just be one more battleground in the eternal conflict between rationalism and empiricism.

On the other hand, maybe this more than anything is where the loudest personality dominates the culture. Eliezer Yudkowsky is famously an autodidact, which is to say he never finished high school, and he holds a very explicit grudge against people who got their credentials the old-fashioned way, but he sure doesn’t need any to form his own worldview (his Facebook feed is a hoot, though a bit redundant with r/iamverysmart, sometimes verbatim 😉, e.g. after his incoherent essay about the Catalonia crisis the top reply had to inform him that the people there are called Catalans). (You might think rationality means being aware of the Dunning-Kruger effect, but, apparently, nah.) Yudkowsky also has a pronounced libertarian bent. The toxic personality of the leader might be what inspires or attracts the same in his followers.

V.

So. Some of these quirks make sense as following from the community’s demographics, but some might just follow universally from empowering yourself with critical-thinking skills and then getting a little overconfident. I bet a Rationalist culture that evolved out of HIV-superstition skeptics in sub-Saharan Africa or ex-Muslims in Indonesia wouldn’t tend toward all the same groupthink as the one that came from early anglo-digerati. But how many of these would still be natural endpoints for convergent evolution? Is the problem with Rationalism its Rationalists or its Rationality?

^(P.S. Does anyone know how to selectively delete all my comments in a specific subreddit, before the SPLC finds it?)

For the right wing stuff, I think it’s just a nasty side effect of how cultural values and attitudes aligned themselves in the culture wars. Left wing social progressivism has an inherent focus on empathy and lived experiences, with an increasing focus on mental health, and is mostly spearheaded by activists. Since our culture perceives logic and reason as the opposite of emotion, progressivism becomes kind of… icky to people who fetishize the former. Right wing ideologues have of course picked up on this and turned the framing of the centre-right/right as the bastion of logic and reason, and of left wing progressivism as bleeding heart emotionalism that is for “pussies”, into a massively influential cottage industry infested with pop-intellectuals who know how to play the part of the intelligent rationalist for an audience interested in that sort of discourse but not interested enough to fact-check.

Then confirmation bias takes over and it’s all downhill from there. Social justice is over-emotional, therefore it is illogical, therefore the centre-right is even more logical. Hey wow, eugenics wants to breed humans into a race of geniuses, now that’s what I call logical!

And the problem with rationality, aside from its obsession with logic as an ideology in and of itself, is that confirmation bias is far too subconscious for some podunk philosophy you picked up off the internet to deal with. But it’s pretty good at preventing you from viewing those subconscious biases through a critical lens.

You said in your above post that "rationality" is 'pretty good at preventing you from viewing your subconscious biases through a critical lens'. SSC was actually the first place I saw a pop writer introduce subconscious cognitive biases as a topic, attempt to critically examine them through the lens of current national discussions, and then, rather than lecture me on what I ought to believe, point me towards ancient and contemporary philosophers, doctors, scientists, economists, and social researchers - like Daniel Kahneman, who along with Amos Tversky originally coined the term 'cognitive bias' - for more reading. Incidentally, the term 'rationality' arose from the Kahneman/Tversky experiments in rational decision-making, as explained in Kahneman's book "Thinking, Fast and Slow", rather than as a renewal of Enlightenment-era Rationalism. Anyway, re: right wing ethno-nationalist bullshit, I think it's overrepresented in the online community because the rationalists are the only ones that bother to actually debate these people instead of immediately exposing them and casting them out.
Good for you, you learned to detect the subconscious biases that can be cleared with rationalism. But what do you do with the biases that don't conflict with rationalism, or are reinforced by it? Or the ones you can rationalize as being rational?

> Anyway, re: right wing ethno-nationalist bullshit, I think it's overrepresented in the online community because the rationalists are the only ones that bother to actually debate these people instead of immediately exposing them and casting them out.

Well, you know. The problem here is self-evident. The 80s-90s punk scene knew what to do about fascists. Punch that shit out, it's the only thing they understand.
Which biases are those?
Exactly

Three things I’ve noticed:

The sheer incuriosity. Can’t even Google, much less really dig for arxiv or openaccess or uh, less open access. Won’t follow trails.

No sense of history. Philosophy stopped with Plato. No understanding of what conditions and events various older thinkers (Hume, Foucault, Aquinas, Confucius etc.) were dealing with.

Very provincial, in that a common upvoted stance is: OF COURSE the “blue tribe” dominates any and every media conversation and a place is needed for intelligent pushback, also my childhood and education in the Bos-Wash axis and current career in SV mean I’ve experienced the full spectrum of American life.

Finally, rationalism is a wonderful and useful tool, but you can’t use rationalism to determine what rationalism should be used for. Maximize human flourishing? Minimize human suffering? Ensure humanity’s survival long term?

> The IQ dick-measuring contest
>
> This one I don’t really get.

Yudkowsky.

> Artificial intelligence as existential risk
>
> What does AI even have to do with Rationalism?

Yudkowsky. He literally wrote the Sequences to try to bring people to Rationality so they would believe him.

Basically, a lot of the weirdness is history. Remember that this is not SSC’s community - it’s LessWrong’s community, latest iteration.

Agreed. As of right now, Scott Alexander - Yvain on LessWrong - is responsible for [ten of the twenty-five most upvoted posts on that group blog, including three of the top five and the single most popular](http://lesswrong.com/top/). He is(/was, the community is way down in activity from its peak) a LessWronger, and a large part of the community at his blog is drawn from there.

I think it all started with transhumanism. Transhumanism naturally opens up the door to eugenics and IQ improvement. Now combine that with a demographic drawn from Silicon Valley techies and you get the rationalist community. Then you have the nrx stuff that comes from the exact same demographic, which pushes the community even further.

Incidentally, I think that any community that tries to market itself as “rational” will end up the same way, as an echo chamber. If you fetishize rationality and then convince yourself that you have a monopoly on it, you will eventually become less and less “rational”. If you get a bunch of high IQ white men with similar backgrounds in a community that self-identifies as rational, you will get the same result almost every time. You have to adopt certain explicit policies to avoid this.

Like take yourself, for example. You were one of the people that differed from them politically, and now you are out of the community. How many other people like you are there? I bet quite a lot. You can imagine how this makes a community more and more ideologically homogeneous in a very specific direction.

> Like take yourself, for example. You were one of the people that differed from them politically, and now you are out of the community. How many other people like you are there?

The thing is, I'm not sure I differed from *most of them* politically. There are certainly some very loud voices calling for ethnic cleansing in the US (with no apparent irony), etc., but I don't think they're the majority. Where I realized I differ from the rest of that community is that I don't think hanging out with white nationalists is fun, I don't think ethnic cleansing is a stimulating topic of discussion even for people who disagree with it (or even something to simply ignore if you're not interested), I think there are more important things going on in politics than which professional troll wasn't allowed to incite hatred at which public university, and so on. As they like to say, the problem was at the meta level more than the subject level.
yea my bad, I meant more so *ideologically* rather than *politically*
tl;dr it's all the Extropians' fault. (Bitcoin is too btw.)

> You can imagine how this makes a community more and more ideologically homogeneous in a very specific direction.

[Evaporative Cooling of Group Beliefs](http://lesswrong.com/lw/lr/evaporative_cooling_of_group_beliefs/) - as usual, Yudkowsky's somewhat insightful posts are never applied to his own beliefs or his own communities.
I mean if you think about it in retrospect, it's pretty obvious. Grab a bunch of rich privileged libertarians (or adjacent) and form a community on the basis of "rationality", and they will eventually end up thinking that they are at the top because they inherently deserve it, and that anyone who isn't at the top doesn't deserve to be. And of course because they have the superpower of rationality (cf. Harry Potter's wizardry) they have to be right.

> Evaporative Cooling of Group Beliefs - as usual, Yudkowsky's somewhat insightful posts are never applied to his own beliefs or his own communities.

But in their case it doesn't matter, because they are rational, and the best ideas will win in the end.
> opens the door to eugenics

By which you mean, literally coined by a eugenicist. http://journals.sagepub.com/doi/abs/10.1177/002216786800800107?journalCode=jhpa
[yea, sort of](https://lareviewofbooks.org/article/silicon-valleys-bonfire-of-the-vainglorious)

> But where Bernal and Huxley envisioned biological transformations that could potentially benefit society as a whole, this new cult of transhumanists, death defeaters, and allied techno-enthusiasts focused on the self: the perfection of body and mind as individual self-fulfillment. In California, the net and nanotechnology met Narcissus.

Neoreactionaries flock to anywhere they aren’t outright banned, and LW/SSC generally believe in free speech. Engaging with the comments on SSC is not recommended even if you like the content.

Read through this great post, then scrolled up and saw the username. Huh. I really admired your efforts trying to persuade people on r/ssc. You really did all you could, it seems :(.

So far as I can tell, the problem was the people… sort of. There’s still a lot of stuff in the Sequences that I really appreciate knowing. The only one I ever entirely disagreed with was the Fun Theory sequence, which wasn’t really in the same vein as the others. But all the people who could take that knowledge and leave, did. The remainder were the squabblers. Not necessarily incompetent, but far more fascinated by seeing what online argument they could pick at today than any actual goal. That it eventually became political was inevitable. Perhaps it didn’t have to end as badly as the This Week In Stupid threads that dominate the sub today, but it seems like a common outcome. Right-wing views are better adapted for that kind of environment.

Or, to try to describe the problem with the people differently: if you don’t respond to ~~nazis~~ IQ-concerned pro-business strength-preferring ethnonationalists with “fuck off, nazi” then you are implicitly saying “fuck off, minorities,” whether or not you believe you are or would agree with that sentiment. This is a hard idea for remainder-Rationalists to grasp, because it’s very, very unfair that you have to choose one or the other, and very, very unfair that your actions will have meanings beyond what you deliberately try to imbue them with. Leaver-Rationalists - and I do include Yud in that category, for all his weirdness, he is at least doing something - have no such problems: they can actually implement the advice of “do what works to win instead of trying to console yourself with the satisfaction that you lost while following your code.”

Hey, man. I miss your posts over on /r/ssc (that’s how I found this actually - was looking at your user history, wondering where you went).

I’m sorry you felt your posts weren’t doing any good. I think they probably were, though. If nothing else it makes those of us who agree with you feel better, but also, I think you can change minds even if you’re not persuading the people you’re talking to right away (see The Unit of Caring on changing her mind.) And it’s one of the only places these people get pushback that they can’t write off immediately. It’s not like they’re going to go about their lives not thinking these things if you ignore them.

Anyway, it’s late, this is a bit incoherent, sorry. I just wanted to put in my appeal for liberalism and debate even if reasonable ideas aren’t winning yet.

[deleted]

Welcome! I'm a relative newcomer here too and I've only recently started participating beyond this post. But as I browsed older ones I gradually realized that it's not just me, and a lot of people have gone through a very similar life cycle, so I tried to capture the whole thing in one big screed and I'm glad people who've had the same thoughts are still finding it.

Yeah, since I wrote this I've become even more convinced that the whole expansive Rationalist community, with all its diverse intellectual interests far beyond science and math, is LARPing academia. People are trying to be amateur polymaths, and even thinking they're coming up with brand new ideas about highly esoteric topics, in blog posts they wrote while their code was compiling. I definitely sympathize with that motivation, as an Actual Academic, and wish it could be that much fun all the time. But as anyone who's in academia knows, there are a lot more people than ideas, and the only way to come up with a truly original one is to spend decades of your adult life becoming one of the world's few experts in a very narrow slice of the field as your day job - even then it's hard. [This is the cartoon we show to every newly minted PhD.](http://matt.might.net/articles/phd-school-in-pictures/)

With apologies to Sagan or Plato or probably others, human knowledge is a big dimly lit exploration of a dark cave. The first spelunkers just had torches and candles, and they didn't get very far or notice the fine details, but they mapped out a lot of the nearest chambers and we still use the names they gave to those areas. Subterranean cartography really took off once we had electric lights and an organized system to build a grid and keep the explored areas permanently lit. But the Rationalists are choosing to ignore all passages that aren't illuminated specifically by shiny new LED lamps (even the clearly marked signs that say *This tunnel is a dead end*), and starting their own new map from scratch, just cuz. I will happily agree that an LED is objectively better than an incandescent, and that's what we should use in new expeditions, but there isn't really an overwhelming need to go replacing every fluorescent bulb currently buzzing along the paths (especially not with their specific favorite brand of LED) and retracing all the routes when there's still so much darkness to chart. It might seem fun to be at the frontier of the (re-)explored world, but a lot of the time they're still wandering off the grid like the ancients, and falling down into the pits that they would have known about if they'd been able and willing to read the map.

Well for starters “rationalism” is an unrelated philosophical school they’ve kinda co-opted, and “liberalism” is a relatively broad term in political philosophy that, to be fair, tends to get used to mean “Milquetoast center-left neoliberal with a vague commitment to respectability politics and pacifism” nowadays

Can confirm, am a milquetoast center-left neoliberal with a vague commitment to respectability politics and pacifism.
Gross
Talky well-meaning fluffy liberal of pissweak ideology here too. If I couldn't meme I'd be hopeless. *Maybe I am!*

It’s both!

I really like rationality, but I found reading Jay Heinrichs’s “Thank You For Arguing” far more edifying than “AI to Zombies” - and despite being a bit academic in parts it really improved my social life. I already understood logic; I didn’t understand people.

Rhetoric is a discipline that recognises rationality as a single component of making a good argument (logos). The rationality movement irrationally elevates logos over pathos and ethos, when all three are critical components to communicating like a normal human person.

As a result, as a movement it attracts people who like logos and who disdain pathos and ethos. Those are often but not necessarily people who are good at the former and not necessarily good at the latter. So we have people who either don’t want to or can’t act in a normal way.

Tl;Dr Rationality doesn’t help you to not be a dick, so dicks like it because it doesn’t require them to change.

From my point of view, I discovered that the Rationalist phenomenon existed as a unified movement after discovering /r/badphilosophy through my own pre-academic frustrations with Sam Harris (I guess I’m basically an academic now as well, ugh), having previously been vaguely aware that some of the key players existed but not that they composed an Illuminatum/us/i until somewhat later.

I don’t know if it’s a matter of my personal upbringing, my schooling, or my geographical location, but a common thread I’ve encountered in “discussions”, whether about Sam Harris, some similar figure or issue, or Rationalism, is that it’s taken as a dogma, one that I genuinely wasn’t familiar or even especially well acquainted with until my early twenties, that freedom of speech is not only a worthy political ideal but is actually the substrate of political engagement and organisation. This ideal seems to hold that not only should you not be restricted in terms of what you can say (a worthy if defeasible ideal), but that the good can only be achieved if this is upheld - a very strange idea to me, and one that I had thought was only justified by some highly contextually motivated political ephemera by some British and American authors of the Enlightenment period and a fairly simplistic Utilitarian argument from Mill.

Interestingly, this turned out to be a different dogma than I was familiar with. Growing up in a faintly Machiavellian political household and reading The Guardian and The Times, I had taken it to be a rhetorical tool, one which was employed by journalists in order to evoke a particular political tradition so that the political order might be influenced, by the pathos of its invocation, to give an ear to a particular notion that would not otherwise have been listened to.

But no, it turned out that people unquestioningly, for I never was furnished with any general argument on the matter that extended beyond an uncomprehending bafflement, held this to be an incontrovertible truth. The more or less Rousseauan but also, I think, human, counterpoint that a just society, and a just conversation, would be best achieved by mutual agreement about what is “on the table”, and that individuals are not merely individuals but participants in a fundamentally social process, one which admits of justified demands to “shut the fuck up, you moron, you don’t know what you’re talking about”, had not even been considered.

This plays into a rebuttal of the Rationalist game in the sense that Rationalists are supposed to have reasons for their beliefs, but as Hume observed of the Modern Rationalists (a group entirely different in their conclusions, but not necessarily in argumentative style, from these “Rationalists”), sentiment matters as much as reason, as it does for everyone, and this is not necessarily any bad thing - so long as you acknowledge it and act on that as a fact of the matter at hand. Unfortunately this is not something I see recognised in the ahistorical and anti-philosophical aspect of the people there involved.

So I find myself chuckling at your experience, as a tyro of Rationalism, because it so easily reflects what I was bemused at, and continue to be bemused by, in the credulous supposition on the part of both yourself and the Rationalists, that surely open discussion of any and all matters is worthy of the good. Because it simply never occurred to me that that kind of discussion would always be honest, would always be genuinely truth-seeking, and would generally move, in an almost teleological manner, towards the actual truth, and towards the good. It also never occurred to me, as the scion of veterans of political bullshitting, that this was the necessary or sufficient substrate of any genuinely truth-seeking enterprise: it never occurred to me that a polite tone of voice was a necessary or more importantly sufficient condition of participating in a genuinely truth-seeking enterprise.

I think my own intuitions, attitudes, and upbringing - that is, those intuitions and attitudes, inculcated by my upbringing, which say that you can swear and bitch and moan as much as you like when you know what you’re talking about over somebody who doesn’t - have been roundly justified by the ridiculous and anti-scientific attitude taken as read by the Rationalist community.

i luv yu
> This plays into a rebuttal of the Rationalist game in the sense that Rationalists are supposed to have reasons for their beliefs, but as Hume observed of the Modern Rationalists (a group entirely different in their conclusions, but not necessarily in argumentative style, from these "Rationalists"), sentiment matters as much as reason, as it does for everyone, and this is not necessarily any bad thing - so long as you acknowledge it and act on that as a fact of the matter at hand. Unfortunately this is not something I see recognised in the ahistorical and anti-philosophical aspect of the people there involved.

That's pretty spot on.

> epistemic status:

At this point I knew it would be good.

It’s the inevitable detritus of the Californian ideology and cybernetic totalism.

http://www.metamute.org/editorial/articles/californian-ideology

https://www.edge.org/conversation/jaron_lanier-one-half-a-manifesto

> If they had been mainly from some other field, would they all be losing sleep over the “climate change hypothesis” instead? Founding research institutes to prepare us for contact with malevolent spacefaring extraterrestrials? Donating to build an anti-meteor gun? The problem with infinitely bad risks is that there are so many competing ones.

Possibly interesting: the Bay Area in the 70s, with a big community around space colonisation and nanotechnology.

I’m an AI safety guy because scale/neglectedness/tractability. All of these are pretty bad (not infinitely, because I’m mostly a person-affecting wimp), so you’ve gotta look at

  • Exactly how bad they are
  • What we can do about it

to decide which one to look at. (Also: you notice multiple infinitely bad things, and you’re not horrified that we live in a world where we don’t actually have the resources to deal with everything infinitely bad at once?)

Climate change is probably quite bad, with a few percentage points of x-risk, and several more of some seriously bad outcomes, but it’s not really neglected - everyone already cares about it. If I decide to become a lobbyist against climate change or something, then I join the thousands of other lobbyists nudging around the tens of billions that are already spent on it.

Malevolent spacefaring extraterrestrials are probably quite bad, but I’m not sure how bad - the Fermi paradox means it’s hard to say anything concrete about this. Somewhere between 0% and 100%. Nonetheless, I’m not sure if there’s much that can be done about this - if there’s already a Berserker probe swarm out there, then you can get an approximate remaining lifespan of the Earth by dividing distance by speed. This might be promising to work on, because there are probably some game-theoretic and decision-theoretic considerations and some policy problems that are as-yet unsolved, and not many people are working on aliens.

Meteors are probably quite bad, but they’re not really that bad, when you look at just how few meteors there are. The meteor that killed the dinosaurs hit tens of millions of years ago, which means that we’ve got ages. I could do nothing about this, and my grandchildren could do nothing about this, and in a thousand generations the posthumans could do nothing about this, and that wouldn’t be that dangerous.

AI is probably quite bad, also with a few percentage points of x-risk. And it might actually be something that people can do something about: it’s “just” a software problem that can be solved by thinking about it for enough time (and if safe AI is actually impossible, that might be important to know). And there hasn’t been much work on it so far, anyway, so there’s a reasonable chance that there is new stuff that hasn’t been uncovered and low-hanging fruit.

If you find something that is a bigger deal (in the sense of a single new researcher could do more to improve the future working on it) than AI safety, then there are quite a few people who would be interested, and I recommend you write up a report with a quantitative model and stuff. You could get enough people on board to actually save the world!

(For example: you brought up present day AI and hostile extraterrestrials as problems. How much harm are these doing? How many people are working on preventing the problems? How much extra good would one person be able to do? If the answer to these questions is a lot/not many/a lot for either of them - which it may well be - then that would make it very pressing, and it might be useful to spend a few hours putting numbers on things.)
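To show what I mean by putting numbers on things, here is a minimal sketch of that kind of scale/neglectedness/tractability comparison in Python. Every number and the `marginal_value` helper are invented placeholders, not anyone's actual estimates; the point is only the shape of the calculation, not the ranking it spits out.

```python
# Toy scale/neglectedness/tractability comparison.
# All numbers below are made-up placeholders for illustration only.

causes = {
    # scale: badness of the outcome (arbitrary units)
    # probability: chance of the bad outcome
    # tractability: fraction of the risk one extra worker-year removes
    # workers: people already working on the problem (neglectedness)
    "climate change": dict(scale=1e9, probability=0.03, tractability=1e-6, workers=100_000),
    "hostile aliens": dict(scale=1e9, probability=0.005, tractability=1e-9, workers=10),
    "meteor impact":  dict(scale=1e9, probability=1e-5, tractability=1e-5, workers=1_000),
    "unsafe AI":      dict(scale=1e9, probability=0.03, tractability=1e-5, workers=300),
}

def marginal_value(c):
    """Expected badness averted by one additional worker, with crude
    diminishing returns as a field gets more crowded."""
    expected_badness = c["scale"] * c["probability"]
    return expected_badness * c["tractability"] / (1 + c["workers"]) ** 0.5

# Rank causes by how much good one extra person could do at the margin.
for name, c in sorted(causes.items(), key=lambda kv: -marginal_value(kv[1])):
    print(f"{name:15} marginal value ~ {marginal_value(c):.3g}")
```

Swap in your own estimates for present-day AI harms or anything else and see whether it floats to the top; that is the whole exercise.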

> Wow, these are really informed people who hold really interesting discussions, despite the occasional weird jargon and strange selection of recurring topics.
>
> Jeez, I’ve never seen someone say such a horrible thing with no apology or qualification; interesting that the rest of them don’t seem to mind.
>
> I never thought I was the kind of person to do it, but this seems like a nice congenial place to hold a discussion so I guess I’ll take a polite stand for basic human decency that almost everyone will agree with, since no one else has done it yet.
>
> Hm, there’s a lot of pushback, but maybe it’ll build character to have dialogues with these people and learn how they got to be like this.
>
> Well, at least it’ll sharpen my debating skills and help me realize what I believe in. Well, somebody has to die on this hill so at least we can say the good guys went down fighting.
>
> Ugh, this is turning me into a hostile crank and I’m sure I’m not changing any minds, so maybe I’ll just let the snowflakes have their safe space and quietly lurk for the less toxic topics.
>
> I don’t actually care what these people think about any topic or put any trust in the information that they seemed to possess, so I’m not learning anything here except that some people are too far gone and talking to them is fruitless, which is the opposite of what I expected.

lol owned

This was my experience, too, except steps 1-3 were my friend and steps 4-7 were me

Labor aristocracy.

edit: Hi I’m MarxBro btw, your username looks familiar Epistaxis, did I ever get in any funny arguments with you?

Probably, because you get in arguments with everyone, but you might have been using one of your many alts because you kept getting banned. Hi.
My main alt was tankiegirl, I can't remember the other ones really. I'm a dang political prisoner! Actually I'm more like when Lenin got exiled.

I am offended that you think rationalists = people who frequent SSC. That’s not the central rationalist hub, https://lesserwrong.com is. SSC people discuss politics and race much more.