(idea shamelessly stolen from u/ValarPatchouli)
This is an experiment in seeing how much of rationalism is salvageable. It is not a thread for waging the culture war or for sneering at what other rationalists are doing elsewhere. We’ve got the rest of the subreddit for that.
I’ve gone ahead and seeded a few top-level discussion starters, but feel free to add. I would politely request, though, that we avoid overtly political topics and current events for the time being.
Not that this thread isn’t a good idea, but I don’t know how much can come out of this: very little of rationality, particularly the instrumental rationality improve-your-life stuff that Yudkowsky ostensibly started with, is salvageable.
There’s a bit in Thinking, Fast and Slow where Kahneman tells a story about the time he went to an investment bank or hedge fund or whatever and gave a talk proving that “None of your investors can beat the market. Anytime it looks like anyone can, it’s a statistical artifact, and they regress to the mean the year after.” He said that after the talk was done, the bankers nodded politely, admitted that everything he said was true, and then he left and never heard from them again. Kahneman says he has no doubt that they believed him on some level, but the implications of what he said (basically, that their entire endeavor was a waste of time) were so big that they had no choice but to pretend none of it ever happened and carry on doing exactly what they were doing.
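If you want to see the regression-to-the-mean point for yourself, here’s a minimal sketch (my toy numbers, not Kahneman’s): simulate a thousand managers whose yearly returns are pure luck, pick the top decile by year-one performance, and watch them land right back at the average in year two.

```python
import random
import statistics

# A minimal sketch with made-up numbers (mine, not Kahneman's):
# if yearly returns are pure luck, last year's star performers
# are just average again the year after.
random.seed(0)
N_MANAGERS = 1000

# Two years of returns, drawn independently: skill plays no role here.
year1 = [random.gauss(0.07, 0.15) for _ in range(N_MANAGERS)]
year2 = [random.gauss(0.07, 0.15) for _ in range(N_MANAGERS)]

# Take the top 10% of managers by year-one performance.
cutoff = sorted(year1, reverse=True)[N_MANAGERS // 10]
winners_year2 = [y2 for y1, y2 in zip(year1, year2) if y1 >= cutoff]

print(f"everyone, year two:   {statistics.mean(year2):+.3f}")
print(f"'winners', year two:  {statistics.mean(winners_year2):+.3f}")
# Both land around +0.07: the apparent edge was a statistical
# artifact, so the winners regress to the mean.
```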
The rationalist crowd is a lot like that re: instrumental rationality (which is deeply ironic, considering that Kahneman is the John the Baptist to Eliezer’s Jesus). Better rationality is probably not going to improve your life in any way; the major predictors of success at “achieving your goals” are almost certainly your parents’ socioeconomic status, how charismatic/manipulative you are, sheer dumb luck, degree of education completed, and level of executive function, probably in that order or close to it. In everyday life, rationality comes in sixth at best, and probably actually has a slightly negative effect. And they know this! Scott pointed it out years ago. But just like the Kahneman talk, that knowledge was swept under the rug, because admitting it would mean admitting the whole thing was pointless.
But the truth seeped in anyway. CFAR has given up on instrumental rationality and pivoted completely to AI risk, Eliezer doesn’t seem to talk about it much anymore (although I don’t read his Facebook stuff, so maybe he’s still trying there, idk), and even outside the Culture War threads, all /r/ssc ever wants to talk about is politics and other people. The remaining attempts at self-improvement (e.g. Dragon Army) have nothing to do with rationality and are instead cargo-culting things that Bay Area nerds see successful people doing (fascism! the military! etc.). This is all a tacit admission that the original mission of “achieving your goals with rationality” has failed completely.
I brought up this quote before in one of the other Dragon Army threads, but I’ll mention it again: “Fanaticism consists in redoubling your effort when you have forgotten your aim” (Santayana). That’s exactly what happened to the rationalists. Once instrumental-rationality self-improvement proved hopeless and there was no goal left, the movement quickly sank into the anti-SJ circlejerk it is now. (At least the online portion did. I’m told that the Bay Area IRL rationalist/poly social scene is a bit better and hasn’t totally descended into right-wing nuttery, but I don’t live there and can’t comment.)
The community did have some problems and mistaken assumptions right from the beginning that helped lead to its current state (its hatred of lived experience as empirical evidence and its love of contrarianism-for-its-own-sake both contributed), but I don’t totally agree with the claims that it was evil from the word go. I think the single biggest factor in the shitheap the movement is now is the aimlessness that set in once the original goal was no longer viable but no one could admit it. And since, like I said above, instrumental rationality won’t help you much, trying to get the good parts of it back just sets things up for this to happen all over again.
There is a small domain in which better rationality might help: if you’re a scientist or researcher of some variety, please do read all the stuff about how thinking can go wrong, and then all the stuff about pitfalls in statistics and research that have made so many published studies worthless. Rationality will help you out a lot. But for working schmucks like the rest of us, there’s just nothing here but the insight porn.
You can’t tell me what to do
What the hell do you mean when you refer to “rationalism”?
Meta
Is this a good idea? Should this be part of SneerClub or a standalone subreddit? What should the rules be?
Pancakes: too much work? Is the blessing worth it? They never come out right!
:D.
About design thinking: the first step is usually to talk to people and ask what they want.
The second is to ignore half of what they’re saying and get to nuggets of truth by observing them as well as you can.
It seems to me, however, that gauging the feel of SneerClub is a perfectly doable thing, that it counts as observing, and that it has already been done to a degree. So let’s retrace and ask the unified questions; the answers might not lead straightforwardly towards a solution, but they are always super illuminating when it comes to recognizing actual needs.
(I’m stopping now.)
So, it would be helpful to see short-ish answers to:
What did you like rationality for?
What did you dislike rationality for? (Keep it brief: since we’re in SneerClub, it’s probably easy to get carried away.)
What would you like in a new site?
well, “is scientific racism necessary to be truly rational?” is culture war-ish.
so instead, let’s ask: “is thinking lots and lots about torture as the Bad End necessary to be truly rational?”
so you know I wrote a book about why bitcoin is stupid, right. Well, it’s sold remarkably well - it’s the first CreateSpace self-published book ever to get into the NY Review of Books! - but it’s been nearly a year, and sales are slowing down. So I really need to haul arse on the next book.
So that’ll be two books. One will be another bitcoin/blockchain book, because I have somehow got myself a second part-time job as a finance journalist of sorts. (People give me money for this, speaking gigs, consulting.)
The other will be the art project that’s nagged at me for years, and which Elizabeth Sandifer has been nagging me to do for years as well, particularly since I helped work on her Neoreaction a Basilisk: working title Roko’s Basilisk.
You might think this has zero sales potential. But! Tom Chivers - a really good science journalist, the sort whose byline you see and think “this will be good and not suck” - has a contract for a book about the rationalists! (And yes, I’ve been feeding him research material and links. We figure the books should help sell each other - one book is an oddity, but two books is a movement.)
[CONTENT WARNING: excessive fondness for TORTURE TROPES]
So I spent eight months writing 0 words, because I had no idea how to approach this. But I finally sat down and wrote 880 words of an intro, adapted from a Tumblr rant on transhumanists. And this weekend I wrote 320 words of an alternate intro … and started on the chapter handling Yudkowsky’s predilection for torture in his thought experiments.
Like, real life has no-win situations where some nonzero number of people are gonna suffer or even die, and it’s your decision - say, you’re a doctor or a politician: your job will involve this. So there are serious ethical questions here, and philosophers working on them seriously, and their work does actually inform the decision-makers.
The philosophers’ usual end point in matters of life or death is life or death, though. They don’t have the urge to … turn it up to eleven.
So there’s the notorious “Torture vs. Dust Specks” (the essay where Yudkowsky asks whether it’s better for one person to be tortured for fifty years than for 3^^^3 people to each get a dust speck in the eye). But even though the original Roko post positing the basilisk doesn’t use the word “torture”, the commenters, including Roko, went straight to that word to describe the Bad End. And it’s in other posts, and other people’s posts on LessWrong. It’s in the subculture’s collection of local tropes. Talking about torture is just normal for fearless idea-exploring Philosophy Tough Guys.
The trouble with starting a project like this is that you have to do the research. So I found this on Amazon for 99p.
so of course I hit libgen. Dark Lord’s Answer is something Yudkowsky wrote before A Girl Corrupted. He notes in the outro that this helped him get it together to write that work, and he has no idea if it’s any good, but hey, might as well put it out!
The story is a didactic economics parable in novella form, on the value of monetary policy rather than a rigid gold standard - “So you see, Prince, that you’re not being told to steal from your country of Santal. Even if, to save it, you must transgress the righteous rules against usury and adulterated coinage.”
So far so good. You know this literary form. Gets dull at novel length, but you can get away with it perfectly well as a novella.
But the blurb warns: “Content warning: sexual abuse, economics.”
A masochistic slave girl is the author’s mouthpiece, speaking in economics lecture notes - and several of the ethical dilemmas concern physical and sexual abuse of her. (None of it on-screen, thankfully. Though the loved one points out that keeping it off-screen then drags the reader in, as you’re put in the position of imagining it for yourself.)
The Prince is disparaged as too virtue-oriented and insufficiently consequentialist in his ethics to save his country’s economy, based on his responses to these ethical dilemmas - e.g., not taking up the offer of abusing her.
But it’s OK, ’cos she’s consenting!! (This is an actual excuse a rationalist made when I and others pointed out that this work’s sexual psyche has a number of issues, e.g. the misogynistic overtones. At which point he demanded an explanation of what could be misogynistic about it.)
It reads like a kinked-up version of the Sequences. Gratuitous sadomasochism for flavouring, and to provide Yudkowsky’s favoured style of Philosophy Tough Guy ethics test. Something edgy.
(At least the economics is reasonably normal stuff - though that makes the torture porn more jarring, not less.)
While you can say “ehh, his kink is his kink” and that’s probably fine amongst consenting adults … you’d have to be as tone-deaf as a rationalist not to have a fucking shitload of red flags spring up at this particular fearless exploration of ideas - and how fearless exploration of this particular idea just keeps showing up in his nonfiction as well.
So not the greatest or most charming work. Two stars out of five, ’cos it’s grammatically accurate and spelt properly, the plot is coherent and makes sense, and it’s not quite one-star bad.
But I can’t just pretend it doesn’t exist, when I’m talking about “Torture vs. Dust Specks”, and I can’t not talk about that essay.
So my problem now is: how the arsing fuck do I write this without repelling any reasonable human from wanting to read it? I ran the chapter past the loved one and she said “It’s well written, but I don’t want to read about any of these people ever again.” This is not the reaction I need.
hopefully in other chapters I won’t find myself falling down a rabbit hole of profound distastefulness like this. (ha.)
But … jesus fuck, Yud.
How did you deal with your complex feelings towards Yudkowsky?