r/SneerClub archives

Weekly Anything But Culture War Thread

(idea shamelessly stolen from u/ValarPatchouli)

This is an experiment in seeing how much of rationalism is salvageable. It is not a thread for waging the culture war or for sneering at what other rationalists are doing elsewhere. We’ve got the rest of the subreddit for that.

I’ve gone ahead and seeded a few top-level discussion starters, but feel free to add. I would politely request, though, that we avoid overtly political topics and current events for the time being.

Not that this thread isn’t a good idea, but I don’t know how much can come out of this: very little of rationality, particularly the instrumental rationality improve-your-life stuff that Yudkowsky ostensibly started with, is salvageable.

There’s a bit in Thinking, Fast and Slow where Kahneman tells a story about when he went to an investment bank or hedge fund or whatever and gave a talk proving that “None of your investors can beat the market. Anytime it looks like anyone can, it’s a statistical artifact and they always regress to the mean the year after.” He said that after the talk was done, the bankers nodded politely, admitted that everything he said was true, and then he left and never heard from them again. Kahneman says he has no doubt that they believed him on some level, but the implications of what he said (basically, that this entire endeavor is a waste of time) were so big that they had no choice but to just pretend none of it ever happened and carry on doing exactly what they were doing.

The rationalist crowd is a lot like that re: instrumental rationality (which is deeply ironic, considering that Kahneman is the John the Baptist to Eliezer’s Jesus). Better rationality is probably not going to improve your life in any way; the major predictors of success at “achieving your goals” are almost certainly socioeconomic status of parents, how charismatic/manipulative you are, sheer dumb luck, degree of education completed, and level of executive function, probably in that order or close to it. In everyday life, rationality is at best sixth, and probably actually has a slightly negative effect. And they know this! Scott pointed it out years ago. But just like the Kahneman talk, that knowledge was swept under the rug because admitting it would mean admitting the whole thing was pointless.

But the truth seeped in anyway. CFAR has given up on instrumental rationality and pivoted completely to AI risk, Eliezer doesn’t seem to talk about it much anymore (although I don’t read his Facebook stuff, so maybe he’s still trying there, idk), and even outside the Culture War threads, all /r/ssc ever wants to talk about is politics and other people. The remaining attempts at self-improvement (e.g. Dragon Army) have nothing to do with rationality and are instead cargo-culting things that Bay Area nerds see successful people doing (fascism! the military! etc.). This is all a tacit admission that the original mission of “achieving your goals with rationality” has failed completely.

I brought up this quote before in one of the other Dragon Army threads, but I’ll mention it again: “Fanaticism consists in redoubling your effort when you have forgotten your aim,” and that’s exactly what happened to the rationalists. Once self-improvement instrumental rationality was hopeless and there was no goal, it quickly sank into the anti-SJ circlejerk it is now. (At least the online portion did. I’m told that the Bay Area IRL rationalist/poly social scene is a bit better and hasn’t totally descended into right-wing nuttery, but I don’t live there and can’t comment.)

The community did have some problems and mistaken assumptions right from the beginning that helped lead to its current state (its hatred of lived experience as empirical evidence and its love of contrarianism-for-its-own-sake both contributed), but I don’t totally agree with the claims that it was evil right from the word go. I think the biggest single factor in the shitheap the movement is now is the aimlessness once the original goal was no longer viable but no one could admit it. And since, like I said above, instrumental rationality won’t help you much, trying to get the good parts of it back is just setting up for this to happen all over again.

There is a small domain in which better rationality might help: if you’re a scientist or researcher of some variety, please do read all the stuff about how thinking can go wrong, and then all the stuff about pitfalls in statistics and research that have made so many published studies worthless. Rationality will help you out a lot. But for working schmucks like the rest of us, there’s just nothing here but the insight porn.

I must admit, high-quality insight porn is one of my favorite things; it led me to old Less Wrong in the first place, and it is what I miss most. Having said that, thank you for this post. It makes me think harder about what it is exactly that I gained from rationality.
I can absolutely see your points here. However:

> the major predictors of success at "achieving your goals" are almost certainly socioeconomic status of parents, how charismatic/manipulative you are, sheer dumb luck, degree of education completed, and level of executive function

Let's look beyond self-improvement and assume for a minute that I'm all sorted out on the "achieving my goals" front, and I've reached the point where I can actually consider doing good for strangers. Wouldn't that be where the general ideas of this rationalism stuff - evidence, counteracting your own cognitive biases, etc., and finding some measure of objective truth - would come in and help me decide where to direct effort? Y'know, like effective altruism but not shit.
Yes, but not by much. While I'm a big believer in Effective Altruism, I think it suffers tremendously diminishing returns, and that once you've gotten past the obvious pitfalls of really shitty charities and feel-good causes that don't actually help anything, all the additional effort you put in doesn't make your dollars go much further. And actually, I'm pretty convinced now that all the overthinking of EA has made its marginal utility go negative, so they're worse off than if they had just filtered out obviously bad charities and called it a day. All this crap about AI risk and electron suffering: it's certainly not my biggest gripe with rationalists, since at worst it's a harmless waste of time, but it doesn't speak well to the theory of rationality improving the effectiveness of your charity. It's more like rationality leads one down into the crazier corners of utilitarianism, where you end up with nutty conclusions like seeing if electrons can feel pain.
I mean if that's what you want to do, why not read some history books, or things on urban planning, or even anthropology? Get informed first about what people might need help with.
But isn't that very much the original rationalist idea of "let's figure out what's ACTUALLY going on" - insight porn, if you will?
Reading actual books in depth is to rationalism as a healthy loving relationship with knowledge is to insight porn. They are related, but not the same. If rationalism is only "let's find the truth," I dunno why this sub exists. There's a very specific aesthetic that rationalists have that's repellent. Many of the mistakes they make can be found elsewhere, even in academia.
Real rationalism has never been tried. (so sorry, couldn't resist) Anyway, Thinking, Fast and Slow (plus a generous amount of Gigerenzer, who makes a good case for System 1) is a good example of the kind of rationalism-related stuff that I think might be useful for getting a clearer picture of reality.
If you've read TFS, then you'll recall the many times Kahneman tried to compensate for his own irrationality and failed. Like when he and his colleagues were trying to write that textbook: they consistently underestimated how long it would take. Anyway, I think you're still too focused on improving rationality in and of itself, when a good 80% of helping people is probably empirical knowledge of the problem itself.
You're almost certainly right - I just feel there's an overwhelming amount of problems that need to be tackled, and it's so easy to waste your resources and effort on something that will have little to no effect, and I need better tools to figure out where to put my *lever and a place to stand*.
Welcome to activism. I always make myself feel better by thinking, wow, if MLK and Malcolm X and countless others laid down their lives for what turned out to be very little progress, I don't need to beat myself up for getting nothing done. Just pick something and go. Pretty much every single problem you can think of has too little attention paid to it.
On the other hand, you've got stuff like the incredible success of the LGBT movement as a whole. Not familiar enough with other countries, but in Germany, they managed to go from "off to the camps with you deviants" to "technically outlawed but not really" to openly gay artists, musicians, and performers, then openly gay conservative politicians, openly gay supreme court judges, and finally full gay marriage with equal rights - with even the hardcore conservatives deliberately foregoing the legal remedy they technically had. In such an incredibly short amount of time, not even three generations to alleviate centuries of injustice. I recently talked with a few gay mates of mine who told me they never once felt discriminated against based on their sexuality, and I was genuinely taken aback. I need to look into this a lot more, because there's something that worked out rather well.
It takes a lot of time to become an overnight success. Centuries, in this particular case, as you said. Much as I don't like capitalism, its corrosive effect on many social relations probably helped, too. edit: sorry, I don't mean to be a downer. It's just that this stuff is hard, and you can burn out if you don't have realistic expectations. You have to hope like a motherfucker, but balancing that with the reality of what's likely to happen is tough.
[deleted]
I started typing out a rant on the Weimar Republic and then deleted it because it's an incredibly weird mess and nobody needs Rationalisty walls of incoherent text. Consider, however: as quickly as the populace was turned against the deviant perverts, just as quickly they were turned back towards tolerance - that's why I chose that timeframe, from "absolute worst" to "pretty good". In Europe, I think, only the Dutch were faster to accept the general concept of gays just being people.
What that says to me is that LGBT gains are transient, and their rights are at risk as soon as the political climate sours again.
That's why you gotta solidify those gains and etch 'em into as much stone as you can. People hate losing freedom, so if you make it clear what's at stake, maybe it gets less cyclical. The key is that authoritarians always fuck with LGBT+ rights, so they can never, ever be trusted. Hitler did it, Stalin did it, Trump's doing it.
my god, i am so stealing this for the book
Oh, you were the one who was going to write about this? Could you pretty please go to a rationalist meetup and report back?
Did enough of those in 2011. Never went back after the impassioned advocacy for the importance of race and IQ theories.

> It is not a thread for waging the culture war or for sneering at what other rationalists are doing elsewhere.

You can’t tell me what to do

What the hell do you mean when you refer to “rationalism”?

I'm starting from [here](http://www.reddit.com/r/SneerClub/comments/8n3iv7/-/dzsm3qf), personally. It seems to be a common thread among a bunch of the regulars here that we were all initially attracted to LessWrong or SSC because we had some initial idea of what they could or should be that reality failed to live up to. I'm done with most of the rationalsphere as it currently exists, but I still really like the ostensible goals. Consider this a trial balloon to try and find out if there's actually a baby underneath all the bathwater.
> among a bunch of the regulars here that we were all initially attracted to LessWrong or SSC because we had some initial idea of what they could or should be that reality failed to live up to

Sure. And what is that? I had a classical philosophical education before I encountered internet "rationalism" and immediately found it ahistorical, vague, and elitist without merit. At its most sensible, it appears to be a general interest in overcoming one's own biases and intellectual concerns in general, as if that needed an "-ism."
I agree with you almost entirely, save that I do see some value in attaching an -ism, so to speak, to the process of consciously and intentionally developing better habits of mind. Maybe not a philosophy and definitely not another cult of personality, but it would be nice to have a flag that interested amateurs could rally to. For good or ill I ended up with an engineering degree. My interest in cognition comes at the end of my formal education. More importantly, perhaps, it's way over on the practical end of the theoretical-vs-practical spectrum. I want to think better and I'm willing to study some philosophy as a means to that end, but not as an end in itself. Maybe this is a pipe dream, but I can't help but hope that there's some space between complete layman and actual expert that people like myself could occupy.
Don't take this the wrong way but it's genuinely nice to see a thoughtful engineer genuinely (that's "genuinely", not "I read a short intro to stoicism and felt manly") interested in philosophy
> I agree with you almost entirely, save that I do see some value in attaching an -ism, so to speak, to the process of consciously and intentionally developing better habits of mind. Maybe not a philosophy and definitely not another cult of personality, but it would be nice to have a flag that interested amateurs could rally to.

I don't really see the need for a flag to rally to, in just the same way that people who are interested in personal fitness and developing better dietary habits don't seem to need an "-ism." I think people have this view that "-ism" lends a kind of intellectual credibility, the veneer of a systematic worldview or something. On the contrary, "-ism" terms are very frequently quite the opposite of that: a hideous conflation of propositions sharing a superficial family resemblance. At worst, especially when applied to things as semantically nebulous as tendencies and attitudes, "-ism" treatment invites a whole host of confusions (category mistakes, false dichotomies and oppositions) that its grammatical possibilities afford it. There are occasions in which "-ism" terms are organically useful, i.e., when one identifies or picks out an intelligible view or class of views in some intellectual space afforded by a subject matter, but used outside of that specific condition, I'm sneering hard, fam.
> people who are interested in personal fitness and developing better dietary habits don't seem to need an "-ism."

Right, but nevertheless people do create communities and mutual-help groups, like /r/fitness or bodybuilder forums. Don't get me wrong, I have a lot of scepticism about creating groups for the sake of creating groups. Most study and self-improvement can and should be done on your own, and “I need to collaborate with other people to succeed in X” frequently becomes a form of procrastination. But communities around a self-help idea X appear inevitably because they are a good source of motivation, accountability, and outside judgment. They are also unavoidable for accumulating and structuring knowledge. Even identities are common: people do identify as ‘bodybuilders’, ‘powerlifters’, and so on.
Yeah. I don't have any issue with people sharing an interest, or even a communal identity built on that shared interest in, as _vec_ describes, developing better habits of mind. But "rationalism" isn't a rose by any other name. In fact, there's [already another rose by that name](https://en.wikipedia.org/wiki/Rationalism). And it's not clear that the former rose is a rose at all, rather than maybe the act of rose-gardening.
> I agree with you almost entirely, save that I do see some value in attaching an -ism, so to speak, to the process of consciously and intentionally developing better habits of mind. Maybe not a philosophy and definitely not another cult of personality, but it would be nice to have a flag that interested amateurs could rally to. Can't see how it won't turn into one, sadly.
I'm gonna second /u/shitgenstein Everything it seems I could have learned from rationalism I learned already, and significantly better, in my Philosophy and (shock horror) English BA
I have a BA in Philosophy and don't feel that rationalism is covered well by it at all. Undergraduate Philosophy isn't a training in overcoming bias, understanding and discussing social stats, or a self-help community for intellectual development.
I guess it depends where you went, because I covered all of those things and more
What kind of community does RationalWiki have? Don't they have similar desiderata?
beats me

Meta

Is this a good idea? Should this be part of SneerClub or a stand alone subreddit? What should the rules be?

Let's do it, I think the idea is worthwhile. Also, having interesting, civilized arguments would, in a way, be the ULTIMATE SNEER.
This sounds like entryism to me.
honestly I'm not sold on the idea that what people usually think of as rationalism is at all worthwhile and isn't just obfuscated orthodox christian teleology justified with cargo cult mathematics. on a concrete level what are some examples of what types of discussions this thread is intended for?
I think it is a good idea, but not for the sake of Rationalism as a movement or capital-T truth. I think it's a good idea because there are a lot of smart people on the left who frequent this sub and who used to frequent rationality blogs who want a fucking community of smart people to talk about interesting things with. I think that was what drew me to SSC in the first place, I was like, wow! Smart people are talking about smart things in a general way such that a generalist like myself can both grok and contribute. I really like the idea of a place where several people can all synthesize their amassed knowledge of history, literature, politics geo-and-domestic, art, tech, etc, and let me see for a moment the world through their lens through my lens, and have interesting discussions and get new reading recommendations. I see details I wouldn't have otherwise, it gives me ideas, which I love for their own sake even if I don't put them to some higher Rational use, and I get to marvel at how cool some people the world creates are, instead of being disappointed and perturbed at what I usually see in the rationalist community-- asshole nerds waving their tiny dicks at each other.
About hosting the experiment on SneerClub: Has this been approved by any mod? It is, after all, hijacking someone's sub. I have a feeling, however, that, long-term, rationalism is not compatible with the Sneer (talking about a weekly thread, for example). A part of the beauty of rationalism for me is treating ideas as worth serious consideration and maybe a bit of overthinking. SneerClub is not a serious place: it's a circle of play where the rules are skewed towards mockery, and that serves its function. Still, what you're getting is data. Thank you for being proactive, _vec_.

About rules: The concept of prototyping is about respecting human limitations. I think rules could be established (and changed) after someone, having enough positive signals, feels it is an idea worth investing in. Then again, this IS SneerClub. So who is willing to invest in rules? Who is daring enough to admit to caring for a community and rationality that much on SneerClub (and probably invite the mockery of some rationalists and ex-rationalists)? Whoever they are (wink, wink), they might want time to test the idea.

Therefore, about a separate subreddit: A separate subreddit would be a safer experiment, but it would also take more people to not be too demanding and painful, I'm guessing. Having said that: what's the worst that could happen? That it will not be used? That it will be bad? All survivable.
I think a weekly public policy thread would be a great improvement to ssc, but it doesn't really align with the core mission of sneerclub.
I wouldn't mind seeing a subreddit that was more about posting good articles. Like that thread a few days ago where people were looking for things to read in addition to Nathan J. Robinson, or that book recommendation thread on what to give to rationalists who need some mental readjustments.
I really don't want to see /r/sneerclub, a subreddit I've been checking for years, get any more taken over as a Rationalists Anonymous get-together. Y'all ex-rationalists are great and such but you still need to learn the real essence of the sneer. Also for God's sake will somebody post something that isn't SSC?
Hear, hear. If we continue down this path, I fear that we may inadvertently adopt *discussion norms* around principles like *charity* and *steelmanning*. The whole point of sneering is to take a break from such things!
Agreed. I came here to be saved from ponderously earnest discussions.
[deleted]
Goddamnit Jim I'm a philosopher not a programmer
[deleted]
Gender Neutral Dude if I wanted to do applied philosophy I'd have become a motorcycle maintenancer
I think this should be a stand alone subreddit. However, for these initial trial threads, I think hosting them in sneerclub is good. It'll provide more beta testers.
[deleted]
Here's a sneak peek of /r/askphilosophy using the [top posts](https://np.reddit.com/r/askphilosophy/top/?sort=top&t=year) of the year!

#1: [How to deal with unproductive gadflies like followers of Stephen Molyneux, Ben Shapiro, and Jordan Peterson?](https://np.reddit.com/r/askphilosophy/comments/77hda6/how_to_deal_with_unproductive_gadflies_like/)
#2: [Why people assume they are smarter than philosophers?](https://np.reddit.com/r/askphilosophy/comments/7nv5b4/why_people_assume_they_are_smarter_than/)
#3: [If people question the morality of bringing sentient AI into existence, why don't people talk about the morality of bring a child into existence? Is it not the same thing, just without the wires?](https://np.reddit.com/r/askphilosophy/comments/8h4bq5/if_people_question_the_morality_of_bringing/)
I think it's a good experiment. And as long as it's a few threads that don't overtake the rest of the subreddit, I prefer it as a part of here: when trying to establish a new subreddit, you always have huge attrition (not everyone overcomes the trivial inconvenience of subscribing to the new subreddit, even if they would be interested in the thread). And closing the loop from sneerer to sneeree inside a thread is a pretty cool sanity check. Two negative considerations: First, the non-serious sneer-place needs to persist. Mixing in this kind of thread looks like a cool idea, but most other threads must remain funny and mean. Second, I think quite a few people got perma-banned from the sneerclub (e.g. /u/PM_ME_UR_OBSIDIAN) who should probably be welcome in this kind of thread. That is, serious discussions of pancake recipes want a less permanent banhammer than simple sneering at bad pancakes.

Pancakes: too much work? Is the blessing worth it? They never come out right!

Pancakes are fantastic when you've got them figured out. And until you do, you can just chop up your failed experiments and make pancake soup. Truly the Rational man's food.
It's the figuring out that's the hard part. Pancakes aren't difficult, but they can be intimidating when I'm half asleep. I like to do breakfast food for dinner every month or two to try out new recipes in a more lucid setting first.
I got a small blender (NutriBullet type thing) for smoothies because I love fruit almost as much as I hate chewing™. Imagine my surprise when my ex used it to make pancakes --- *blenders can be used for more than liquifying fruits*. I've been in the habit of skipping breakfast for a while now (bad, but still), but once the blender made the batter easy to figure out, pancakes always seemed super easy.
Wheat batter forms gluten when beaten vigorously, which is not generally desirable for pancakes. Batter by blending is not recommended, especially for someone who hates chewing. A Truly Rational Ubermensch folds the mixture until just combined, slightly lumpy, then lets it sit for a few minutes so the rest of the flour absorbs the liquid. This will result in optimal pancake fluffiness.
Once your pancake technique is mastered, you can easily batch them.
Pancakes lead to diabetes. Check these 50 clinical studies I've skimmed the abstracts of.
misread the abstracts of
Are you making the batter too? Then too much work. But someday in the future we'll be able to [train robots](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.304.3893&rep=rep1&type=pdf#page=146) to [make pancakes](https://ieeexplore.ieee.org/abstract/document/6100855/?reload=true) and this question will be moot. Edit: In the meantime, [you can use math.](https://www.sheffield.ac.uk/news/nr/secret-perfect-pancake-discovered-1.358297)
**The Cake Sequence** Using a suitable DEVICE such as a yolk-separator or the Yolkine Seperatus Technique mentioned in the Avine Reproduction Sequence. That is to say you separate the two parts of the egg; you may argue that the shell is part of the egg, but here we are strictly talking about the EDIBLE parts of the egg. You may ask why we define the egg-shell as inedible, but that is for another sequence. Wham-a-bam the yolks till they appear creamish, at which point apply to the mixture two teacupfuls of extra-fine monosaccharide crystals to create a Vitellian Monosaccharide mixture. Here it may be worthwhile noticing that at this point the whites will be in a separate container and that you are not to mix them and the yolks at this stage! See the Mixture Assumption Error sequence. Wham-a-bam the resulting Vitellian Monosaccharide mixture for five to ten minutes while maintaining steady supervision of the mixture to ensure that you reach optimal results. Then add two tablespoonfuls of milk or water, and a measure of salt sufficient for this cake. It is worth noticing that despite this being a sweet sponge cake the salt is necessary as a flavour enhancer, much in the same way as MSG may be added to, say, Chinese take-out foods. The salt is not meant to provide flavour in itself, but, as I said, to enhance the flavour of other ingredients. This is a well known technique, but given how counter-intuitive it is, it is a technique that is often either ignored by inexperienced Pastry Creationists or else done entirely by rote without fully understanding the underlying principles. This is why you will also add some flavouring at this stage. Now add a fraction of the albumen, which you should have wham-a-bammed as well. Then add two cups of flour into which you have sifted two teaspoonfuls of baking powder; it is important to understand that the gas-development of the baking-powder is what helps turn this cake into a sponge. 
As the baking-powder is heated it releases vapours which create many hollows in the body of the cake. Take the resulting mixture and slowly mix it into the Enhanced Vitellian Monosaccharide mixture, ensuring that the mixture speed is the minimum necessary to combine the two ingredients. To conclude, mix in the remainder of the albumen. Line the baking containers with buttered paper. That is to say, paper onto which butter has been applied to ensure that it will come loose easily when the baking process is over. Then fill the containers two-thirds full. ([a sneer posted to another site](https://forums.spacebattles.com/posts/11402074/))

:D.

About design thinking: the first step is usually to talk to people and ask what they want.

The second is to ignore half of what they’re saying and get to nuggets of truth by observing them as well as you can.

It seems to me, however, that gauging the feel of SneerClub is a perfectly doable thing, that it counts as observing, and that it has already been done to a degree. So let’s retrace and ask the unified questions; the answers might not lead straightforwardly towards a solution, but they are always super illuminating when it comes to recognizing actual needs.

(I’m stopping now.)

So, it would be helpful to see short-ish answers to:

What did you like rationality for?

What did you dislike rationality for? (briefly: since we’re in SneerClub, it’s probably easy to get carried away)

What would you like in a new site?

Things I liked rationality for:

**The sense of mission:** Create an AI and defeat death. Seems simpler than building conditions for humans to create a just world and serves as a great simple heuristic for everyday life. (Honestly, I don't think this one's salvageable. Although it was fun. I'm back on the vaguely optimistic and hopefully reactive train when it comes to these things.)

**The community:** Since the fantasy nerds turned out to be generally unpleasant to hang out with after my hormones calmed down a little, I started talking to different people and it was good for me. Still, finding a nerd nest full of older people and feeling the inferential distance shortening was very nice. So, as banal as it sounds, a bunch of people caring about the same things - how to be good when you don't know much - with a shared cultural experience was part of the draw.

**The language:** The rationalist language got a lot of sneer in here for good reason. I still like it. It calms me down. I have a history of reacting that way to writing. First I read a lot of journalism (I studied journalism, so we're talking A LOT). I got tired of the inflammatory and simplistic language. So I started reading scientific papers and it felt amazing, simply because the language was slow and less inflammatory. Discovering Less Wrong was like that, only for Internet writing. I want more. Granted, it was at a time when I was more sensitive to writing in general. Now I go to r/braincels to have a laugh, so.

---

Briefly, about what I disliked:

**Sexism and racism:** I feel comfortable with what I assume are modern language norms: trying one's best not to use racist or sexist language and concepts that could make someone feel viscerally bad. Honestly, when it comes to reducing bias, habitually controlling your language seems to be such low-hanging fruit that the pushback from people familiar with Kahneman is beyond me.
So, the acceptance of racist and sexist language and concepts is what I don't like.

---

Things I'd like on a new subreddit/site:

**To have people comply with ruthless SJW standards of speech.** What does it mean? It means upping the difficulty: the posts need to be not only high-effort, non-inflammatory, and informative, but also formulated in a manner that minimizes the chance of the reader feeling any kind of discomfort as much as possible. This is difficult for some people and the right methods of doing it are negotiated, so: it will be fun. For myself: I know that if sexist or racist jokes are entertained in the new subreddit, for example, I will not participate. Just like that. I need the place to be reeeaallly, reeeaallly nice to put in the effort.

**To not have the new site focused on debate.** Debate is one way of learning, but I feel like its application was a bit indiscriminate in the rationalist community. What I actually want to read about is the lived experience of ex-rationalists. It is a weird feeling to be out, for me, and I'd like to see how it affects others. So: more emphasis on describing one's doubts and experiments and self-improvement attempts. Or, to be even clearer: I'd like to write posts like that somewhere without being accused of not sneering enough (:P) or having to establish my own tumblr.

**To preserve the language.** I want the relative tranquility of a pop-sciency book in an Internet community.

**To be fun.** This one's important. There are probably better ways of doing good than talking about biases and such. But talking about biases and such is nice. It makes me feel briefly in control, for example. I think I might care more about the fun and the brief feeling of control than about good here, because I do good on many other fronts in my life (I hope), but there aren't many places where I can discuss TFS.

So, to be precise: if the sub turns out to be fun to participate in, but does not help me achieve my goals or be better in a significant, measurable way, I will not call it a failure.

well, “is scientific racism necessary to be truly rational?” is culture war-ish.

so instead, let’s ask: “is thinking lots and lots about torture as the Bad End necessary to be truly rational?”

so you know I wrote a book about why bitcoin is stupid, right. Well, it’s sold remarkably well - and is the first Createspace self-published book ever to get into the NY Review of Books! - but it’s been nearly a year, and sales are slowing down. So I really need to haul arse on the next book.

So that’ll be two books. One will be another bitcoin/blockchain book, because I have somehow got myself a second part-time job as a finance journalist of sorts. (People give me money for this, speaking gigs, consulting.)

The other will be the art project that’s nagged at me for years, and which Elizabeth Sandifer has been nagging me to do for years as well, particularly since I helped work on her Neoreaction a Basilisk: working title Roko’s Basilisk.

You might think this has zero sales potential. But! Tom Chivers - who is a really good science journalist, of the sort who you see his byline and think “this will be good and not suck” - has a contract for a book about the rationalists! (And yes, I’ve been feeding him research material and links. We figure they should help sell each other - one book is an oddity, but two books is a movement.)

[CONTENT WARNING: excessive fondness for TORTURE TROPES]

So I spent eight months writing 0 words, because I had no idea how to approach this. But I finally sat down and wrote 880 words of an intro, adapted from a Tumblr rant on transhumanists. And this weekend I wrote 320 words of an alternate intro … and started on the chapter handling Yudkowsky’s predilection for torture in his thought experiments.

Like, real life has no-win situations where some nonzero number of people are gonna suffer or even die, and it’s your decision - say, you’re a doctor or a politician: your job will involve this. So there are serious ethical questions here, and philosophers working on them seriously, and their work does actually inform the decision makers.

The philosophers’ usual end point in matters of life or death is life or death, though. They don’t have the urge to … turn it up to eleven.

So there’s the notorious “Torture vs. Dust Specks”. But, even though the original Roko post positing the basilisk doesn’t use the word “torture”, the commenters, including Roko, went straight to that word to describe the Bad End. And it’s in other posts, and other people’s posts on LessWrong. It’s in the subculture’s collection of local tropes. Talking about torture is just normal for fearless idea-exploring Philosophy Tough Guys.

The trouble with starting a project like this is that you have to do the research. So I found this on Amazon for 99p. So of course I hit libgen.

Dark Lord’s Answer is something Yudkowsky wrote before A Girl Corrupted. He notes in the outro that this helped him get it together to write that work, and he has no idea if it’s any good but hey might as well put it out!

The story is a didactic economics parable in novella form, on the value of monetary policy rather than a rigid gold standard - “So you see, Prince, that you’re not being told to steal from your country of Santal. Even if, to save it, you must transgress the righteous rules against usury and adulterated coinage.”

So far so good. You know this literary form. Gets dull at novel length, but you can get away with it perfectly well as a novella.

But the blurb warns: “Content warning: sexual abuse, economics.”

A masochistic slave girl is the author mouthpiece, speaking in economics lecture notes - and several of the ethical dilemmas concern physical and sexual abuse of her. (Though none of it on-screen, thankfully. Though the loved one points out that this then drags the reader in - as you’re put in the position of imagining it for yourself.)

The Prince is disparaged as too virtue-oriented and insufficiently consequentialist in his ethics to save his country’s economy, based on his responses to these ethical dilemmas - e.g., not taking up the offer of abusing her.

But it’s OK, ’cos she’s consenting!! (This is an actual excuse a rationalist made when I and others pointed out that this work’s sexual psyche has a number of issues, e.g. the misogynistic overtones. At which point he demanded explanation of what could be misogynistic about it.)

It reads like a kinked-up version of the Sequences. Gratuitous sadomasochism for flavouring, and to provide Yudkowsky’s favoured style of Philosophy Tough Guy ethics test. Something edgy.

(At least the economics is reasonably normal stuff - though that makes the torture porn more jarring, not less.)

While you can say “ehh, his kink is his kink” and that’s probably fine amongst consenting adults … you’d have to be as tone-deaf as a rationalist not to have a fucking shitload of red flags spring up at this particular fearless exploration of ideas - and how fearless exploration of this particular idea just keeps showing up in his nonfiction as well.

So not the greatest or most charming work. Two stars out of five, ’cos it’s grammatically accurate and spelt properly, the plot is coherent and makes sense and it’s not quite one-star bad.

But I can’t just pretend it doesn’t exist, when I’m talking about “Torture vs. Dust Specks”, and I can’t not talk about that essay.

So my problem is now: how the arsing fuck do I write this, without repelling any reasonable human from wanting to read it. I ran the chapter past the loved one and she said “It’s well written, but I don’t want to read about any of these people ever again.” This is not the reaction I need.

hopefully in other chapters I won’t find myself falling down a rabbit hole of profound distastefulness like this. (ha.)

But … jesus fuck, Yud.

I looked at that torture essay you linked, and found this comment:

>If anything is aggregating nonlinearly it should be the 50 years of torture, to which one person has the opportunity to acclimate; there is no individual acclimatization to the dust specks because each dust speck occurs to a different person.

Scary how little the big guy knows of how humans work. I mean, had he seriously not heard of PTSD at the time he was writing this essay?

> So my problem is now: how the arsing fuck do I write this, without repelling any reasonable human from wanting to read it.

Hm, that's a real problem. I was thinking, true crime can get pretty distasteful, but there's something pathetic about the rationalist community that leaves it stuck on 'distasteful' and never elevates it to 'sensationalist.'
The easy crib for refuting the Sequences is looking at the comments from subject-matter experts going "what is this trash". Plenty on that essay.
So do you have to go into a lot of depth on the kinky sex-slave novella? I feel like mentioning it exists and throwing in a few jokes is good enough. Like, do you really need to describe it in more detail than you've put in your OP?
Nah, not really. Mostly that a pain shared is a pain doubled. But I got the reader revolt at Torture vs. Dust Specks.
You're a brave soul for diving so deeply into this. My sympathies go with you.
> So my problem is now: how the arsing fuck do I write this, without repelling any reasonable human from wanting to read it. I ran the chapter past the loved one and she said "It's well written, but I don't want to read about any of these people ever again." This is not the reaction I need. Just make sure that you combine righteous indignation with a certain amount of levity to help the medicine go down, and you'll probably be fine? This sounds like an infinitely dunkable piece of work, and you sound like the person with the exact tools to really skewer it proper.
>But I can't just pretend it doesn't exist, when I'm talking about "Torture vs. Dust Specks", and I can't not talk about that essay.

>So my problem is now: how the arsing fuck do I write this, without repelling any reasonable human from wanting to read it. I ran the chapter past the loved one and she said "It's well written, but I don't want to read about any of these people ever again."

I think I see the problem here: you're [A-Logging](http://nymag.com/selectall/2016/07/kiwi-farms-the-webs-biggest-community-of-stalkers.html) (ctrl-f "A-Log"). If you're looking to trigger your audience's disgust reaction, a little goes a long way, so you don't want to dwell on it; you want to use a very light touch. E.g. you could make a brief, offhand remark about the popularity of BDSM kink among transhumanists early in the book, and then make a running gag of lampshading every recurrence of the topic of torture in seemingly unrelated contexts. That should be more than enough for normie readers to fill in the blanks.

Although I must say, I strongly disapprove of trying to discredit people by portraying them as disgusting sexual deviants (or as people who are too socially oblivious to realize that they're likely to be perceived as disgusting sexual deviants, which lowers their credibility - not with us, I mean, we're too tolerant and enlightened and open-minded of course, but you know ...). One of the things I like most about the rationalsphere is their dogged insistence that if you want to claim to have refuted a proposition, you must actually engage with the substance of the proposition, no matter how many crackpot red flags the person proposing it happens to be waving. (Although of course you're allowed to use heuristics to decide that the expected utility of debating someone is negative and ignore them. But you should be honest when that's all you're doing.)
Anyway, if you just want to do a study of rationalism or transhumanism as social phenomena, I don't have a problem with using a morally judgmental tone about abusive, manipulative, or dishonest behavior, or the promotion of harmful political ideologies, as long as you bring receipts. It's kind of dickish to come down too hard on sincere, well-meaning people who are merely deluded, when their delusions are mostly harmless to others, but gentle mockery is not out of line. But if you want to real-talk about the nexus of transhumanism and psychological eccentricity, I don't see how you can do such a complex subject justice without getting into the whole quagmire of neurodiversity, social impairment, gender nonconformity, paraphilia, etc., etc., etc. That could be interesting, if you approach it in a dry, clinical way, or a compassionate, Social Justicey way, but if you just want to channel your reflexive contempt for these people ... then your mean-spirited ableist attitude is bad and you should feel bad. In my not so humble opinion.
You're being a weird idiot (in the same manner as the rationalist I mention therein - one would almost think that in both cases, your *actual* objection was that your favourite thing had been maligned in some way). Would it be clearer to you - not hypothetical non-rationalist readers - if I pointed out he was doing something not identical to, but very like, scening in public? Thought experiments in ethics aren't the place for that.
>You're being a weird idiot (in the same manner as the rationalist I mention therein - one would almost think that in both cases, your actual objection was that your favourite thing had been maligned in some way).

You're certainly not the first person to call me weird, but I'm pretty sure I'm way less Yudkowsky-style-weird than you're assuming. Although it's understandable that you would infer that. It doesn't really matter. Anyway, I didn't post a (qualified) defense of him because he or his movement especially represent any of my favorite things; I did it because the manner in which you criticize him is one of my least favorite things.

I don't know why my intuitions on this are so far from the ethos that gave rise to this sub. I can certainly detect a trace of that disgust reaction in myself when I see people like Yudkowsky flaunting their weirdness, but ... I just see it as the sort of sentiment one keeps to oneself, for the sake of civility and basic human decency. I mean, there are *so many* other ways to say "This person is wrong about X, Y, and Z, and people are foolish to give him any money or respect" that don't deal gratuitous collateral emotional damage to a bunch of bystanders from a demographic that are already more likely than average to be alienated and psychologically vulnerable, and it's *so easy* to just choose not to assume the worst of people and get in fights all the time and crap up the internet with even more vitriol. (Unless someone is being a jerk to you or your ingroup, then I totally get the temptation to respond in kind. Do you have some kind of personal beef with Yud?)

>Would it be clearer to you - not hypothetical non-rationalist readers - if I pointed out he was doing something not identical to, but very like, scening in public? Thought experiments in ethics aren't the place for that.

Well, no, as it turns out; I had to google "scening in public," as I'd never encountered that usage before, much less the behavior it refers to.
Anyway, I can't tell if "thought experiments in ethics" is referring here to the Torture vs. Dust Specks essay or *Dark Lord's Answer*, which I haven't read, because, as with all of Yud's fiction, I can tell a mile away it's not what I'm into. If the latter, so what? It's just his weird hobby. If the former, I have to say, this whole "Rationalists keep using torture in their thought experiments because they're *perverts*!" argument is reeaaaally tendentious. You talk about it as though it's so weird, but torture seems to me to be a natural thing to reach for any time you need to fill a box labeled "extreme negative utility outcome goes here" in an ethics thought experiment, and it's hardly original to Yudkowsky. (Cf. *The Ones Who Walk Away from Omelas* and, oh, I don't know, HELL.) Also, academic thought experiments in ethics is the genre that gave us "Can it ever be justified to stop a runaway trolley by pushing a fat guy in front of it?" So yeah.
> I didn't post a (qualified) defense of him because he or his movement especially represent any of my favorite things

> I mean, there are so many other ways to say "This person is wrong about X, Y, and Z, and people are foolish to give him any money or respect" that don't deal gratuitous collateral emotional damage to a bunch of bystanders from a demographic that are already more likely than average to be alienated and psychologically vulnerable

the second statement, which is made of cut'n'paste phrases rationalists use when people call them out on reprehensible statements or behaviours - you should see Tumblr rationalists when someone dares say libertarianism or palling up to actual fucking Nazis is bad! - belies the first.

> I had to google "scening in public,"

tip: If you're going to get logorrheically self-righteous about something being maligned, it's more convincing if you first knew the first thing about it. So if you weren't getting into high dudgeon defending the honour of rationalists, or of kinksters - just who *were* you defending the honour of?

> "Rationalists keep using torture in their thought experiments because they're perverts!" argument

This is not the objection, but thanks anyway.

> but torture seems to me to be a natural thing to reach for

It really, really isn't. Thinking it is is a lot of the actual problem here.

> Cf. The Ones Who Walk Away from Omelas

This is like thinking your Harry Potter fanfic is worthy of a Hugo because Gaiman got one for *A Study In Scarlet* so therefore fanfic belongs in the Hugos. Even making that comparison is an ocean of Dunning-Kruger and not being in the same time zone as the point.

How did you deal with your complex feelings towards Yudkowsky?

I genuinely would like to know what "complex feelings towards Yudkowsky" you or anybody else could have, because mine are extremely simple and expressed mainly in swear words.
That was an example question, thought about for 30 seconds and meant as a kind of joke. But there is something to it, so it will be reformulated.
>ValarPatchouli Wrong alt?
Ha, ha.
No seriously, I'm confused why I'm getting that reply explaining the first comment from a totally different account as if it were the original account
I copied the question from his comment when I created the thread
Oh lordy, I can't be expected to actually follow your links darling
Seriously though, if anybody does have those feelings I wanna hear about them
Incoming.
ASSUMING MY GENDER IN _MY_ SJW THREAD!!!??? Nice! (Does that count as politics, and does that count as the right kind? These are the questions!)
Oh, ok. _vec_ linked to it, but it might not be clear, so I will glue my post from a different thread underneath. I asked the questions about Yudkowsky and pancakes there, and I believe _vec_ copied them here verbatim to be consequent after having stolen the name too ;>.

"Design has steps, and we are perfectly capable of (edit: one of them): prototyping. So here's the perspective of a design trainee: Create a reddit. Call it "SJWs and instrumental rationality". (Here are some ideas for more prototyping that might work or not: Ban politics for, let's say, two weeks. Establish threads like "Rational take on pancakes: too much work? Is the blessing worth it? They never come out right!" or "How did you deal with your complex feelings towards Yudkowsky?". Then unban politics for two weeks and see what that looks like.) Ban and remove as you want it banned and removed: don't even formulate any more complex rules. The rule of good low-fidelity prototyping is "don't work too much on that". After a month: see what you've learned. Maybe redditness will not be that much of a problem. Maybe it will. Maybe it will be scary and too much work. Maybe it won't. Maybe nobody will use it. Maybe it will give way to the warmest fuzzies of your life.

One problem I see is: putting too much strain on one person who will feel too responsible for the site. So let's say that underneath this post a person willing to help writes "lemons are great, I kinda want to help". Such a person will be willing to: 1) write one medium- to high-effort post on instrumental rationality to get things started and 2) maybe moderate. If the company is not enough to alleviate the stress of prototyping - it is hard to do things at 40% and deal with it - then maybe having the sub "on invite" will also help (if there is such a thing on reddit?). I personally have two weeks relatively off now (first illness, then vacation), as in: I will find a comfortable hour a day to do such a thing.

But not alone. And I thought of that many a time. The thing that stopped me each time is that I am not a male Silicon Valley programmer (it makes sense when you're not one). ... ok, so you're sold? Perfect! Because there is even a simpler test to try: make "SneerClub" host a few "SJWs and instrumental rationality" discussions for two weeks and see what happens. An experiment design with interesting trade-offs...

Edit: I'll keep it here, but I meant to respond to [u/yemwez](https://www.reddit.com/u/yemwez) and my design remark was planned to be clever."
But where's the sneer?
[Mine are about like this.](https://i.redd.it/h7nt4keyd7oy.jpg)

His output is somewhere north of 90% crazy. The other 10%, though, I really like in spite of myself. Maybe it's because I'm coming at it from a very STEM background, but he has an eye for metaphors that feel accessible to me in a way that a lot of writing on philosophy really doesn't.

To pick a somewhat random example, his article about ["bleggs" and "rubes"](https://www.lesswrong.com/posts/4FcxgdvdQP45D6Skg/disguised-queries) has no original ideas in it. I recognize the concept he's trying to present as one I've seen in a dozen different places throughout my intellectual life. But his formulation is the one that actually _clicked_ for me; that handed me the concept in a form I could manipulate in my head and comfortably apply to novel situations.

I wonder if it's not Yudkowsky specifically. There's a real dearth of highly pragmatic philosophy aimed at a heavily systematizing audience. Maybe all I really want is a version of the sequences that isn't written by a crazy transhumanist with no formal education.
I'd hate fuck him.
Kinkshaming that
You'll need more words for me to understand the intent of your comment.
your kink is not ok
Even as a rationalist-adjacent person, I never liked Yudkowsky. From the get-go, I knew belief in an AI apocalypse was irrational, and I saw the obvious religious undertones in Less Wrong. Also, developing rationalism into some sort of concrete, systematic ethos rubbed me the wrong way. I'd always felt that the *essence* of rationalism was shunning grand narratives, except perhaps provisionally, because buying into a grand narrative is essentially the best way to cut off your exit routes from incorrect beliefs, which is the worst sin a rationalist can commit.

Really, it's something of a tired saying, but I didn't leave the rationalist movement -- they left me. I think I would still be one today if that social ecosystem were still thriving the way it used to. As much as I hate to admit it, the real intellectual enthusiasm and energy right now is all on the right wing, and I just can't go there.
> shunning grand narratives sounds dangerously postmodernist, you evil pomo-SJW you
Back when I "hated postmodernism", I always thought their problem was not the shunning of grand narratives, but their inability to replace them with anything. What I eventually learned was that 1. Some of them *did* try to find replacements, but not everyone agreed on what those should be, so you don't hear about them as much from low-depth expositions of postmodernism; and 2. "Postmodernism" wasn't really ever a coherent thing, and it's mostly just a snarl word nowadays.
I just got bored with him, and skeptical... simple.