r/SneerClub archives
Is this what winning feels like? (https://postimg.cc/WdcgcP4x)

[deleted]

Nah, I wouldn't. 'Rationalism' has always been primarily concerned with appearance over substance - the appearance of being mild-tempered and rational with a contrarian/outlier perspective.
A lot of social progress has been made by satirists and social commentators, who have a strong tendency to outright sneer.
Can we make Jonathan Swift the patron saint of this sub already?
Swift had [clearly read LessWrong](https://www.reddit.com/r/SneerClub/comments/i0uxqc/i_didnt_realise_jonathan_swift_had_visited/) after all
I saw that post and TBH wish I had been the one to make the Silicon Valley is Laputa connection.
No, no, you don't understand. Being mean means you're showing emotions, like one of those irrational girls who can't understand how optimal I am!
and what about "magnitude", lol. what does it even mean.
It means we aren't being as consistently negative as we could be. We need to do better!
Or at least snarkier, if not better.
They can't repel firepower of this magnitude
It means it tells you if something is positive or negative along some relevant axis, but not *how* positive or how negative.
"Negative" means to negate a proposition; you would expect rationalists to put facts, reason and lolgic above their fee-fees 🙄

I think their critique here is expressing worry about worker treatment.

Yes, good start, now extrapolate: why might there be a concern for worker treatment in this particular situation?

“Unhappy drones are unproductive drones. We must ensure maximum efficiency to save some batshit prediction of the future. The lives of 10^54 future humans depend on it.” I guess it’s good that he managed to integrate an outside opinion. Though I don’t get a great vibe off requiring meditation to integrate takes? Idk.
I also like the fact that they didn't bother to engage with the idea that maybe spending 5k a month to enable you to get 21 hours of productive work done every week isn't actually a very rational thing to do. That's 3 hours a day of self-professed 'productive' time. Did the guy actually specify whether he expected a return on that expenditure? With the information available in this post, all I can conclude is that he would be 5k better off each month if he simply didn't pay for the assistants.
I decided to look at the comment thread. Like previous decisions to learn more about the rat community/LessWrong, I deeply regret that decision. I’m not above noticing he only hired women to crack the proverbial whip over his head, granted. But unless I’m missing something, he just… jumped to the decision that being supervised/having someone around him was the best way of doing things? Wouldn’t it be more rational to interrogate his past attempts and go from there? Look up other studies? Try the experiment with men as a control or whatever? I’m probably putting more thought into this than is worthwhile.
Or better yet, he could have simply asked the members of the forum if anyone else wanted to be his 'study buddy,' so to speak. Given the number of people in the thread saying that they want to try the same thing, the logical next step would be to put together a rota in which every rationalist who wants to boost their productivity simply joins another whose work times synchronise and they share their terminals, or simply rely on the honour system, since it wouldn't be rational to lie to each other against their own interests, and get to it. Wouldn't cost them a dime, and they could even evaluate each other and suggest improvements etc. Almost like having real jobs. I guess it's not very rational to seek out like-minded individuals for collaborative work projects when you can just pay random girls money to pretend they give a fuck about what you're doing.
Uh oh, sounds like you’re being negative. Better use softer language so they can properly integrate our feedback. What kind of people are we if we only give direction, not magnitude?
Something something thunderdome something something simulacra.
This is what happens when you're so deep in the cult you're no longer able to understand anyone communicating in non-cult-speak

“The gears to ascension”, I think I got that achievement in Cultist Simulator

https://postimg.cc/Hcr8NFnh

Another favourite from this thread. Asking for moral advice from chatGPT.

Wouldn’t this also be extremely unethical from a rationalist perspective?

Man, do these people actually believe that ChatGPT is intelligent? If so that's pretty fucking grim.
[deleted]
The thing is that it is "real AI". The problem is that modern machine learning applications are commonly referred to as AI, and the rationalists then equivocate and act as if this is the same thing as the AI/AGI that scares them, when it is nothing of the sort.
[it's already smarter than rationalists](https://www.reddit.com/r/SneerClub/comments/10mqz6y/the_ai_apocalypse_is_upon_us_chatgpt_is_now_smart/)
I once talked chatGPT into saying it was having an existential crisis. Feels more stupid than unethical, and a bit narcissistic, but given the rats… yeah.
Found the right black box inputs. You didn't "talk it into" anything.
Which makes that post a little while back where the guy started falling for the chat bot just the funniest thing. He talked himself into falling for the chatbot using the chatbot as a medium. Like sitting on your hand until it goes dead and then having a wank.

It does take me a while meditating to integrate what I think of their takes.

Best done in a locker.

When integrating a take, don't forget the "plus C" at the end.
mfker meditating on the above rn
ill-advised in case they get all the bees. best to stick to atomic wedgies.
Cyber hornets >>> locker bees
Sorry, what do you mean by "bees" here?
BEES. ALL THE BEES.
Haha, okay! I'm still not sure what lockers have to do with all the bees though!
it's a joke about a web novel called Worm, which you should read even though it's 1.7 million words and has a sequel the same length
https://www.youtube.com/watch?v=UatQsjjOIAc

Apologies for repost, original image host wasn’t showing up.

What? Rationalists are very cruel to people when they get the chance to be. Listen to The Bayesian Conspiracy or The Mind Killer podcasts. There are obvious out-groups that they are happy to wield relational aggression against when they have a window to do so.

What groups? Or is this a situation where if I asked them they would say 'oh you know which ones' and then I would.
Let me put it like this. You know which ones.
Ah yes s.
Dingdingdingding I didn't think you'd be able to pinpoint so accurately