r/SneerClub archives
Jacob knew he wanted to get serious with someone, but he found it hard to weigh the merits of each of these potential partners against each other. So he did what he knew best: he made a spreadsheet. He called it “How to Choose a Goddess”. (https://archive.is/E7h3c)

Whenever I tried to make life decisions rationally, I always ended up tweaking the parameters until the results agreed with my gut feel.

I never regretted it.

Well duh, rationalizing is exactly what rational people do, by definition.
A friend told me that when he can’t make a decision between two options, he’ll flip a coin. There’s one of two outcomes: either he’ll go, “ok”; or he’ll know to pick the other option.
Gut feelings are your brain telling you things you don’t (necessarily) want to consciously acknowledge. Trust your gut.
My genetics give me superior gut feels, so I can trust my gut, and you, fellow rationalist, also can trust your gut (you would not have become a rationalist if you didn't have superior genes), but those other people shouldn't trust their gut.
Perhaps we should experiment with different diets to see if they alter our gut feelings? I propose we hop maniacally from one diet to the next in order to ensure no conclusions can ever be drawn from this pointless activity.
This sounds like a good experiment; I will also run it at home. I read the Wikipedia page on the scientific method, so you can trust my results and methods. Also, as we are each other's peers, we can do reviews of each other's work.
that definitely works. for example, when i don't eat at all, my gut tells me to eat. when i eat only alcohol, my gut tells me to vomit. so yeah.
Metamucil brain thoughts

“A guesstimated number is better than not using numbers at all”

This is it

That's a common mistake: the idea that quantifying stuff and giving precise numbers somehow makes it more accurate or useful. Part of the reason this is an issue with rationalism is that they don't like to concede how much of their thinking is based on subjective judgment or personal preference (as opposed to pure reason, immaculate logic, and cold, pragmatic calculation). If they can find a way to shoehorn a number into it, then they can just say that they are optimizing those numbers without having to think critically about what the numbers themselves mean or how they assigned them. In a funny way, this is actually kind of a problem in business as well as in romance. Not every metric lends itself well to numerical scoring; sometimes, focusing on numbers means focusing on factors that are easy to turn into numbers even if they aren't important to you; and sometimes, the assignment of the numbers itself can be skewed by emotional bias to the point where you might as well not have bothered with numbers at all.
I have literally had a rationalist argue to me that pulling a number out of your arse and running it through Bayes' Theorem produces superior results to just pulling a number out of your arse and using that, or indeed produces a number that is in any way useful. The concept of non-ultracrepidarianism appears alien.
Well, the core tenet of LW "Bayesianism" is that anything is at least slight evidence, and that slight pieces of evidence sum together more or less linearly (both postulates are total bullshit). As near as I can tell, the sole reason for those two postulates is to rationalize things like giving cash to SIAI, thinking some arrogant guy is a total genius, or the like. For example, suppose you have a box with solid 1-inch-thick steel walls that you can't see through, and inside it a traffic light that is either red or green. If you think the wall is looking a little red, that would be (in rationalism) evidence that the light is red. They don't quite make that mistake with a mere steel wall; it takes something *really* impermeable to sight, like a hypothetical future or alternate-universe AI, for it to work. Ultimately it loops back to mysticism, where everything is evidence for what it says on the can.
I think you might be treating the steel wall case unfairly. Outside of its cult following, Bayes' theorem is a serious tool, and I doubt it would lead to a silly result. Here, let me google Bayes' theorem so I can check it.

P(light is red | box looks more red than green) = P(box looks more red than green | light is red) * P(red light) / P(box looks more red than green)

Let's give myself extra super credence and say that there's an extra big correlation between a red light inside the box and a reddish tint: maybe a 0.500000001 chance of the tint being red if the light is red. For example, let's say I think I have an incredibly remote chance of getting superpowers from a radioactive spider on my way to view the box.

P(A|B) = 0.500000001 * 0.5 / 0.5 = 0.500000001

Okay, so Bayes' theorem says I end up with what I started with: basically no knowledge. Sounds "rational" to me. :)

Of course this is not to ignore your point: if I am a techbro with a planet-sized ego, I will plug in P(I'm right | I think I'm right) = 0.99999999 and get "I'm right" out every time.
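For what it's worth, the arithmetic above checks out. A minimal sketch of that update (the function and numbers are just an illustration of the comment's example, not anyone's actual method):

```python
# Sketch of the steel-box update described above: a barely informative
# observation barely moves the prior.
def bayes_update(prior, p_obs_given_h, p_obs):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    return p_obs_given_h * prior / p_obs

# The reddish tint is almost exactly as likely whether or not the light
# is red, so the posterior equals the prior out to nine decimal places.
posterior = bayes_update(prior=0.5, p_obs_given_h=0.500000001, p_obs=0.5)
print(posterior)
```

Running the theorem on an uninformative observation just hands you your prior back, which is the commenter's point.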
But that's like technically right if you got some evidence, but literally stupid because the slight shift from your typical rationalist's shitty prior to an also shitty posterior doesn't matter (I think? I may be wrong)
the actual process is to pull your posterior from your posterior, then work back to the needed prior
They also make up the prior on the spot; they don't have a pre-pulled-out-of-the-arse number ahead of time. And the "evidence" in question is typically literally zero evidence, because they typically don't have a pre-existing expectation to see or not see something depending on some kind of hypothesis. Let's suppose there are 10 pages of cringe-inducing arguments in favor of X. They don't know if that's too many for X being false or too few for X being true, or if the number of pages is directly or inversely correlated with X being true. So really they don't understand that there is no evidence when the observation is not conditioned on a hypothesis.
That's exactly what I meant by shitty posterior
Yeah, I mean the “evidence” usually isn't evidence either (i.e. not even “weak evidence” but not evidence at all), hence the posterior is even shittier than the prior.
True in which case we are both right 😅🥳
It is actually quite curious. I even got into an argument with one of them, who was arguing that conditional probabilities would be unlikely to "balance out", as if those were some kind of physical die-toss probabilities. Obviously in real life either (a) you usually can't compute far enough, or (b) no information is available, so you should be getting expected values of zero on "almost everything" instead of some salad of tiny probabilities of various enormously large basilisks.
A Bayesian analysis based on a bad model can lead your priors further from the truth. So running something through a Bayesian setup isn't guaranteed to be better than doing nothing at all with the data.
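The same two-line sketch shows how a miscalibrated likelihood model conjures confidence out of nothing; the numbers here are purely illustrative, not from any real analysis:

```python
# Sketch of a miscalibrated update: the observation is actually
# uninformative, but the model claims it is 99% likely under the
# favoured hypothesis (and, implicitly, only 1% likely otherwise).
def bayes_update(prior, p_obs_given_h, p_obs):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    return p_obs_given_h * prior / p_obs

posterior = bayes_update(prior=0.5, p_obs_given_h=0.99, p_obs=0.5)
print(posterior)  # 0.99: high confidence from zero real information
```

The machinery is fine; the garbage-in, garbage-out problem is entirely in the likelihoods you feed it.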
Sure, but "I want to believe that X is true" and "I am rational" constitute evidence for "X is true", and you can even choose priors to make anything you'd like true. If you don't have experimental design, then you can pick and choose what counts as evidence to produce arbitrary results. You could also keep a running list of all things you know, and whenever you want to determine something new, assign weights to everything on that list (of course the list contains its own powerset, so isn't a list, but ignore that, metaphysics isn't real,) and recalculate. Somehow I don't think rationalists are defending that method, though.
Instead of saying "[unsubstantiated opinion]," try "[unsubstantiated opinion] is obvious, because 6*7=42"!

There was one group of thinkers who had the tools Jacob needed: proponents of a new philosophy known to its adherents as rationality. These nerdy internet-users were preoccupied with recognising cognitive bias, applying the lessons of biology and statistics to everything from AI research to fan fiction, and modifying their emotions and desires to achieve their goals. While companies were abuzz with ways to “hack” growth and hiring, rationalists believed that they could hack their own minds – jealousy included.

Give me strength.
has any rationalfic literally been written from biology and statistics tho
I mean, they’re not wrong. You can learn to become more rational and you can learn to modify your emotions & desires to achieve your goals. There’s nothing controversial about this. The problem is that the purported examples of this that come out of the ‘rationalist-LW’ community aren’t genuine examples. In fact, the ‘thinking’ that emerges from these communities is often so primitive it makes normal people look away in shame and disgust.*

*See: every comment ever made by an LW rationalist that can be honestly paraphrased as ‘I am rational, I’ve told you what the facts are but you still disagree with me. So by Aumann’s agreement theorem, your disagreement with me must be due to dishonesty or an inability to use Bayes’ Theorem.’
Shouldn't you just do mindfulness meditation then? You're not going to think yourself into a good state of mind with rationality.

the How to Train Your Dragon sequels really went off the rails

I just put my head in my hands and shouted a bit because Jesus Fucking Christ why did you expose me to the distilled version of the in-house fucking Economist prose style

There was the cheesemaker. The fashion designer. Three different med-school students. Jacob liked them all. On each date, he holidayed in another person’s world and learned something new.

I hate how the guy in the article talks, too. "She reacted with excitement and curiosity." Like she's an especially exciting specimen, which to him I guess she is.

It’s fair to say that Natasha took to the approach less than Jacob. On his birthday, Jacob had plans to see a woman he’d met on Tinder. Afterwards, Natasha was angry. She told him she couldn’t believe that he’d chosen to spend his birthday with someone else. “Why didn’t you tell me ahead of time?” he asked. “You should have known,” she replied.

Less than a year after they moved in together, Natasha broke up with Jacob. He was surprised. Just weeks earlier, the couple had signed a lease on their apartment for another year. Though Jacob knew they were having problems, he’d thought they were minor: she was working too much; he was absent-minded. She had become an exercise fanatic and installed a dancing pole in their apartment. He was gaining weight. And he was also terrible at reading her emotional signals. After all, he’d been doing everything he could to banish anything irksome from his mind.

lol

Jacob still isn’t fully clear about how things went wrong, but he knows this: one evening Natasha’s fiancé stormed out of their bedroom to scold Jacob and a date for laughing (it was rude to laugh when someone else is in a bad mood, the fiancé said).

All these people are terrible and have terrible taste.

She speaks in careful, grammatically complete sentences while her fingers flick each other under the table. I saw why she would have an advantage in a schema that forces its deviser to extrapolate swathes of personality from a limited set of conversations.

Nice subtle burn.

I’ve never read an article that made me so grateful that I don’t know any of the people interviewed in it.

I’m sorry, I have failed you all, I couldn’t make it past this paragraph:

One New Zealand couple deployed Agile, a project-management system that companies such as Microsoft and Lockheed Martin use to streamline processes across teams, in their marriage. The goal was continuous improvement. They held monthly retrospective meetings, where they reviewed personal successes and failures and set “action points” for the next month-long sprint.

I unfortunately know a pair of Google engineers who had to do weekly 1:1s for a bit. I know a fellow Stanford alum who dumped a super hot guy because he lacked a 'growth mindset.'
someone please save tech workers from ourselves
Eh, no, the world is better off if we all self-destruct.
fair tbh
I don’t see the problem here. There are problems with generalising this approach to all couples. There might be problems with the content of the “action points” and the “success/failure” columns used by this particular couple. But for some couples who want to improve their relationship (or at least maintain a healthy one), this is an obviously excellent idea.
Treating a relationship as a means to an end which can be optimised to achieve that end is so fraught with obvious red flags as to be a massive red fucking canopy spread across every other thing they do.
>Treating a relationship as a means to an end which can be optimised to achieve that end

I don’t know how you’re using that phrase here, but nothing in my comment requires treating relationships as a ‘means to an end’ in the [Kantian sense](https://plato.stanford.edu/entries/persons-means/). I’m simply assuming that:

1. There are better & worse ways for a couple to achieve some of their goals.
2. It’s possible for a couple to figure out a strategy for achieving those goals that is optimal (relative to some set of limitations, ofc).
3. There is some method to safely record information about 1 & 2 so as to ensure transmission of that information between partners is fast & reliable.
4. The method we’re discussing allows one to do 2 & 3 efficiently (it does).

Most people are aware that there are things you can do to increase or decrease the probability that you will achieve your relationship/life goals. Most people try to act on this truth with varying degrees of success. Applying the scientific method to ensure that the journey is as smooth and efficient as possible is a good thing. I don’t see how this is ‘treating people as a means to an end’ except in a trivial sense where all human interaction counts as doing so. Why do you disagree?

>is so fraught with obvious red flags as to be a massive red fucking canopy spread across every other thing they do.

Q1) Can you give some examples of these red flags?

Q2) The use of ‘red flag’ here is ambiguous between *a)* ‘The fact that a person would do this constitutes a reason not to date them’ & *b)* ‘It’s a reason not to date them because it is a morally problematic thing to do or is evidence that they’re morally problematic in other ways’. If you meant the former then there is no contradiction between your comment and mine. This thread alone reveals multiple reasons why this isn’t for everyone. It also reveals that for some people, this is a perfectly rational way to maintain a good relationship.
If, as I suspect, you meant the latter, I’ll reiterate my previous question. Why does an activity that all couples do (in some form or another) suddenly become morally problematic once it includes scientific analysis?
It says something about you that your only interpretation of /u/200fifty 's disgust is that it's "morally problematic," as opposed to, "yuck these people sound annoying," or "their relationship sounds like it would be incredibly tedious," or "that would be a mood-killer." edit: as for whether this 'works,' since that seems to be the only metric you care about, if you need a regular check-in, something is wrong with your ability to read your partner and react to them in the moment. That is a much more fundamental problem that cannot be solved by check-ins with clearly legible goals.
fwiw, as someone who already attends sprint planning meetings at work, it was mostly option B ("sounds incredibly tedious") (although it wasn't me who described it as a red flag, I don't necessarily disagree because I think it's probably a sign that other aspects of their lives are also incredibly boring )

This reminds me of that story about the woman getting out of a relationship with an abusive rationalist.

Also, if you catch someone reading The Economist, that’s a red flag. But if you catch someone reading 1843, that’s a whole new colour of flag.

Hey, I used to read The Economist! Because it was the only magazine they had in the library that actually covered some areas of the world not usually covered by magazines.

Oh my god. Ohhhh my god.

I keep forgetting people like this exist and then these articles happen and I become incredibly embarrassed with my username because readers think polyamory looks like. This. Blegh.

This is not HARD. Communicate. Drink water. Use Google Calendar. Be kind. Don’t expect your partners to read your mind.

OH MY GOD.

and when it LITERALLY TURNS OUT TO BE ABOUT THE RATIONALISTS
Because nothing says "healthy multiple relationships" like LessWrongers (side note: we're coming up on the second anniversary of me outing my abusers after Kathy Forth's death. Congratudolences to me!)

let’s just say I have my own spreadsheets!

Here’s the original post: https://putanumonit.com/2017/03/12/goddess-spreadsheet/

I have two thoughts. One, being able to tell some kind of story about how you ended up with the best of all possible partners is a conventional aspect of courtship and marriage in our society. Everyone has a myth of origins for their relationship: this guy loves numbers so his has numbers in it. That said, given that he eventually told her about the whole scheme and she didn’t run away screaming, he did in fact choose the right woman:

After a couple months, in summer 2015, I told her about Sarah and the matrix. She reacted with excitement and curiosity. That was the first time I knew for certain that I had made the right choice.

I call bullshit. This guy's emphasis is all on finding the right woman for him. There's nothing in there about making himself a better partner, or working on their communication. He's being sneered at because he's relentlessly focused on numerically parameterizing the least important part of making a relationship work. There was an r/relationships post recently where a girl was found to be making spreadsheets about her relationship, and one of the things she was tracking was things her partner had said so she could buy him the best gifts. The reaction to that was AWWW WIFE HER. This guy? His spreadsheet is all about him and what he wants. It is, of course, important to consider your own wants in a relationship, to know your boundaries, needs, and dealbreakers, but he's completely self-centered about it.