r/SneerClub archives
shocked that anyone could think Rationalists and Effective Altruists to be lying grifters. Spending the charity money on buying a literal fucking castle *clearly* maximizes the happiness of 10^54 hypothetical future human emulations living in the Matrix (https://www.truthdig.com/articles/the-grift-brothers/)

Their own conversation about it is quite entertaining: https://forum.effectivealtruism.org/posts/xof7iFB3uh8Kc53bG/why-did-cea-buy-wytham-abbey

There are some skeptics, but it takes only the slightest pushback: “well, conferences are expensive and we looked at 2 other venues. Buying an Oxfordshire abbey was just the maximum cost-benefit option because we are going to run such a great events schedule! This is maximising utility!”

And they immediately all go “oh well, that makes complete sense then, stupid normies just can’t do maths!”, even though: a) in no world is this the best-value venue; b) the “benefit” is running loads of events (minimum weekly) with lots of attendees, yet they’ve never heard of the venue; c) this is coming from people who will directly benefit from the use of the building as their workplace.

It’s amazing every time how all the Bayesian hyper-rational gumph turns out to be meaningless in the face of even the most spurious reasoning from the in-group.

Virtually overnight, “SBF”, as he is known, was left with about $100 million, although he now reports having only about $100,000 in his personal bank account.

Riches to “rags”

“Get filthy rich, for charity’s sake” lol, these people can’t possibly believe this shit

It's in our nature to tell ourselves a story that makes everything we do okay. I used to do that when I was younger and stupider. I would build convoluted narratives that justified this or that thing I was doing, and I would *fully* believe those stories, even though I would never otherwise believe such things had they been told to me by someone else. And this, in my opinion, is also how conspiracy theory works. I think the majority of Qanon folks don't *directly* believe the memes they spread. I think their brains take a circuitous route to those beliefs because they *wish* those beliefs were true. And I think a bunch of overprivileged, overeducated, undersocialized shit-heads building a bitcoin empire out of the Bahamas would do the same thing -- tell themselves a comforting story, and fully buy into it in the disposable present, up until their lifestyle changes and a new narrative is required.
I used to do this too. I still do, but I used to too.
Confirmation bias and motivated reasoning are rationalists' bread and butter.
Cognitive dissonance is a hell of a drug.
Michael Jordan couldn’t win without crushing his opponents, and he couldn’t crush his opponents without being mad at them. So when his opponents hadn’t done anything to make him mad, he would make up scenarios in his head that would justify him being mad, so that he could get mad and crush them. And decades later, he’s still holding a grudge over something he openly said he made up.
Not to get all psychoanalytic, but desire is a central component of all politics. If we could all come to terms with that we'd all be better off (have good desires!). These chucklefucks dressing up their own selfish desires as rationalism is the most insufferable thing imaginable.
It's like the "it is difficult to get a man to understand something, when his salary depends on his not understanding it" quote. If your ability to justify making a bunch of money running a crypto ponzi scheme and using it to live in a mansion where you do amphetamines and play League of Legends depends on believing that maximizes the wellbeing of the 10^54 post-singularity simulated humans, it becomes very easy to believe that. It's not any different than the guy saying "greed is good" to justify his corporate raiding, it's just incomprehensibly dorkier.
Many also publicly attest to following the teachings of Jesus Christ. With enough money, everyone around you will do their utmost to tell you what you want to hear before you even know what that is, and you'll already be paying them for it.
With enough "high IQ" psychological gymnastics you can rationalize anything
Nah, it's an absolutely perfect thing to tell a certain kind of people. It helps assuage their nagging doubts that getting rich isn't the right thing to do. "But if you get rich you can help people!" smooths that out nicely. It's the most effective kind of propaganda: Telling people to do what they already want to do.
I think they both do and don't. Bet you that if you were part of their inner circle you could have a good laugh with these fake charity nerd girls and boys about how bullshitty and fake it all is, and about where in the Tinder profile the wire fraud section should go. On the other hand, they would also just make themselves believe whatever bs they're spouting for the public.

“I thought there was such a thing as a good billionaire and I was a fool! A FOOL I SAY!”

who'd ever have thought that "doing bad to do good" might involve moral hazards
"Ow! Why did you kick me in the shin!" "For charity!" "How the fuck does that do anything for charity?!" "I haven't figured that out yet, but once I do it'll be genius."
contrariwise, giving rationalists wedgies produces measurable positive impacts here and now

One of the ideas behind EA isn’t that bad, the idea that a lot of money going into normal charities is getting lost/wasted etc on unrelated things to that charity.

But it looks like this crypto scam lost people 51 billion dollars. There is no way they can ever offset that; it is a number no traditional charity will ever reach. The only way to rationalize it as anything other than a colossal failure, and a reason to stop doing anything related to EA, is to go ‘well, crypto money isn’t real money, it is only real once you sell it and actually spend it on the charity’, but that just cuts the biggest pillar out from under the whole ‘get rich for charity’s sake’ idea. I don’t get how EA people sleep at night with all this on their shoulders. (Esp. when you also constantly see people on twitter going ’hey friends, sorry to bother you with this, but I got fired/didn’t get enough hours at my job/am disabled but get no help etc etc, I need 500 dollars for rent/an emergency’.)

For context, EA in 2021 got 10 mil in reported donations, which they said (asspull sound effect) is probably 1/4 of all total donations. So let’s be generous and say that is 40 million per year for the past 10 years: 400 mil, compared to a 51 billion loss (15 billion of that was apparently SBF’s personal money). What a joke. (Note: I didn’t read the report fully and just skimmed it for the first total, I’m probably wrong and just using Fermi estimates (but it is a Fermi estimate based on EA Fermi estimates, so double plus good (lol if it turns out their estimates included future spending of their wealth including the FTX funds, but I’m lazy and not going to check)).)
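The back-of-the-envelope arithmetic above can be written out explicitly. All the inputs here are the commenter's own unverified guesses (the 10 mil reported figure, the 1/4 fraction, the 10-year window, the 51 billion headline loss), not audited numbers:

```python
# Fermi estimate from the comment above; every figure is the
# commenter's assumption, not a verified statistic.
reported_2021 = 10e6        # reported EA donations in 2021 (per the comment)
assumed_fraction = 0.25     # comment's guess: reported = 1/4 of the real total
years = 10                  # generous assumption: that rate held for a decade

annual_total = reported_2021 / assumed_fraction   # implied total per year
decade_total = annual_total * years               # implied decade of donations

ftx_loss = 51e9             # headline FTX loss figure cited in the comment

print(f"implied decade of donations: ${decade_total / 1e6:.0f} million")
print(f"loss vs. donations: {ftx_loss / decade_total:.1f}x")
```

Even on these deliberately generous assumptions, the estimated decade of donations (~400 million) is two orders of magnitude smaller than the cited loss, which is the comment's point.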

To be fair, most of the 51 billion was "magic crypto money" or ponzi gains rather than real $ inputs. But the losses are still in billions.
Ah, the justification is simple, you have to think of the Many Worlds where this paid off, which is definitely the experimentally proven interpretation of Quantum Mechanics, and in which Worlds this definitely worked more often than it failed, despite the evidence of what happened.
did a basilisk write this
*hiss*
Yep, this thing basically blows any of the alleged "gains" of EA (insofar as they even are gains, etc.) out of the water in terms of scale and then pisses it all down the toilet, making the entire thing a humongous net loss.

They’re bending over backwards trying to justify this on HN right now lol

A lot of people aren't buying it, which is nice to see.

You don’t understand, the Centre for Effective Altruism didn’t buy it, it was the umbrella organization. Totally different! Don’t mention FTX/Alameda.
https://twitter.com/NathanpmYoung/status/1600877682396389376

it wasn't us, it was our thin plastic shield against criticism!

It’s incredibly frustrating that anyone ever took these people seriously. Like, the “this’ll help X future people” stuff was always made up, and they didn’t even bother to come up with a good lie either.