r/SneerClub archives

It’s like Christmas. Acausalrobotgodmas.

i for one want a study to be done now about the effectiveness of their malaria net obsession in actually reducing malaria, since millions of nets is enough to show an observable effect

if it is effective they’ll go back to focusing on non-sci-fi bullshit, if it turns out to be ineffective it will be a 360 no-scope ending EA, win-win

There have been a ridiculous number of studies showing malaria nets to be effective. This [cochrane review](https://www.cochranelibrary.com/cdsr/doi/10.1002/14651858.CD000363.pub3/full) covering like 20 studies with 200 thousand participants concludes they save about 5 lives per 1000 nets. Malaria nets are great, which makes it all the more tragic that these people are as a whole deciding to neglect them in favour of AI doomerism.
i know nets work i was asking if there was a study to check if the millions of nets that are said to have been donated actually helped reduce malaria in those regions
IIRC people used the insecticide-coated nets for fishing, so that side effect has already been studied, and they probably spun up an education program in combination with the net program.
There is a [huge amount of evidence](https://www.cochranelibrary.com/cdsr/doi/10.1002/14651858.CD000363.pub3/full) that malaria nets reduce child mortality. There is definitely a problem with malaria net fishing, but that doesn't really cancel out saving the lives of thousands of children.
Oh yes, it certainly is good, just not as simply, cheaply good as the EA people make it out to be. The fishing problem can be fixed by education and by giving people access to real fishing nets, for example, and I assume they do the former at least. I didn't mean my comment as anti-net-giving, more as a jab at the 'just give 5 bucks for nets, save a life, bing bong, super simple, I'm a better human than you because my 20 bucks saved 4 people' attitude, and to show that the actual effect has been studied and the results weren't 100% positive (though, as you point out, they are positive).
Yeah, it also points out that charitable giving on its own can only do so much; if these countries weren't so impoverished from a long history of exploitation, they wouldn't have to choose between fishing and malaria.
Yeah, I think a lot of problems could be fixed politically if a few big countries just said 'that historical debt we imposed on you for exploiting you, or because you freed yourselves (IIRC Haiti is a scandalous example), is now gone, and we will even repay some of it (without a requirement to buy our products)'. That would go a long way toward fixing things. Sadly that involves the P-word the Rationalists hate.
P-word?
~~pussy~~ [Politics.](https://www.lesswrong.com/posts/9weLK2AJ9JEt2Tt8f/politics-is-the-mind-killer) (and yes, in their crusade against biases they created a bias against talking about, or understanding, politics. I hope the irony isn't lost on them)
For me the bigger issue is that an effective altruist organization like GiveWell, which prides itself on its impact-focused rationality, has discredited itself by e.g. doubling down on [deworming initiatives](https://katemanne.substack.com/p/against-swooping-in?sd=pf) even when these were shown to be of dubious value. It’s great if malaria nets are actually helpful, but I wouldn’t use any effective altruist organization as the sole arbiter.
Third and most likely route is worm wars 2.0
I’m an MS student in entomology. I work in a lab that studies hematophagous arthropod disease vectors. Although my thesis is on ticks, I raise Culex and Aedes as part of my regular routine to contribute to the work of the lab. My PI published a definitive study in PNAS showing how DEET actually works (it had been a subject of great controversy). So if you’re ever going to take something on authority: LLINs save lives and there are already plenty of studies.
> LLINs save lives and there are already plenty studies.

What I commented was "did the millions of nets donated by the Against Malaria Foundation have any visible material effect against malaria", not "do malaria nets work". I checked malaria rates for random countries covered by the map in the World Bank database, and I see a few U-shaped curves and many downward-sloping curves, but then again those don't control for infrastructure improvements or war or climate.

What did I miss? I’m out of the loop.

To add to acausualrobotgod’s comment, a lot of rationalists are complaining about hindsight bias and how could anyone have possibly known and how were they supposed to do better than all the investors that put money into FTX (because markets are magic, so obviously if investors are putting money into it, it must be a good bet). When in fact, to any outside observer familiar with cryptocurrency (i.e. the entire subreddit /r/buttcoin), FTX looked exactly like all the other crypto exchanges that collapsed under the weight of the scam (depending on the exchange, some mix of gambling with users’ money, getting hacked, and/or a “deliberate” exit scam). So their “prior” on “FTX collapsing due to some malfeasance or incompetence” should have been really high. But in fact, the rationalist community mostly uses the word “prior” for an opinion/bias they were already holding and don’t want to examine further, while also wanting to put fancy words on it.
Well obviously no _true_ EA would commit SBF‘s mistakes; it all just shows EAs need to be more devoted to the dictums of rationalism than ever before and, if pure enough of mind, acausal AGI willing, we may yet salvage at least some 10^35 ems in the promised futarchy!
Hi, I’m new to this shit show. What is an EA?
“Effective altruists”. At the (relatively) sane end, it’s trying to determine how efficient and effective charitable donations to different causes and/or charities are. I think the idea commonly associated with them (to the point of being a meme) is that a mosquito net worth a few dollars can literally save someone’s life in areas of the world where malaria is common. At the crazy end, they allow small-probability, high-impact, distant-future events (i.e. a strong AI either fixing or destroying the world) to dominate their concerns. One of the major EA donors, SBF, is the person at the center of the FTX scam. He indirectly used EA’s reputation to launder his own reputation and was motivated by the idea of “earning to give”. Earning to give is the idea that you should get rich in order to donate to EA causes… which may sound nice, or at least neutral, but it overlooks how jobs can be unethical: imagine a lawyer making lots of money defending horrific causes, or in this case, a crypto scammer earning lots of money via obviously scammy methods and setting up an entire set of grants (that now lack funding).
Thanks for explaining
An “effective” “altruist”, i.e. a rich fraud willfully incapable of seeing what‘s in front of their eyes. The whole movement is a recasting of Dickens‘ [telescope philanthropy](https://www.thecircumlocutionoffice.com/bleakhouse/charles-dickens-telescopic-philanthropy/) in the digital age, with a dash of eugenics sneaked in for good measure. In other words, nothing new under the Sun.
Thanks
A big effective altruist (SBF) was leading a big crypto ponzi scheme (FTX) that collapsed. Everybody's ragging on EAs, rationalists, cryptocurrency, and it's glorious.
I had no idea that FTX was run by an EA, that's hilarious. What a shitshow.
And what was EA's relationship with FTX exactly? Were they investing donation money with FTX, or was it just that they accepted donations from SBF?
The guy behind it all was an EA, birthed from the movement, it was his guiding light, and he had it as a huge part of his public profile. He gave generously to the EA movement and was its face for a while.
And arguably, his EA persona was instrumental in securing credibility, investment, and good PR for his activities.
MacAskill tried to get him an "in" with Elon, it's likely he made other introductions for him as well.
He was employed by them before leaving for crypto. Then he became their biggest donor and heavily used EA as a sort of brand to bathe himself in an aura of benevolence. He lived in the most expensive penthouse in the Bahamas, but somehow that was supposed to fit with effective altruism, and it wasn't like Peter Singer's example of skipping buying a new pair of boots to save a life. To kind of whitewash that, he would sleep on a beanbag in the office when press was visiting, and drive a Prius. The CEO he hired for the proprietary trading arm of the operation left earlier this summer to live on an effective altruist yacht.
SBF was recruited to EA by William MacAskill personally. MacAskill now feels utterly betrayed.
that’s honestly hilarious in light of MacAskill’s thread the other day, I didn’t realize they were THAT close
And MacAskill opened doors for him or at least tried to.
There have also been reports of orgies (or at least they apparently got laid a lot), large amounts of property being bought in tax havens, etc. etc. People trying to flee to non-extradition countries; one of the important people was a scammer before (IIRC he even tried an 'if we get caught we blame an insider hacker' plan before, and guess what, they're also blaming hackers right now). Wonder if we'll also get stories of massive drug use to complete the picture. And the EA people are scrambling to save face and pretend this is not something they could be blamed for.

To me it’s clear as day that SBF’s malfeasance and sheer evil is a direct line consequence of the philosophies of EA and rationalists. It’s a huge indictment of the community and the thoughts that have arisen from their philosophy.

The focus on detached utilitarianism, the weird “risk” calculus that is always wrong, the insistence that aggregated probabilities mean the ends justify the means. It’s all garbage … because it’s attempting to reinvent thousands of years of ethics from first principles.

EAers were trying to argue the world would be better if they were in charge. Well, they were in charge of a small segment of it. And they managed to perpetrate the largest financial fraud of the century. Bigger than Enron.

And you can go on about how cryptocurrency is worthless, blah blah, but many, many people had what they thought was hard USD on deposit and were fully expecting to get it back. Nope. SBF stole it.

> It’s all garbage … because it’s attempting to reinvent thousands of years of ethics from first principles.

Why do you believe this?
Perhaps because he's read The Sequences where they shit on all prior philosophy and try to do this.
"Okay, we're going to start out by shitting on everybody before us and everybody who disagrees with our cranky stance on everything. Then demand a principle of charity for anybody engaging with our work."
Exactly. Also lots of “hurdur how can we possibly understand intent and the mind of a person?” - uh, it’s called legal philosophy, and courts deal with this every day. Seriously, reinventing everything from first principles, especially when you have no prior knowledge of the relevant fields, is slow and error-prone.

This is just a very small cherry on top, but up here in Seattle the EA/rationalist-backed approval voting proposition is clearly going to lose, and it looks like the emergency RCV voting proposition (it’s complex) is going to pass for good measure.

Yes! I was following that. feelsgoodman.jpg
[feelsgoodman.jpg](https://i.imgur.com/GZf4Mv3.jpg)
What's the problem with approval voting? It seems... weird.
It’s not awful; I think it’d be an improvement over FPTP. However, you have to decide what approval means for you, because for every person you approve of past your top choice, you’re running the risk that you’re decreasing your top choice’s chances of winning. This isn’t that bad if you’re capable of being completely rational about voting but we live in the real world.
There are polities where I can imagine approval voting being alright; I can also imagine it being shockingly awful in lots of circumstances.
That sounds like RCV but worse
I don't think I have all the context here, but I thought approval voting was quite alright by itself, and maybe even better than RCV? https://ncase.me/ballot/#:~:text=the%20further%20to%20the%20right%20a%20voting%20method%20is%2C%20the%20more%20it%20%22maximizes%20happiness%22%20for%20the%20voters. Or here is the full article going into the nitty gritty: https://ncase.me/ballot/ What the chart with the blue bars shows me is that even in the worst case for approval voting, it still does better than the best case for RCV (in the chart it's called IRV). Or am I missing something? EDIT: Some more info comparing all three: https://www.youtube.com/watch?v=yhO6jfHPFQU
I mean, you just posted the same article twice, and it makes the case for approval voting as the best non-RCV alternative; I would be surprised if you didn’t come away with the impression that approval voting was better
OK hold up now, seriously: that article just outright claims Ralph Nader is why George Bush Jr beat Al Gore in 2000. What the fuck are you reading? Same for Trump in the 2016 Republican primary.
The chart they’re using literally comes from the same rationalist foundation that backed approval voting in Seattle. The “Bayesian regret” it refers to isn’t actual sentiment analysis, it’s a measurement of how close the winner of the election is to the preferences of the voter. The problem with approval voting is this: you have three candidates. Candidate A is very much your jam. Candidate B is someone you dislike. Candidate C is someone you could live with. They’re tied in the polls. 10% of the electorate agrees with you. You sigh and mark A and C. C wins the election over A by 5%. Whoops. Approval voting strongly decreases your ability to understand the effect of your vote. This is bad for voter morale.
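That three-candidate squeeze is easy to sketch in code. Here's a minimal toy tally (candidate names and turnout numbers are made up for illustration): a bloc that loves A but hedges by also approving C can hand the win to C, while bullet-voting for A alone would have elected A.

```python
from collections import Counter

def approval_winner(ballots):
    """Tally approval ballots; each ballot is a set of approved candidates."""
    tally = Counter()
    for ballot in ballots:
        tally.update(ballot)
    winner, _ = tally.most_common(1)[0]
    return winner, dict(tally)

# 100 voters; the 10-voter bloc "sighs and marks A and C".
hedged = [{"A", "C"}] * 10 + [{"A"}] * 30 + [{"B"}] * 25 + [{"C"}] * 35
# Same electorate, but the bloc approves only its favourite, A.
bullet = [{"A"}] * 10 + [{"A"}] * 30 + [{"B"}] * 25 + [{"C"}] * 35

print(approval_winner(hedged))  # C wins 45-40-25: the bloc's hedge beat its favourite
print(approval_winner(bullet))  # A wins 40-35-25: bullet voting elects A instead
```

The point isn't that approval voting is uniquely broken, just that the "approve of" threshold forces exactly this strategic choice on every voter.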
OK come on man, that article just has it in for RCV, like the whole play is to make RCV look weirdly awkward and complicated even in the diagram it uses to compare RCV with score and approval voting
Approval voting was rationalist bullshit? I thought it was just another attempt to get rid of Sawant.
Yep, sure was. The Center for Election Science is an effective altruism group, and SBF donated to the Seattle initiative directly.
I’m very confused by this sentiment, like, what does rationalist bullshit mean? Isn’t that an oxymoron?
More of a tautology tbh

Merry Argmas!! Wait a sec- ARG?? 🤯 Praximus_Prime, is that you?

Sometimes these things just happen for no reason!