r/SneerClub archives
Harvard's student newspaper writes a sneer piece about the local EA club (https://www.thecrimson.com/article/2023/3/30/ea-scrut/)
83

[deleted]

I simply choose to hate both and celebrate when my enemies fight each other.
Don't worry, a whole bunch of the people they interview are former Crimson writers! I was amazed to see the Crimson actually dig up some dirt on them
[removed]
same thing that's wrong with the Yale Daily News. It's written by smug clueless Ivy Leaguers who resemble EY and SA in many ways.

Seeing the same celebrity as always pop up. This really is getting some Scientology vibes (the LW post from a decade ago where one of them praised some of the things he learned at Scientology courses (which included a post from dgerard explaining why the post was dumb and dangerous) doesn't help with this feeling).

Unrelated, but recently the story dropped that Musk stole all the money Twitter employees put into a ‘donate some money to charity via Twitter and we will double it’ system (https://twitter.com/ZoeSchiffer/status/1641875618039238656), so the mention of Musk's philosophy here is a bit damning.

"pair debugging [human emotion]" is literally an "audit" or whatever from Scientology. Honestly, isn't this shit just exactly decentralized Scientology, with a bunch of little Miscaviges instead of one? AI instead of Xenu? Edit: Ok, to be clear, David is most likely a violent felon and or literal murderer, so, noting the absence of that behavior in the LW crowd. I momentarily forgot how terrible homeboy is.
I don't know, I was just wondering about it. It's hard to definitively make a statement like that: Scientology is a cult, EA/LW has a lot of cultlike elements (which are not always harmful), and I don't know enough about Scientology (I did once pick up a cheap copy of dianetics, but I think I have put it in the recycling bin). And all this stuff is bad enough on its own without being a Scientology copy. Lesswrongtology does have a bit of the same vibes: a fiction author, weird texts spread around to lure you in (dianetics vs the sequences), weird misunderstood science pushed to further the worldview (a reason why I think you should trash every copy of dianetics you can find, similar to the whole chariots of the gods books). [But perhaps the same could be said of all religions, and mankind ill needs a saviour such as these people.](https://www.youtube.com/watch?v=hsPkVWwaDo4&t=26s) (Headphone warning.) But yeah, we should be careful with generalizing too quickly and too much. Saying EA is trying to reinvent Judaism and LW is reinventing eschatological Christianity is funny, but we should be careful that we don't start to believe our own hype. (In the same way that LW believes their own hype that we sneer for clout, and Yud/Scott/Scott believe sneerclub was created just to harass him/him/him.)
One of the defining features of a cult is that it actively discourages people from forming meaningful relationships outside of the cult. By that metric EA seems to qualify, and in fact this Crimson piece does a great job of illustrating that. It initially seems hard to explain, in terms of their stated mission, how EAs could simultaneously believe that volunteering in your local community (e.g. soup kitchens) is a waste of time, but going on fun vacations with other EAs is one of the most effective uses of charitable donations. The EAs rationalize this but their rationalizations don't make sense. It all makes a lot more sense when you think about it in terms of cult behavior. Using up all of a person's spare time on ingroup social activities is a classic cult strategy. Things like "Hamming circles" sound a lot like love bombing, which is another classic cult behavior. Edit: as we've seen, and as reported in this piece, one of EA's *defining principles* is that you shouldn't prioritize the wellbeing of the people that you personally know over the mission of EA. They might as well just adopt "we're a cult!" as their motto.
Oh, that is fair, but those are somewhat different arguments from the direct comparisons to Scientology. The soup kitchen story annoyed me. I knew some people who did soup kitchen volunteer work for the homeless, and iirc getting people to actually scoop the soup into bowls for the homeless is easy; finding cooks who do all the work behind the scenes cooking for people is harder. Of course this was a local story, and that is another thing: you shouldn't just sit and wonder 'do our soup kitchens have enough volunteers?', you should be out there locally. And from the Scientology cult post on LW I got the idea that Scientology does a lot less 'lovebombing' than other cults. And well, imho the LW sphere is less a cult and more a sort of cult incubator, which gets you into the various LW subcults, with EA, MIRI, the houses, the meetups etc. And clearly their 'not a phyg' shirt is answering a lot of questions already answered by...
Yeah, thanks:

> I did once pick up a cheap copy of dianetics, but I think I have put it in the recycling bin.

is a strong contender for my tombstone.
Sorry, I edited the post with more shit later. (I do that a lot, terrible habit of having 'the spirit of the staircase' thoughts.) And it was only 50 cents and I mistook it for a science fiction book at first. (I also own one of 'Hubbard's' other books, a science fiction short story collection, which is funny because it also includes Hubbard's own stories (which are meh) and a piece from him on writing (which just sucks, and basically boils down to 'include some life experience in your writing'), after which various authors are included who called his advice great; the parallels to our subreddit's subject should be obvious here.) In addition to [the far-right edgelord angry puppies guy](https://rationalwiki.org/wiki/Theodore_Beale), it is interesting to see how often bad ideas spread via science fiction and how often the people behind them try to get a position of authority (like a writer/publisher).
Iain M. Banks had a copy of Dianetics that he had taken out and shot.
Fair, I think mine was prob recycled into toilet paper or cardboard or something. No idea what the end product of recycling is. Same place Robert Galbraith books end up.
last paragraph made me think of Niven and Pournelle lol
I think you are onto something.
It's Francis E. Dec's writings on the Communist Gangster Computer God, with all the race science preserved to boot

Lots of good stuff in here, though I do wish the author dispensed with the veneer of journalistic “evenhandedness” a bit more

– On Jargon

There are a lot of math terms, too. One of the readings constructs a complicated-looking product of derivatives to make the conceptual point that importance, tractability, and neglectedness are all vital considerations.

– On The Robot God Murdering Us All

How do you quantify incalculable destruction? “The Precipice: Existential Risk and the Future of Humanity,” a book by Toby Ord about existential risk, for which Harvard EA runs a reading group, puts the risk of extinction in the next century from unaligned AI at 1 in 10, higher than any other source of risk. Last July, Jurkovic wrote in a comment on the EA forum that “existential risks are high up on the list of most probable causes of death for college aged-people”: Assume that the probability of achieving superhuman AI by 2045 is 50 percent, and assume that the probability of death given superhuman AI is at least 10 percent. Then the probability of death by AI in the next few years might be comparable to around 1 in 6000, he wrote, explaining that this probability is similar to the two largest causes of death for “college-aged people in the US,” suicide and vehicle accidents, although he did not write out the calculations leading to this conclusion.
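
For reference, a minimal back-of-envelope sketch of how the two quoted assumptions combine. The article notes that Jurkovic did not write out his calculation, so the ~1-in-6000 per-year comparison is his, not derived here; the annualization below is a naive illustration only.

```python
# Back-of-envelope combination of the two assumptions quoted above.
# Only the two stated inputs are used; everything else is a naive illustration.
p_agi_by_2045 = 0.50       # assumed probability of superhuman AI by 2045
p_death_given_agi = 0.10   # assumed lower bound on P(death | superhuman AI)

p_death_by_2045 = p_agi_by_2045 * p_death_given_agi
print(f"Implied P(death from AI by 2045): {p_death_by_2045:.0%}")  # 5%

# Spreading that 5% evenly over the ~22 years to 2045 (not necessarily how
# the quoted ~1-in-6000 per-year figure was reached):
years_to_2045 = 22
per_year = p_death_by_2045 / years_to_2045
print(f"Naive annualized risk: about 1 in {round(1 / per_year)}")  # ~1 in 440
```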

– On Telling People What To Do With Their Lives (Cult Alert!)

“I love poetry. I loooove poetry,” he tells us. “Will I be going into poetry? No. Because I don’t think it will actually do good for people.” At the moment, Mysoré plans to concentrate in Computer Science and Linguistics.

Computer Science, as well as Statistics and Applied Mathematics, are fairly common concentration choices among the people we meet. Klapper tells us he knows someone in Harvard EA who studies Computer Science and dislikes it, but continues in the field because they believe it’s the most effective use of their time.

– On Being a Sociopath to Get People To Join (Cult Alert!)

Though Jurkovic declined our request to attend a social on the record, we can try to reconstruct the vibe from a guide that he posted on the EA forum called “How to Organize a Social.” Indeed, in the post, he records every step of preparing for a social in granular detail, providing recommendations for everything from grocery lists — CLIF Bars, Diet Coke, several varieties of vitaminwater — to music, such as the Spotify-curated playlist “my life is a movie.” Jurkovic suggests you make it easy for guests to find answers to anticipated questions: “The shoes on/off policy? Where the bathroom is? Where one can get water? What the wi-fi password is?”

Last year, Trevor J. Levin ’19, who is currently on leave as the co-president of the university-wide EA group, also created a list of recommendations for effective retreats: They should happen in the beginning of the semester, when people are less busy; include lots of time for one-on-one interactions and a “structured vulnerable/emotional thing”; and include a healthy mix of new recruits and “moderately charismatic people for whom EA is a major consideration in how they make decisions.” These suggestions were embedded in a long post, which, citing feedback from Ellertson, Davies, Jurkovic, and others, argues that college EA groups should focus more on retreats as a method of bonding.

[…]

“While most of the important cognition that happens is social/emotional, this is not the same thing as tricking or manipulating people into being EAs,” he wrote on the forum. Instead, retreats are meant to appeal to those who may agree with EA on some level but have not yet acted on it, and giving them time to “move closer to the values they previously already wanted to live by.”

– On Spending Money “Effectively”

We ask him how Harvard EA uses its grant money.

“It’s not my area of expertise,” Jurkovic says. “But ...” He pauses for 15 seconds. “Yeah, just sometimes we get funding for club activities.”

In 2022, we later find out, part of an Open Philanthropy grant was used to send Arete fellows and the University-wide EA group on a weekend trip to Essex Woods, a serene, Thoreauesque venue an hour north of campus that charges about $5,000 per night. According to GiveWell, donating $10,000 to the nonprofit Malaria Consortium could save the lives of five people.

The schedule was similar to that of a corporate retreat: workshops, games, dinner, hot tub, Hamming circles. Well, maybe not the last one. Hamming circles are an activity where three to five participants sit down together and talk through one problem facing each member in 20-minute chunks. It’s “similar to what happens in a pair debug,” a post on an EA-related forum explains. These problems might vary, the post says, from “Is it possible for me, specifically, to have a meaningful impact on existential risk” to “I need to secure $250,000 in seed funding for my startup” to “I’m expected to speak at my father’s funeral and I have nothing but scathing, bitter, angry things to say.”

Open Philanthropy also issued a $250,000 grant for the Centre for Effective Altruism to “rent and refurbish” an office for the Harvard AI Safety Team in Harvard Square for one year.

[…]

“Don’t like including the actual words EA in the name of the space,” Levin (who, for his part, liked “Apollo”) wrote in the comments. “It increases the chances of hypocrisy charges (from people who haven’t thought much about the effects of nice offices on productivity) for getting a nice central office space while ostensibly being altruistic.”

[…]

Three Harvard students, including Davies and Gabrieli, were recipients of Open Philanthropy’s fall 2022 University Organizer Fellowship, for which the organization recommended a total of $3.1 million across 116 recipients. Gabrieli declined to be interviewed for this article. Davies says he doesn’t know if he’s allowed to disclose how much money he actually got, but that he considers the grant to be “an hourly wage,” since he quit previous jobs to focus on developing HAIST.

In February 2022, Open Philanthropy recommended a $75,000 grant to Pollack, the other HAIST co-founder, “to support her work organizing the effective altruism community at Harvard University.”

[…]

Harvard EA is aware that this allocation of money can appear at odds with their stated mission. After the Essex Woods retreat, organizers sent out a feedback form. “How much did the spending of money at this retreat make you feel uncomfortable?” one question asked.

We talk to Levin, the University EA co-president, and he likens it to the way that companies spend money on recruitment. “The idea is that there are problems that are much more talent-constrained than money-constrained,” he tells us. AI safety, a problem that relatively few people are working on, is an extreme example of this, he says. “The question then becomes, ‘Okay, well, if we have money and not people, how do we convert between the two?’” Levin pauses and corrects himself: “My train of thought there sounded kind of like I was saying, well, if you have a bunch of money, what do you do with it, right? That is not what I think.” What he does believe is that physical environments like retreats can rapidly accelerate the rate — by up to 100 times, he writes on the forums — at which people get on board with EA principles.

– On Why Almost Everyone Joining Is a White Male

We ask Jurkovic if he’s aware of demographic imbalances within EA groups at Harvard. He pauses. “I think it is quite important to have a community which is welcoming to everyone,” he says. “EA sometimes shares a problem with the cause areas that it tackles” — meaning STEM fields — “which is that many of them have more males in them than average.”

[…]

Although some of EA’s focus areas deal with global health and economic growth in underdeveloped countries, its frameworks generally do not foreground race or gender. A version of the spring 2023 Arete syllabus posted on the Harvard EA website only mentions race in the overview of Week Four: Animal Welfare.

“One of the most important ways we can fail to identify the most important moral issues of our time is by unfairly shrinking our moral circle: the set of beings we deem worthy of our moral concern,” the syllabus reads. “For example, many whites in the US failed to identify that slavery was the moral issue of their age by excluding Blacks from their moral circle. To truly make the world better, we must look beyond the traditional moral horizon for those who are unfairly neglected by mainstream society. This week, we discuss one such group of beings: nonhuman animals.”

We ask Nickols, the Arete co-chair, about this framing. He tells us that it is important to keep the quote “in the context of where it was originally formulated.”
That last pair of paragraphs is a real doozy.
breaking up with my gf like "sorry you're not in my moral circle anymore"
“I expanded my moral circle to also encompass your sister”
[deleted]
It's technically not the worst idea but goddamn that framing.
> Assume that the probability of achieving superhuman AI by 2045 is 50 percent, and assume that the probability of death given superhuman AI is at least 10 percent.

Where did you pull these numbers out of?????

love this piece’s slow buildup to crazy dumbassery

One of the highest-profile effective altruists was the former billionaire Sam Bankman-Fried […] His commitment also turned out to be disingenuous: […] he described his apparent embrace of ethics as “this dumb game we woke westerners play where we say all the right shibboleths and so everyone likes us.”

Give me a tinfoil hat if you must but I honestly call bullshit on this. If you’re a true believer in EA and aren’t Big Yud-delusional you would recognize that a disgraced huckster being the head of the movement is horrible optics. Claiming after your downfall that it was disingenuous all along would be a very rational(ist-ic) way to protect EA’s rep.

Cannot overstate my agreement with this. Honestly, I'd go a step further. I think that those revelations about SBF's "true opinion" of EA were actually constructed by MacAskill and other EA leaders, in concert with SBF. After news of the FTX fuckery broke, they and SBF probably had a call and decided that getting him to completely denounce EA as just a stepping stone in whatever 5D chess he was playing would provide the best cover for them against the incoming PR shitstorm. Like *come on!* The leaks came out through EA-mouthpiece-using-Vox-as-a-cover Kelsey fucking Piper after SBF "forgot" that maybe speaking to a journalist a day after the collapse, spouting scoops left, right, and centre, might be a bad idea. The whole thing smacks of an operation.
They seem to fail to realize the problem: if SBF could get to a position of leadership by mouthing pieties, what does that say about the rest of the people in charge of EA?
I'm interested in your point, what do you mean?
If you're an EA true-believer and committed massive fraud to build up the resources to support EA, but then get caught in the massive fraud, you will make EA look really bad. If you're an EA true-believer, you don't want to make EA look bad. If you have no qualms about lying, lying about your belief in EA will probably mitigate some of the damage you did, so they can pretend to have been victimized by you in your fraud (even though they benefited considerably), and SBF has demonstrated that he has zero hesitation when it comes to lying. Idk glove fits IMO

We chat about his interest in folk punk music

Dude managed to miss that the entire “thesis” of the genre is at odds with EA.

There's a tiny handful of nazi folk punk bands so...
Fair enough. That's different from the life-measured-in-coffee-spoons, "we need 12 Diet Cokes for this meeting to be a success" type stuff, which is more what I meant than "leftist" (tho also that). Like, picture these Harvard rationalists at Plan-It-X.

A subject more appropriate for the Lampoon imo.

Effective altruism — efforts that actually help people rather than making you feel good or helping you show off —

Yes, exactly, this is why I’m a communist like a normal person.

Last July, Jurkovic wrote in a comment on the EA forum that “existential risks are high up on the list of most probable causes of death for college aged-people”

Yup this is absolutely a real thing and not at all a doomsday prophecy to keep the rubes scared and in line.

At a Boston-area EA event, for instance, “I’ve had conversations arguing about whether we should kill all wild animals, because they have negative lives,” Klapper says. “An ant colony must just have negative utility in the sense that they’re just not enjoying life, and so it’d be better if we just eliminated them.”

Jesus Hecking Christ don’t cut yourself on that fucking edge.

Even putting aside the “utility” of ecosystems being left to their own devices, which is good for, you know, every living thing on the planet… did these galaxy brain geniuses ever consider that living things have an intrinsic value in and of themselves? For all we know ants think being an ant rocks and are living their best lives! Just garrrghlblarglelargle urg

[deleted]
i see reddit is sending us their best again

Effective altruists seek to apply EA principles to personal decisions: what to study, where to work. If you are a college student interested in building EA communities, you might “consider not going to Harvard, as there are a bunch of people there doing great things,” Jurkovic wrote on the EA forum in December, suggesting that going to other colleges without strong EA movements could be better. (Was this something Jurkovic himself considered when applying to Harvard? “No,” he says, laughing.)

“I’m not going to Harvard because I’m a privileged egoistical little shit, I’m going to Harvard because I’m a brilliant future leader who will Do Things, and even though the global utility would be higher if I attended some other college where I could spread my magnificence, I’m still gonna go to Harvard like the privileged egoistical little shit that I am.”

You will probably work for around 80,000 hours in your lifetime — several of the people we talk to cite this estimate — and you should spend them doing things that count, even if they may not be things you enjoy.
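
As a side note, the 80,000-hour figure isn't derived in the piece; a minimal sketch of the round-number arithmetic it presumably rests on (all inputs assumed, not taken from the article):

```python
# Rough sketch of the usual arithmetic behind "80,000 hours in a lifetime".
# These inputs are assumed round numbers, not figures from the article.
hours_per_week = 40
weeks_per_year = 50
working_years = 40

career_hours = hours_per_week * weeks_per_year * working_years
print(career_hours)  # 80000
```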

These people are joyless. The echoes of Objectivism are obvious, and funny.

Though Jurkovic declined our request to attend a social on the record, we can try to reconstruct the vibe from a guide that he posted on the EA forum called “How to Organize a Social.” Indeed, in the post, he records every step of preparing for a social in granular detail, providing recommendations for everything from grocery lists — CLIF Bars, Diet Coke, several varieties of vitaminwater — to music, such as the Spotify-curated playlist “my life is a movie.”

Correction, these are joyless people who have no idea how to have fun, but with this detailed list of instructions they now know how to prepare a thing called a p.. paaa… par-ty? 100% vegan of course. DON’T YOU KNOW HOW MANY ANIMALS ARE KILLED PER SECOND IN THE US is a great conversation starter!

These problems might vary, the post says, from “Is it possible for me, specifically, to have a meaningful impact on existential risk” to “I need to secure $250,000 in seed funding for my startup” to “I’m expected to speak at my father’s funeral and I have nothing but scathing, bitter, angry things to say.”

These examples are pulled completely from thin air and do not in any way represent the mental headspace of the average member…

Levin pauses and corrects himself: “My train of thought there sounded kind of like I was saying, well, if you have a bunch of money, what do you do with it, right? That is not what I think.” What he does believe is that physical environments like retreats can rapidly accelerate the rate — by up to 100 times, he writes on the forums — at which people get on board with EA principles.

So instead of being privileged egoistical twats that admit they spent 0k on a fancy scientology-esque retreat for funsies, he now made it into a trolley problem where he’s arguing that the future value of possibly engaging high-value people such as themselves for The Movement is going to be more beneficial for humanity than using that money to save the lives of two kids in Africa right now, so he would let the trolley run over those kids in a heartbeat.

Legacy kids/cash cows aside, Harvard admissions seems to have the same standards as NPR: NO BORING NAMES ALLOWED

Claire Guo ’24, one of the Arete fellows and a former Crimson News editor, wanders up wearing two baseball caps;

What are the logistics of this? How/why would anyone wear two baseball caps at once?

The piece looked decent, but I’m sure “effective altruists” will take it as a sneer 🙄

Anything that doesn't portray EAs as the last heroic bastion between humanity and our robot overlords is gonna be taken as a sneer, let's be real