r/SneerClub archives

[deleted]

Yeah, it shows they fail at their primary mission of avoiding long tail type of risk events.
[deleted]
> What seems not to have been built in is the possibility that people are straight up full of shit, that they’re corrupt, that the future is untrustworthy not just because things you don’t predict happening happen, but because you make any prediction at all

Think that is a pretty common flaw in Rationalism. Some of them are Quokkas, and others are pretending to be Quokkas to extract value from the community, so to say. Hence the whole focus on ingroups and outgroups (with the implicit assumptions about who can and cannot be trusted), and the expelling of people who suddenly can no longer be trusted (without then reevaluating(\*) why it keeps happening). To borrow some flowery language from SSC, they will always let everybody into the walled garden, but never consider that perhaps they should check people a little more carefully at the gates and create a few more rules to keep the loggers from coming in.

Lol, looked at the 'other discussions' tab to see what the users of r/ssc had to say. [Amazing](https://old.reddit.com/r/slatestarcodex/comments/yrze7k/the_ftx_future_fund_team_has_resigned/ivxvc0k/): "As long as the "line goes up" phase went on, nobody bothered to ask questions whether crypto has real-world utility rather than being a curiosity among nerds." They live in a cave.

\* this isn't entirely true, there is a lot of evaluating, but not much action in reaction to it.
> Think that is a pretty common flaw in Rationalism. Some of them are Quokkas and others are pretending to be Quokkas

Apparently today I am [one of the lucky 10,000](https://xkcd.com/1053/) -- not only to learn about the whole [rationalists-as-quokkas](https://www.reddit.com/r/SneerClub/comments/hfq580/we_ought_not_refer_to_the_rationalists_as_rats_as/) metaphor, but also the existence of quokkas in the first place.
Quokkas are cool (I learned of their existence the proper way, by playing [roguelikes](http://crawl.chaosforge.org/Quokka)). It should be noted, however, that Zero HP Lovecraft is a bad person (who, among other things, missed that Scott was only disguising himself as a Quokka), and one for whom the real HP Lovecraft had a single flaw: that he wasn't racist enough.
> You see. The time to reevaluate crypto has come, at the exact moment I realized there might be problems with it. That time couldn't possibly have been even one second before now.

\- me, a rationalist, frequent Updater of Priors
No need to update priors if you just pick the correct ones the first time. (hmm odd, I didn't receive an update for this message until just now even when it was posted 10 hours ago).
> what was built into that model of risk were various assumptions that Bankman-Fried is a fundamentally honest guy who just wanted to make a buck, that the risks themselves were bets on the face value economic case for making money in crypto

On the other hand, he literally described crypto as a Ponzi scheme. So I guess he was honest about his dishonesty, or something?

> (Matt Levine) I think of myself as like a fairly cynical person. And that was so much more cynical than how I would've described farming. You're just like, well, I'm in the Ponzi business and it's pretty good.

> (Joe Weisenthal) At no point did any of this require any sort of like economic case, it’s just like other people put money in the box. And so I'm going to too, and then it's more valuable. So they're gonna put more money in, and at no point in the cycle did it seem to like, describe any sort of like economic purpose?

> (Sam Bankman-Fried) So on the one hand, I think that’s a pretty reasonable response, but let me play around with this a little bit. Because that's one framing of this. And **I think there's like a sort of depressing amount of validity…**

https://www.bloomberg.com/news/articles/2022-04-25/sam-bankman-fried-described-yield-farming-and-left-matt-levine-stunned
That’s actually something I kind of respected Bankman-Fried for at the time. I’ve been distantly around corporate finance, investment banking, whatever, socially, for some portion of my life, and alongside my expressed anti-capitalist politics I have a certain shall we say healthy scare-quoted “appreciation” for a certain kind of finance guy who doesn’t pretend that they’re doing anything but seeing an opportunity and taking a bet; like you run into this other kind of guy who’ll say “blah blah blah we have a compliance team that’s half of Manhattan what are you suggesting about impropriety” just because you made a joke about whatever terrible thing happened that week at Deutsche (the guy works at Suisse).

It seems plausible *now* to see that quote as indicative of his lackadaisical attitude, but tbh I think it holds up as a statement of intent that, within the boundaries set, he was going to go after money that wanted to be made. What happened with the collapse is different, or it’s in a slightly different box, because he didn’t even *try* to play by his own fiscal rules. The game he set up with FTX was: you keep encouraging these guys to generate value with their dumb ponzi scheme, you try to keep things relatively organised, and you make enough money off the wins that it trades off well against the overall instability of the market, and you make sure you’re already home and dry when the ponzi collapses. But as I understand it, when things started looking dicey, rather than see the writing on the wall, he joined the bucket brigade of dodgy loans that were going to prop up the scheme a little longer, and quickly found himself using client funds to do it, i.e. when the ponzi fell apart he was still standing there with his cock out.

So yeah, obviously don’t put your trust in crypto, but tbf to the naivety of people who had his back, they only expected *him* to be worth trusting.
The image he was projecting was that he would bring common sense and straight talk to crypto land, and by now we all know how those guys normally talk. Hence the point about taking the economic case for FTX at face value.
On a complete tangent, I was thinking about hypocrisy vs "no pretenses", and how the latter seems like a better thing when all you look at is hypocrisy. But over time you begin to realize that when the status quo is hypocrisy, and it changes to doing the same thing without the pretense, it's almost always a change for the worse. Plus, of course, people often "tell it like it is" as a lying technique, whereby you freely and openly admit to what someone already knows in order to conceal something they don't expect. Basically, you act honest about the expected level of "shadiness" to conceal an outright scam.
The FTX business is an exchange which earns fees from trades, and should be highly profitable! And yet he used the principles of rationality to justify leveraging customer deposits to maximize utility in the world, and well, look where we are.

With this situation, I think it is entirely rational to say that "EA" and "rationalists" are... encouraging a highly unethical mindset! The top nameplate person has acted so unethically he'll be dealing with this for the rest of his life. I heard that El Salvador has all their BTC on FTX. Maybe he should give himself up to the feds asap?
as well as risk events close to the y axis
> It’s all very well writing up a complicated decision theory paper about how actually you should only pay special attention to extremely unlikely or unforeseeable science fiction plotlines in order to maximise utility when you’ve never, I dunno, first hand experienced the sort of insane misguided hype and outright fraud that happens every day in finance

My impression has been that one of the allures of longtermism is that it's a kind of flight from the uncertainties of reality, and this inherently immunizes it from this kind of correction. Statistics doesn't tell us who's going to go bankrupt tomorrow; it just tells us that we all must serve the robot overlord that will arise in four thousand years. As a corollary, what happens today or tomorrow -- and so on for every subsequent today or tomorrow -- is utterly irrelevant to our confidence in the coming of our robot overlord.
Yeah, that’s the other read. The big long term risks are big enough that for some people the payoff for working against them is always higher than anything in the here and now, even if they were never going to happen, and it’s certainly more satisfying to work on a project that can never fail in your own lifetime. But I think there’s a certain number of people for whom the appeal is riding a glorious wave into the future (also a “too far off to fail” project, but in a less maudlin key), like I think MacAskill is more one of these people, along with a few fans I’ve run into online.
> But I think there’s a certain number of people for whom the appeal is riding a glorious wave into the future That's fair. Though I think that some of the grandiosity that characterizes the gloriousness of this wave has always been the sentiment that No One Else Gets It and We're Mavericks, and this kind of attitude can feed on setbacks.
I have to admit that the “I wonder” at the beginning was rather a rhetorical aporia I mostly used to get the spiteful backbiting off the ground
I’ll add something to my previous thought, which was part of the motivation for my original comment but which I’d forgotten about by the time I replied to you. I did consider your point about this all being a flight from the uncertainty of here, today, but in an analogy to Christianity: there are those fire and brimstone priests whose ardent servitude is conditional on politics between their cardinal and bishop not interfering with the upkeep of the church hall.
My impression of MacAskill has always been negative. He seems in it for the glory: “I don’t want to be just any ol’ trust-fund philanthropist, so I have to dress up my philanthropy as a contrarian innovation where I’m a savior of humanity.”

Yet Will’s superior decision-making framework just led to Will making incredibly poor decisions and missing incredibly blatant fraud. And it led to SBF committing actual fraud, and hurting people. If you look at what SBF did donate to, almost all of it went to Future Fund entities that Will was on the board of, or to political entities that would garner him political power.

There’s just obviously a lot of self-serving motivated reasoning going on, at every level of EA here. It’s very, very hard to believe Will didn’t know things were not on the up and up. Perhaps he didn’t know the full extent — but that’s because he very purposely did not ask.
MacAskill started cropping up when I was leaving school and starting university, and there was no getting away from the fact that he was very annoying, and very successful for somebody so young. *Doing Good Better* was hard to fault *as such*, almost as if by design, but already there you have the impersonal ethics (good) extended to dismissing those concerns local to you (bad, and the sort of thing you can only write if you went to university somewhere like Oxbridge - I was in Northern Ireland); before the fans come after me, I know he doesn’t say *exactly* that, but look around you at what the movement means now.

I wrote a comment yesterday evening wondering whether there had been any longitudinal work done on the overall impact EA and GiveWell have had: what are their results *in terms of the charity sector at large*? I haven’t looked very hard, but what I found were a lot of raw numbers of malaria nets provided (marvellous), children cured of parasites (great) and… monies given. What I find concerning is that if you want to change the shape of the charity sector, you should know damn well why it was in the shape it was before; the scleroticism of much of the charity sector is, in many cases, a response to the fraught nature of international development. We’ve seen in recent years that it takes a while for the full consequences of the disrupter mindset to become clear at home, so when a lot of people are talking up the great raw numbers of EA and GiveWell, straight up denigrating the charity sector before they came along, as if nobody had ever thought of throwing money at a problem before, with the practical judgement of people like MacAskill at the top… well, you see what I’m driving at.

It’s not going to stop me from giving to malaria nets, but the spectre of the well-dressed white man with his smallpox blankets is leaning over my shoulder a little closer than before.
Yes. I do applaud them for the malaria net donations. But precisely as you said — there just seems to be an under-emphasis on guardrails against the known failure modes of philanthropy (e.g. more money going to the philanthropic machine itself than to those being helped)… and an over-emphasis on ideas that have a lot of moral hazard baked in. I suppose some of the dog & pony show is necessary in the sense that MacAskill isn’t donating his own money — and thus needs to raise funds via a “pitch”.
Personally fighting schadenfreude, imagining futurists turned panhandling doomsayers ("The Basilisk is Nigh") and util calculators desperate for a billionaire patron ("Have high IQ, will travel"). So many future utils lost. The AGI will be an eater of worlds. 😞
it will become just another excuse to support fascism

Such effective

More like INeffective altruism, am I right?
Curiously, resigning was the most effectively altruistic thing they could have done.

sure every other crypto company is shady as hell, but not these guys

See, this one was run by somebody in their tribe, so it must be good!

Enron Altruism

A new wrinkle, why are they always like this?

https://twitter.com/excedrinenjoyer/status/1591608620348997632