r/SneerClub archives
Scott Alexander Siskind and Caroline Ellison discuss the risks of overleveraging in finance (originally posted Feb 9, 2021) (https://speakertoyesterday.tumblr.com/post/700814996538523648/worldoptimization-slatestarscratchpad-all)

damn. i hate when im reading through old stuff on reddit and in the middle of a sparkling, scintillating discussion i find someone has written over all her old comments with nonsense, fragmenting the discussion permanently. what hilarious, moving, romantic, haunting things could she have said? just to wash it all away, in this digital era of permanency? wow. that takes courage. i bet she was really cute, too

It's hard to put my finger on why these people have such egregiously shitty takes. They know enough to know people generally have concave utility of wealth, but not enough to think about why someone would prefer a moderate amount of wealth to a coin flip between infinite wealth and starvation. It's like they're clever, but stupid.
[deleted]
I think this is right on the money. It really is baked into their thinking; the whole idea of "friendly AI" that the movement is founded on is really the idea that human society, morality, desires, etc. can be reduced to math such that an AI can understand them.
they just don't believe zero is a number that can happen to *them*.
One thing I don’t get is how they only ever seemed to consider the value *even to them* in the abstract. There seems to be no calculation for “what will you look like when your entire net worth is gone”. Was the assumption (taking the calculus at face value, which…) that when you start at zero again you haven’t developed something of a reputation? It seems like the *way* you lose all that money should somehow be important.
Honestly this is probably the answer. I know that for myself, and for any other intelligent person who has ever gambled something big, there are ways of convincing yourself of how much of a positive long-term payout the non-zero side is.
It's so interesting because a lot of stuff about how traditional agriculture works is precisely about how important risk mitigation is vs. productivity maximization. I.e., if you have to reduce your productivity by 50% to ensure you don't fail completely every ten years, *you take that productivity loss, because otherwise you die*. So much about the "inefficiencies" of pre-industrial agriculture makes sense when you learn that.
I'm not sure I'm familiar with said inefficiencies, can you elaborate?
Stuff like growing a ton of different crops rather than specializing (because in case the wheat harvest fails you might survive on cabbages); the entire medieval tendency to divide farmland into thin strips and spread them out (so everyone has a piece of every type of land, and in case something ends up fucking with one particular piece of microclimate you can still hopefully live off the others); the relatively high proportion of land allocated to pasture or fallow, both as risk mitigation and because you need the fertilizer (there's an entire complicated set of incentives fighting over whether you should grow grain, which provides higher yield, or livestock, which is much less efficient but also helps create those higher yields because it gives fertilizer); etc. etc. Like, theoretically you might be better off growing, say, wheat (or whatever the highest-yield crop is), but because there's a limit to how much surplus you can save, people instead diversified and grew all sorts of crops with less yield (and also took losses, because growing several different crops requires a whole bunch of different management skills and such), because mitigating the risk of having *no* food one year is way more important than getting a higher yield in the years when you're doing well anyway.
Aha, I see what you mean - I don't normally think of those practices as "inefficiencies," but instead as (self/community) insurance + long-term soil maintenance. You're entirely right though. If I recall correctly, storage models suggest pretty similar lessons. Grain silos help keep prices higher in good years, as you can always build more storage and store more grain - but you can never take out grain that you don't have.
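The tradeoff described above can be put in numbers. Here's a toy sketch with made-up yields (the figures are illustrative assumptions, not actual agricultural data): the average favors the risky specialized farm, but any measure that punishes a zero year favors diversifying.

```python
import math

# Hypothetical yields: a specialized farm gets 100 units in a good year
# but fails completely one year in ten; a diversified farm gets a steady
# 55 units every year.
specialized = [100] * 9 + [0]
diversified = [55] * 10

def arithmetic_mean(yields):
    """Average yield: what a naive productivity comparison looks at."""
    return sum(yields) / len(yields)

def geometric_mean(yields):
    """Multiplicative (survival-weighted) measure: a single zero year
    drags it to zero, no matter how good the other nine were."""
    if min(yields) == 0:
        return 0.0
    return math.exp(sum(math.log(y) for y in yields) / len(yields))

print(arithmetic_mean(specialized), arithmetic_mean(diversified))  # 90.0 55.0
print(geometric_mean(specialized), geometric_mean(diversified))
```

On the arithmetic mean, specializing "wins" 90 to 55; on the geometric mean it scores zero, which is the formal version of "you can't eat an average."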
See, the thing is that they figure out how to lose other people's money on the coin flip and because they look smart people keep giving them money to do it.
The first part of this quote reads like someone on the verge of independently inventing the Martingale strategy.
This is literally it. Sam and Caroline embraced Martingale as the only logical approach to trading.
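For anyone unfamiliar, Martingale means doubling your stake after every loss so that the first win recovers everything. A minimal sketch (bankroll and bet sizes are hypothetical) of why it guarantees ruin with finite money:

```python
def losses_to_ruin(bankroll: float, base_bet: float = 1.0) -> int:
    """How many consecutive losses a Martingale bettor can absorb:
    the stake doubles after each loss until the next required bet
    exceeds what's left of the bankroll."""
    streak, total_lost, bet = 0, 0.0, base_bet
    while total_lost + bet <= bankroll:
        total_lost += bet
        bet *= 2
        streak += 1
    return streak

# With a $1,000 bankroll and a $1 base bet, nine straight losses
# ($1 + $2 + ... + $256 = $511) leave you unable to place the
# required $512 bet. On a fair coin a nine-loss streak has
# probability 2**-9 per run, so given enough rounds it is
# effectively certain, and it wipes out far more than the long
# string of $1 wins ever earned.
print(losses_to_ruin(1000))  # 9
```

The strategy only "works" with an infinite bankroll and a counterparty willing to take unbounded bets, which is roughly the coin-flip reasoning being sneered at here.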
SBF was on the record about it too: https://forum.effectivealtruism.org/posts/mdbSL9o8H2Z5mjKaj/fermi-paradox-and-the-st-petersburg-paradox
[I mean...](https://forum.effectivealtruism.org/posts/mdbSL9o8H2Z5mjKaj/fermi-paradox-and-the-st-petersburg-paradox)
> your downside is bounded at your entire net worth

I find this hilarious because the downside was in fact bounded by the net worth of their customers' deposits.
How could anyone simultaneously endorse ‘utility is a function of log-wealth’ and ‘EA endorse double-or-nothing coin flips’ …
Because they don't think that utility is a function of\* log-wealth. They think that foolish, non-rational people believe this, but they, the cleverest little boys and girls, know better.

\*: The mathematician in me can't help but point out how silly this construction is. Any function of wealth is a function of log-wealth too, and vice versa, because logarithms are invertible and you can just exponentiate log-wealth to get wealth. I'm sure there are ways to caveat the function you use to make this sensible, but on its face it's a silly thing to say.
U = ln(W) is pretty common, but really that functional form can be anything as long as it's concave.
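To make the concavity point concrete, here's a sketch (my own toy parameterization, not anything from the thread) of why a log-utility agent never stakes the whole bankroll, even on a flip with hugely positive expected value:

```python
import math

def expected_log_utility(wealth, fraction, upside=3.0, p=0.5):
    """E[ln(wealth)] after staking `fraction` of wealth on a bet that
    returns `upside` times the stake with probability p, else loses it."""
    win = wealth * (1 - fraction + fraction * upside)
    lose = wealth * (1 - fraction)
    if lose <= 0:
        return float("-inf")  # ln(0): any chance of total ruin dominates
    return p * math.log(win) + (1 - p) * math.log(lose)

# A fair triple-or-nothing flip has expected value +50% per dollar staked,
# yet betting everything gives expected log utility of minus infinity,
# while staking a quarter of the bankroll (the Kelly fraction for these
# odds) improves on not betting at all.
w = 100.0
print(expected_log_utility(w, 1.0))                  # -inf
print(expected_log_utility(w, 0.25) > math.log(w))   # True
```

That's the whole dispute in two lines: under any concave utility the all-in flip is catastrophic, so "EV-maximizing" your entire net worth only looks rational if you quietly assume linear utility.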
[This blog post](https://sarahconstantin.substack.com/p/why-infinite-coin-flipping-is-bad) written by what I believe is a fellow EA goes into a lot of detail about why this is dumb.

I mean, that’s how it should work. But in the specific case of FTX, Alameda was – for mysterious reasons we’re all still trying to figure out – special cased in the software to be exempt from margin calls.

The mysterious reason is crime.
that sounds like the title of a decent anime
It had surprisingly deep characterization for what could have been your standard heist story, and the English dub really dodged a bullet by not casting Vic Mignogna as Inspector Ferrite.
> “I think it might be time for Alameda Research to shut down. Honestly, it was probably time to do that a year ago,” Bankman-Fried wrote in a document titled “We came, we saw, we researched.”

what exactly did they "research", how to steal money?
SBF admitted the name was to make it easier to receive wire transfers, because banks didn’t want to provide any services to crypto companies
Ahhhh that explains a couple of things

Found this though an archive of Caroline Ellison’s tumblr, available here: https://caroline.milkyeggs.com/worldoptimization

Scott also responded to her directly in another post: https://slatestarscratchpad.tumblr.com/post/138519462631/do-you-think-there-is-something-about-rationalism

Milky eggs?
idk, I didn't choose the name

Is scott pissing himself in terror yet

The SEC's Basilisk
that has been happening daily for a long time

of course he made that painting his avatar lmaoooooo

edit: that ain’t Scott, I got got

Just for reference: speakertoyesterday is not Scott (afaik), they're just someone who was quoting the thread. I linked to their version because references to Caroline Ellison's defunct account have been removed from the existing post on Scott's account, and Caroline Ellison's original post is no longer accessible (but the quoted version is both available and still has a reference to Caroline's account for some reason, I guess that's just how tumblr works). FWIW [Scott's account](https://slatestarscratchpad.tumblr.com/) uses art from the Codex Seraphinianus, along with whatever that necklace is.
Very cool and relevant that the Codex Seraphinianus is on the surface a fascinatingly dense and mysterious arcane object, which upon closer investigation is revealed to be the not-all-that-obscure doodling of a hypermanic industrial designer with mediocre taste in fantasy tabletop RPGs.

Don't get me wrong, the disjunct here is that I think the Seraphinianus rocks, but it ain't that deep
What's the fantasy tabletop rpg in question? Casual internet searching revealed nothing to me
None in particular, it was a broad reference to some of the artwork, and to Siskind being the kind of guy who's into mediocre tabletop RPGs.

I make no bones about being an awful nerd, albeit one who recently acquired a very sexy black turtleneck, but I get to lord the fact I never got into D&D etc. over my fellow dorks, along with my humanities degrees, 20/20 vision, and hard copy of the collected poetry of Ayatollah Khomeini. I will admit I was *briefly* into Warhammer 40k as a child, and do own one (1) black cape
Ah thanks for the clarification. I thought it might have been one of his alt accounts or sth.

If you ever come up with an infinite money hack, you’re wrong.

Also, like, who the fuck is going to sell you infinite or 1000x leverage on a futures trade without liquidation and margin call guarantees? They would have the same EV as you, but negative, and so they would never do it lol.

Well if you're Caroline Ellison, the answer is [your boyfriend's crypto exchange](https://www.fxstreet.com/cryptocurrencies/news/bankman-fried-lawsuits-expose-ftxs-special-treatment-of-alameda-research-202212140020)

Maybe I understand futures even less than they do, but I was under the impression your losses could be far greater than your investment, ie you can go into debt if things really go south

That is shorts.
True had those mixed up