• @Architeuthis · 11 points · 8 months ago (edited)

    How did Sam and Caroline get into taking high doses of ADHD medication? We think it was via Scott Alexander Siskind, the psychiatrist behind the rationalist blog Slate Star Codex.

    Siskind occasionally writes up particular psychiatric drugs as public education. One popular piece was “Adderall Risks: Much More Than You Wanted To Know” from December 28, 2017.

    Not to cast further aspersions or anything, but Siskind did write a sort of follow-up (titled “Psychopharmacology of FTX” or something like that, if you feel like googling it) where he explicitly denies ever having met the FTX psychiatrist/dealer, even though a) he admits they actually worked in the same hospital for a time and, perhaps more tellingly, b) no one asked.

    Also, according to the birdsite, the FTX psychiatrist may have in fact been a huge creep.

    • @maol · 7 points · 8 months ago

      A sleazy, woman-harassing psychiatrist who gives out dodgy prescriptions is the real face of EA. Just all the negative stereotypes associated with the 60s counterculture/New Left, with none of the redeeming features.

    • @YouKnowWhoTheFuckIAM · 6 points · 8 months ago

      The little paranoid devil on Siskind’s shoulder screaming horrible compulsive thoughts in his ear is my favourite character in this whole decades-long saga

  • @blakestacey · 9 points · 8 months ago

    I doubted whether it would be a good use of time to read Michael Lewis’s new book Going Infinite about Sam Bankman-Fried (hereafter SBF or Sam). What would I learn that I did not already know? Was Michael Lewis so far in the tank of SBF that the book was filled with nonsense and not to be trusted?

    I set up a prediction market,

    10/10 perfect LessWrong, no notes

    • @carlitoscohones · 5 points · 8 months ago

      The resulting book review is 28,776 words. It’s 71 pages long in 12-point Calibri with normal spacing.

      • @Soyweiser · 4 points · 8 months ago (edited)

        lol, guess I made a good choice to go ‘nah, not gonna read that’ and close the tab. That is like 3 SSCs.

          • @gerikson · 6 points · 8 months ago

            It seems that SBF is a counterexample to the ideal that an intelligent person can master every domain. Maybe that’s why they are so mad.

            • Mike Knell · 7 points · 8 months ago

              @gerikson RPGs have been putting intelligence on a different axis to wisdom since forever for a very good reason, but the SBFs of the world clearly never wondered why that is.

  • @gerikson · 8 points · 8 months ago

    I’m reading the Zvi piece (https://thezvi.substack.com/p/book-review-going-infinite), which is quite entertaining, but once in a while you stub your toe on the fact that the author is a True Believer:

    Putting the $500 million into Anthropic was arguably the most important decision Sam ever made. I do not know if investing in Anthropic was a good or bad move for the chances of everyone not dying, but chances are this was either a massively good or massively bad investment. It dwarfs in impact the rest of his EA activities combined.

    And SBF’s observation that the $6.5B spent on political campaigns was ludicrously low is blithely accepted as reasonable, rather than taken as a sign that campaign finance is broken.

    • @sinedpick · 5 points · 8 months ago

      ow fuck! my toe!

      What happened with SBF will happen with an AI given a similar target, in terms of having misalignments that start out tolerable but steadily grow worse as capabilities increase and you face situations outside of the distribution, and things start to spiral to places very far from anything you ever would have intended.

      Ah yes, one day someone will accidentally install the “I’m sorry, I can’t let you do that Hal” plugin. Oops, I let the nuke launch AI override all of our control mechanisms, silly me!

      I fucking hate x-risk people so much.

      • @swlabr · 16 points · 8 months ago (edited)

        Tangent to your point: what would happen if we started misusing TESCREAL terms to dilute their meaning? Some ideas:

        “I don’t want to go to that party. It’s an x-risk.”

        “No, I didn’t really like those sequel films. They were inscrutable Matrices.”

        “You know, holding down the A button and never letting up is a viable strategy as long as you know how to brake and mini-turbo in Mario Kart. Look up ‘effective accelerationism’.”

        Anyway I doubt it would do anything other than give us a headache from observing/using rat terms. Just wanted to have a lil fun.

        • @Amoeba_Girl · 13 points · 8 months ago

          i’ll definitely start using “existential risk” for any minor inconvenience, thank you

          • @self · 10 points · 8 months ago

            there’s significant x-risk in my need to clean my espresso machine conflicting with my extreme laziness preventing me from doing so

          • @Soyweiser · 8 points · 8 months ago (edited)

            If you think you are the only real human alive, all risks are existential. If you die they shut down the simulation. This is why Musk will never fly in one of his own rockets. And my bytes thank him for it.

            • Log 🪵 · 6 points · 8 months ago

              @Soyweiser @Amoeba_Girl Any sim-solipsist worth their processing time would know that even if one instance dies, certain calculations might be memoized and reused in other instances. If the rocket blows up, they can just reuse that sequence on another Musk sim whose rocket blows up, too. If you’re important enough to be the sole protagonist, why not be important enough to have a billion instances of yourself running concurrently in variant simulations?

              • @Soyweiser · 5 points · 8 months ago

                But are those copies really you? They are copies, after all, and eventually your instance might hit a dead branch in which all actions lead to death, while the only surviving branch of you has diverged so far in its choices that it can no longer be considered you. That is simply not a risk I can take.

                “This message was sent from my padded cell”

                • Log 🪵 · 5 points · 8 months ago

                  @Soyweiser As long as they are enough like me to still be better than everyone else, they pass the narcissism filter. All those billions will eventually have to fail somehow anyway, to determine the grand champion best possible me that will be copied the most for the next round.

        • Sailor Sega Saturn · 11 points · 8 months ago

          By my calculations, the red light prolonged my commute by 3 minutes, thus costing approximately 54 billion lives.

  • @swlabr · 8 points · 8 months ago

    At this point, if a rationalist says SBF is “smart”, it’s probably out of shame/denial that they got duped by a junkie.

    • @swlabr · 6 points · 8 months ago

      My own self-doubt asks: do rationalists feel shame, though?

      • @Soyweiser · 5 points · 8 months ago

        Some of them, yes. Or, well, most of them, I gather. They are just people, after all.

        • @YouKnowWhoTheFuckIAM · 9 points · 8 months ago

          My impression is that, as a group, on average, rationalists tend to both feel and repress more intense feelings of shame and guilt than the rest of society can be bothered dealing with, and I say that as somebody who has spent nearly two years doing addiction recovery

          • @maol · 8 points · 8 months ago

            LessWrong and EA can help people to understand logical fallacies, but they can’t help people to actually understand their emotions. In fact, the culture around them encourages adherents to feel contempt for their “irrational” emotions and for people who are led by emotion.

            Of course it is extremely unpleasant to repress all your emotions, and it is ultimately impossible to do so all the time. How did the LessWrong community solve this problem? Its users limited their emotional expression to acceptable forms and acceptable targets, and expressed their emotions through cult-accepted techniques like taking drugs, having sex, cyberbullying leftists and writing really long blogposts.

            Like most subcultures, it’s the powerful and respected people in EA who determine the dominant norms. With pretty much every leading EAist a middle-class dominant-culture American man who works in tech and wishes feminists would quit whining, it should be no surprise that the norms they created are stereotypically, nay, toxically white and masculine.

            • @YouKnowWhoTheFuckIAM · 5 points · 8 months ago (edited)

              Apparently, pace my own username, you don’t know who the fuck I am.

              I don’t think any of that first paragraph is true. LessWrong and EA very blatantly do not teach people how to spot fallacious reasoning. Nor does the culture of either movement encourage its adherents to repress their “irrational” emotions. Fallacious reasoning, emotional reasoning, irrational thinking - all three of these self-evidently ran rampant in the culture, so there has to be something else going on here which would explain both what the culture is like and why you have an impression that seems to line up so squarely with their self-presentation.

              Rather, it seems that what happens at LessWrong and EA is roughly that a charismatic self-presentation of “rational thinking” (with attendant ideas along the lines of repressing one’s emotions and so on) hooks in impressionable people, who - like victims of any multi-level marketing scheme - quickly replace their own styles and habits of thought with those propounded and taught by the movement. So those people do do something like “repress” their emotions, but only in the sense that they repress those styles of thought and emotional presentation which had previously come naturally to them. But of course the movement also teaches that it is right and proper or that there is even a sort of duty to make impassioned (whiny) emotional appeals to this or that privileged source of the right kind of emotions to feel (such as feeling indignant about normie reasoning, or feminism, or whatever), which are (some would say fallaciously) considered above rational criticism themselves.

              You can see that sort of thing play out in basically any rationalist discussion or article at Vox’s “Future Perfect”!

              So what you describe with respect to drugs and so on is true enough but misses the point. It’s rather that throughout the movement there’s a strong current of precisely the things that in its self-presentation the movement is supposed to ward off. The drug scene isn’t an outlet for repressed feelings, it’s just a particular place (of many) towards which the movement’s leaders have directed the energies (which they don’t repress but encourage) of their followers.

              The shame and guilt thing is a separate issue, it has nothing to do with the conscious or directed repression of emotions under the auspices of the movement.

              • AcausalRobotGod · 2 points · 8 months ago

                Indeed. They teach you to memorize a list of fallacies and biases (with their own weird names and jargon), and then you proceed to do whatever motivated reasoning you want, using them as weapons against wrongthink that disagrees with The Rationalist Viewpoint.

        • David Gerard (OP) · 7 points · 8 months ago

          obviously they need to read the sequences more

          • @Soyweiser · 4 points · 8 months ago

            Obviously everybody does, even this Yud guy, clearly not as smart as the genius who wrote the sequences.

      • @locallynonlinear · 4 points · 8 months ago

        Paradoxically, I think they literally are swimming in self-shame. A lack of processing that shame is why dumping five more pounds of it into their psyche doesn’t effectively alter any part of their behavior.

    • David Gerard (OP) · 4 points · 8 months ago

      Though SBF was correct that if you wrote a book, you fucked up.

      • @swlabr · 5 points · 8 months ago (edited)

        Please explain (I am not well steeped in the SBF lore/tea)

        • @Soyweiser · 7 points · 8 months ago

          iirc he said something like ‘I have no time for books, I don’t read them, only idiots write and read books nowadays’ (actual quote here) and he wrote a book, or at least had somebody write a book for him explaining himself re the lawsuit, I’d assume while his lawyers were outside screaming at him to stop.

          • @swlabr · 5 points · 8 months ago

            Oh lol, I googled it and Michael Lewis, the guy who wrote Moneyball and The Big Short (book ver.), wrote “Going Infinite”, which required following SBF for “the better part of a year.” Apparently people think it is too sympathetic to SBF. Shame if so, I liked Moneyball and The Big Short (movie ver.)

            • @Soyweiser · 7 points · 8 months ago (edited)

              Well, it is the season for previously reliable writers to write fluff books (see also Musk), so I would not be surprised if they cash out on their reputation by writing bad books; after all, you can’t leave a good reputation to your kids and grandkids, but money, otoh, you can.

                • @carlitoscohones · 3 points · 8 months ago

                  literary fellatio

                  this is new to me and I like it.

                • @swlabr · 3 points · 8 months ago

                  Ugh, I’m seeing that book (and face) everywhere, especially at the two bookstores I like the most (that is, the ones closest to me).

                  Nobody needs a book on how the author spent a year huffing his current billionaire crush’s farts.

  • @Soyweiser · 6 points · 8 months ago

    Mowshowitz believes that Bankman-Fried says whatever the person he’s talking to wants to hear. He doesn’t care whether any statements he makes are true or false. Sam only cared about making the number go up — to win at EA as if it were a winnable game.

    Mowshowitz is wrong, and Sam is right here. Remember: rationality is systematized winning

    • @blakestacey · 5 points · 8 months ago

      “Rationalists should win!” Not whine, win.

      (Wonka voice) Strike that, reverse it

  • @gerikson · 5 points · 8 months ago

    OMG he’s gonna testify?! This is the best thing that could happen, comedy-wise.

    • @gerikson · 5 points · 8 months ago (edited)

      Apparently it did happen, and it was hilarious, but it was not in front of the jury (dunno why)

      Thread starts here:

      https://twitter.com/innercitypress/status/1717601265914409209?s=20

      Update: the testimony was in front of the judge only, because there’s disagreement on whether all of it should be presented to the jury. The judge will decide, and then there’s a chance part of the testimony can be repeated in front of the jury, should SBF choose to do so.

      • David Gerard (OP) · 4 points · 8 months ago

        and Sam got out there and shot his mouth off at the jury today, and will do so even more on Monday

  • @JohnBierce · 4 points · 8 months ago

    Seldom have I seen anyone who has drunk their own kool-aid deeper than SBF.