People connected to LessWrong and the Bay Area surveillance industry often cite David Chapman’s “Geeks, Mops, and Sociopaths in Subculture Evolution” to understand why their subcultures keep getting taken over by jerks. Chapman is a Buddhist mystic who seems rationalist-curious; some people describe him as a postrationalist.

Have you noticed that Chapman presents the founders of nerdy subcultures as innocent nerds being pushed around by the mean suits? But today we know that the founders of Longtermism and LessWrong all had ulterior motives: Scott Alexander and Nick Bostrom were into race pseudoscience, and Yudkowsky had his kinks (and was also into eugenics and Libertarianism). HPMOR teaches that intelligence is the measure of human worth, and that the use of intelligence is to manipulate people. Mollie Gleiberman makes a strong argument that “bednet” effective altruism with short-term measurable goals was always meant as an outer doctrine to prepare people to hear the inner doctrine about how building God and expanding across the Universe would be the most effective altruism of all. And there were all the issues within LessWrong and Effective Altruism around substance use, abuse of underpaid employees, and bosses who felt entitled to hit on subordinates. A '60s rocker might have been cheated by his record label, but that does not get him off the hook for crashing a car while high on nose candy and deep inside a groupie.

I don’t know whether Chapman was naive or creating a smokescreen. Had he ever met the thinkers he admired in person?

  • CinnasVersesOP · 2 days ago

    Chapman’s advice seems pretty good for keeping an indie art scene small and for autistic introverts rather than big and for normies, but not for recognizing that LessWrong and EA are cults founded by bad people with bad goals, with an exoteric doctrine out front and an esoteric doctrine once you are committed.

    • istewart · 1 day ago

      an exoteric doctrine out front and an esoteric doctrine once you are committed.

      What you are describing here is the definition of occultism. There are different lessons for the “inner door” students, and getting there requires buy-in to the group’s differentiating ideas. The Xenu story in Scientology’s OT3 is a galvanizing popular example; the Catholic practice of adolescent confirmation is a more mainstream one that we’re more likely to have encountered in daily life. To summarize my spiel above with this context, I would say that Chapman’s problem is that he thought he could replace the harmful occultisms coming to predominate in Silicon Valley and associated spaces with a kinder, gentler, more scientifically informed occultism. It ain’t worked yet; you gotta give up the whole idea of progressing to a “higher level” or “deeper truth.”

      • CinnasVersesOP · 1 day ago

        occultism

        Another common example for Americans is “milk before meat” among the Latter-day Saints. The paper by Gleiberman above lays out how, once you are committed to the idea that altruism should be as effective as possible and that your intuitions about what is effective are not trustworthy, the Longtermists pull you into a dark alley where their friend Pascal is waiting to mug you (although longtermist EA never received a majority of EA funding). It’s all as sad as when I am trying to have a factual conversation with Americans online and they try to convert me to pseudoscientific racism.