People connected to LessWrong and the Bay Area surveillance industry often cite David Chapman’s “Geeks, Mops, and Sociopaths in Subculture Evolution” to understand why their subcultures keep getting taken over by jerks. Chapman is a Buddhist mystic who seems rationalist-curious. Some people use the term postrationalist.

Have you noticed that Chapman presents the founders of nerdy subcultures as innocent nerds being pushed around by the mean suits? But today we know that the founders of Longtermism and LessWrong all had ulterior motives: Scott Alexander and Nick Bostrom were into race pseudoscience, and Yudkowsky had his kinks (and was also into eugenics and Libertarianism). HPMOR teaches that intelligence is the measure of human worth, and the use of intelligence is to manipulate people. Mollie Gleiberman makes a strong argument that “bednet” effective altruism with short-term measurable goals was always meant as an outer doctrine to prepare people to hear the inner doctrine about how building God and expanding across the Universe would be the most effective altruism of all. And there were all the issues within LessWrong and Effective Altruism around substance use, abuse of underpaid employees, and bosses who felt entitled to hit on subordinates. A '60s rocker might have been cheated by his record label, but that does not get him off the hook for crashing a car while high on nose candy and deep inside a groupie.

I don’t know whether Chapman was naive or creating a smokescreen. Had he ever met the thinkers he admired in person?

  • swlabr · 2 points · 6 hours ago (edited)

    Alright, I’ve read the GMS post now. Unfortunately, because I am only coming to it now, ten years after it was first published, and through the framing of a Post-Mortem, whatever charm it may have had over me in its time is not apparent.

    Some thoughts:

    1. No examples. If you’re going to present to me a Grand Unified Theory of Subcultures (GUTS, if you will), show me some evidence.
    2. The post proposes a “lifecycle”, i.e., a description of a subculture’s life from birth to death. He defines/describes birth intuitively. He says death is when the “cool”/cultural capital runs out, and that this is caused by popularity. Sure, except the meaning/value of cultural capital changes over time, especially for any cultural capital produced by a subculture. Initially, the “cool” is worthless outside the subculture; once the subculture gains popularity, the value soars. The contention here is that the cultural bubble eventually pops, tanking cultural capital. Now, the post doesn’t adequately delineate between the loss of “cool” inside and outside the subculture, but I think it’s safe to say the author thinks the “cool” simultaneously evaporates inside and outside of the subculture. I don’t think this is true. Plenty of subcultures experience booms and busts and live to die another day. This sometimes happens because the subculture doesn’t care about the outside world.
    3. So basically, this post is an economics-flavoured look at subculture evolution. Specifically, it is a liberal critique, and therefore incomplete. It’s fine to bring up different ideas of capital. It’s also fine to point out that subcultures can suffer from cultural colonialism, both in an abstract sense and the real sense (e.g. licensing, IP, Funko Pops, etc.). Where liberalism falls short is when it suggests that the solution to problems caused by colonialism is to learn to be capitalist/colonialist in turn. It’s not, as evidenced by fucking world history, unless you choose to ignore this fact and continue to be liberal.

    I can see why this sort of narrative might appeal to the rats/incel-coded people. OP has kind of said it all, I think. To add to this, rats love to invent patterns/tropes and pattern match, especially if this means they can pile on assumptions to the thing at hand. Think: sneer clubs, conflict theorists, other names for enemies of the rat community. Yes, the irony that I am doing that here to the rats is not lost on me. At least I’m not putting a name to it! (Pattern Matchers? Regexes?!?!?)

    Obviously, I think a better version of this post would entail:

    1. Explicit acknowledgement of the role of capitalism and colonialist tendencies in corrupting subcultures, and noting that the solution to this is not “subcultures with capitalist characteristics” but explicit anti-capitalism and anti-colonialism.
    2. A flowchart or state transition table that describes all the ways a subculture can evolve. Death is only one possible fate for a subculture; plenty live on in different ways. There is no GUTS, at least in terms of a straight-line narrative of how a subculture lives and dies.
    3. Examples.

    An example to illustrate some of my points (nb I have not thought this out, so it might blow up in my face upon further analysis): Internet piracy. I’d say it’s a subculture that, by its nature, is anti-capitalist and is thriving to this day. It requires an ultimately commercial framework to exist (i.e. the internet), but unless they shut the whole thing down, this is a non-issue. You can’t really sociopathically co-opt the cultural capital here; if you sell the shovels, hey, now you’re part of the subculture too, and those shovels had better dig good.

    And finally, RE: the Buddhism. Chapman is apparently an adherent of Vajrayana Buddhism, as opposed to a white-washed/westernised Consensus Buddhism. My upbringing had a Buddhist-influenced backdrop, but I personally never got into Buddhism itself in any appreciable form. That is to say, I couldn’t tell you what Vajrayana Buddhism is myself. That said, I am very familiar with the author’s conception of consensus Buddhism, and I will use that term in this thread. I’ll admit that whenever I encounter a Buddhist in the West, I assume they are a consensus Buddhist. It’s a yellow flag for me, in the same way that knowing that someone is into crystals or the zodiac is: not necessarily bad, just different. Not the point. There is a red-flag version of Buddhism to me, and that’s basically any white person who says they are Buddhist but isn’t a consensus Buddhist. Usually, when I encounter this kind of person, it’s some insane, hypercapitalist type with messed-up morality/rationality. So that’s kind of what I went in thinking, and it coloured how I read this.

  • istewart · 6 points · 15 hours ago

    I was somewhat influenced by Chapman myself, so naturally I find it hard to call his efforts a complete smokescreen. I think it’s more a matter of the subculture he’s addressing simply being too damned insular and full of itself. A little less than a decade ago, he seemed like one of the few people trying to help the extremely online think past Yuddite rationalism, EA longtermism, and the incipient weirdo cryptocurrency cults that were springing up. He has expressed being somewhat baffled and bemused by “TPOT,” such as it was, but I think it’s fair to say that his writings were one very important nucleus around which the TPOT social graph coalesced. That said, my impression of TPOT quickly became, and has since always been, that it’s mainly a bunch of people with advanced degrees and/or technical training and experience who are resentful that all that hasn’t given them greater status and influence. Hence the commitment to pseudonymity among many of the bigger personalities; that ship may still come in one of these days. Alex Karp is what TPOT people would become if they had the power they thought they deserved. Thus, up until the current rules-free era, the Bay Area moneymen have been careful to fund very, very few of these guys, because they risk bringing the whole edifice down with their severe personal instability.

    The “Geeks, Mops, Sociopaths” article is what’s most commonly passed around, but the foundational material of Chapman’s project is the work of developmental psychologist Robert Kegan: https://vividness.live/developing-ethical-social-and-cognitive-competence Kegan builds on theories of childhood psychological development from people like Jean Piaget*, and seeks to extend them into adulthood. As Chapman says:

    Most Western adults reach stage 3—the ethics of empathy—during adolescence. However, one needs to be at stage 4—the ethics of systems—to fully meet the demands of modern society. Unfortunately, getting to stage 4 is difficult, and only a minority of Westerners ever do. Kegan suggested that it’s critically important for our society to find ways to support the transition from stage 3 to 4—and I agree.

    Stage 3 in this model finds one conceiving of one’s identity relative to communal relationships such as family, cohort, and local community, while stage 4 has one conceiving of oneself relative to rationally-designed systems of laws and processes, i.e. a modern professional organization. Stage 5 is something that both Kegan and Chapman seem to be conjecturing about and actively seeking, rather than living or cultivating in others. Its ideal is for one to be able to hold the rules of various social systems and modes of interpersonal relation as objects separate from the self, rather than something in which one is irretrievably subjectively embedded, and to be able to gracefully transition between these systems as a given situation demands. Chapman’s Meaningness project is all about building a framework for people to transition from a stage 4 personality to a stage 5 personality, even though the stage 5 personality is as yet loosely defined.

    On Chapman’s suggestion, I read Kegan’s “The Evolving Self,” and at the time it did in fact help me make sense of people I knew who had gone to prestigious schools and attained advanced degrees, but nonetheless allowed themselves to be heavily influenced by woo and toxic spiritualism. But herein lies the pebble under the mattress of Chapman’s program: I was able to understand and deplore these people as stuck in a “less developed” “stage 3” personality, as having simply failed to make the most of the opportunities they had with the “stage 4-scaffolding” institutions they had been associated with. But you see, I had the key now, I was able to understand myself as a “stage 4” personality who wanted to make sense of the world via necessarily flawed rational systems, and I’m going to transcend beyond that any day now!

    Better understanding of myself came later; suffice it to say that I’ve gotten more out of the material on resentment and personal accountability in 12-step programs than I have from Chapman and Kegan. The last fucking thing I, or any of these other weirdos, needed was another progressive framework for personal development. The in-built ability to hold oneself up as more advanced or more capable is a fatal flaw for the people Chapman was trying to reach, who already had plenty of excuses to see themselves as superior. Chapman’s biggest vulnerability is that he insists on practicing empathy for people who are at best selectively empathetic, and at worst have abandoned empathy entirely. If he wants to hold onto that as a core spiritual commitment, fine, but it’s been long enough now to reflect that his project so far has basically been a failure. I don’t think he’s lived in the Bay Area for a while now, so I have to imagine his direct interaction with a lot of the big-name “stage 4” personalities he was implicitly criticizing has been limited, but it’s pretty plain to see that there has been no progress towards spiritually reforming the scene that he and they both influence.

    *Jean Piaget was really influential on a lot of the early computer-interaction thinkers like Alan Kay and the Macintosh design team, so having that link is another big “in” for savvier Silicon Valley types.

    • CinnasVersesOP · 3 points · 9 hours ago (edited)

      I think one of the biggest flaws of our friends is that they want there to be one hierarchy of power and capability, with Electric Jesus at the top, then them, then their admirers, then the rest of us. Yudkowsky is brilliant at getting people to give him money, good at getting them to give him sex, but not a scientist or a skeptic (I am told he asked for special powers to delete LessWrong comments which explain what he got wrong or did not see).

      The “geeks, mops, and sociopaths” model does not encourage people to look at themselves and ask whether their community’s problems are their own fault. It also does not encourage them to ask “I am a drama kid, you are a min-maxer, can we find a way to have a fun game of D&D together or should we find our own groups?”

      Alex Karp’s Wikipedia page has a wild gap from “trying to raise enough money to be a Bohemian in Berlin in 2002” to “senior exec at Palantir with a Norwegian bodyguard and spicy takes on the Gaza war.”

  • corbin · 11 points · 20 hours ago

    Fundamentally, Chapman’s essay is about how subcultures transition from valuing functionality to valuing aesthetics. Subcultures start with form following function by necessity. However, people adopt the subculture because they like the surface appearance of those forms, leading the subculture to eventually hollow out into a system which follows the iron law of bureaucracy and becomes non-functional through over-investment in the façade and the tearing down of Chesterton’s fences. Chapman’s not the only person to notice this pattern; other writers, running the spectrum from right to left, have described instances of it.

    I think that seeing this pattern is fine, but worrying about it makes one into Scott Alexander, paranoid about societal manipulation and constantly worrying about in-group and out-group status. We should note the pattern but stop endorsing instances of it which attach labels to people; after all, the pattern’s fundamentally about memes, not humans.

    So, on Chapman. I think that he’s a self-important nerd who reached criticality after binge-reading philosophy texts in graduate school. I could have sworn that this was accompanied by psychedelic drugs, but I can’t confirm or cite that, and I don’t think that we should underestimate the psychoactive effect of reading philosophy from the 1800s. In his own words:

    [T]he central character in the book is a student at the MIT Artificial Intelligence Laboratory who discovers Continental philosophy and social theory, realizes that AI is on a fundamentally wrong track, and sets about reforming the field to incorporate those other viewpoints. That describes precisely two people in the real world: me, and my sometime-collaborator Phil Agre.

    He’s explicitly not allied with our good friends, but at the same time they move in the same intellectual circles. I’m familiar with that sort of frustration. Like, he rejects neoreaction by citing Scott Alexander’s rejection of neoreaction (source); that’s a somewhat-incoherent view suggesting that he’s politically naïve. His glossary for his eternally-unfinished Continental-style tome contains the following statement on Rationalism (embedded links and formatting removed):

    Rationalisms are ideologies that claim that there is some way of thinking that is the correct one, and you should always use it. Some rationalisms specifically identify which method is right and why. Others merely suppose there must be a single correct way to think, but admit we don’t know quite what it is; or they extol a vague principle like “the scientific method.” Rationalism is not the same thing as rationality, which refers to a nebulous collection of more-or-less formal ways of thinking and acting that work well for particular purposes in particular sorts of contexts.

    I don’t know. Sometimes he takes Yudkowsky seriously in order to critique him. (source, source) But the critiques are always very polite, no sneering. Maybe he’s really that sort of Alan Watts character who has transcended petty squabbles. Maybe he didn’t take enough LSD. I once was on LSD when I was at the office working all day; I saw the entire structure of the corporation, fully understood its purpose, and — unlike Chapman, apparently — came to the conclusion that it is bad. Similarly, when I look at Yudkowsky or Yarvin trying to do philosophy, I often see bad arguments and premises. Being judgemental here is kind of important for defending ourselves from a very real alt-right snowstorm of mystic bullshit.

    Okay, so in addition to the opening possibilities of being naïve and hiding his power level, I suggest that Chapman could be totally at peace or permanently rotated in five dimensions from drugs. I’ve gotta do five, so a fifth possibility is that he’s not writing for a human audience, but aiming to be crawled by LLM data-scrapers. Food for thought for this community: if you say something pseudo-profound near LessWrong then it is likely to be incorporated into LLM training data. I know of multiple other writers deliberately doing this sort of thing.

  • saucerwizard · 11 points · 1 day ago (edited)

    Total smokescreen, he was a part of TPOT iirc. If the last decade taught me anything it’s that nerds are scum. The poor wittle nerd thing fails to hold up when you can sit back and watch them run scams and shit. Absolutely amoral sociopathic predator types.

    • CinnasVersesOP · +4 / −1 · 1 day ago (edited)

      Was TPOT a Twitter thing? It seems like LessWrong was all over Tumblr and Twitter.

      Most of us are harmless and just want to explore our special interests. But I don’t think any of our friends fits that description. I don’t think it was just about power games either: Scott Alexander really cares about peddling racist lies, and Yudkowsky seems to build his whole worldview around the idea that he is a world-historical figure (and maybe he is, but a Grigori Rasputin rather than an Albert Einstein). So neither the “clueless, losers, sociopaths” model nor the “geeks, mops, sociopaths” model explains what happened to LW or Effective Altruism.

      • JFranek · 8 points · 1 day ago

        Was TPOT a Twitter thing?

        If I recall correctly, TPOT literally means “That Part Of Twitter”

        • CinnasVersesOP · 10 points · 1 day ago

          Chapman’s advice seems pretty good for keeping an indie art scene small and suited to autistic introverts rather than big and normie-friendly, but not for realizing that LessWrong and EA are cults founded by bad people with bad goals, with an exoteric doctrine out front and an esoteric doctrine once you are committed.

          • istewart · 4 points · 9 hours ago

            an exoteric doctrine out front and an esoteric doctrine once you are committed.

            What you are describing here is the definition of occultism. There are different lessons for the “inner door” students, and getting there requires buy-in to the group’s differentiating ideas. The Xenu story in Scientology’s OT3 is a galvanizing popular example; the Catholic practice of adolescent confirmation is a more mainstream one that we’re more likely to have encountered in daily life. To summarize my spiel above with this context, I would say that Chapman’s problem is that he thought he could replace the harmful occultisms coming to predominate in Silicon Valley and associated spaces with a kinder, gentler, more scientifically informed occultism. It ain’t worked yet; you gotta give up the whole idea of progressing to a “higher level” or “deeper truth.”

            • CinnasVersesOP · 3 points · 9 hours ago

              occultism

              Another common example for Americans is “milk before meat” among the Latter-day Saints. The paper by Gleiberman above lays out how, once you are committed to the idea that altruism should be as effective as possible and that your intuitions about what is effective are not trustworthy, the Longtermists pull you into a dark alley where their friend Pascal is waiting to mug you (although longtermist EA never received a majority of EA funding). It’s all as sad as when I am trying to have a factual conversation with Americans online and they try to convert me to pseudoscientific racism.

          • saucerwizard · 7 points · 1 day ago

            It’s become more openly right wing since my time. Masks off and all that.

            • froztbyte · 6 points · 1 day ago

              I ran across it when it was still pitching “cozy twitter”, but rapidly also saw that a lot of that was driven by some of the homesteader and natalist types. then after a few rounds of looking into some histories and bigger posters, started backing away… can imagine it’s more mask-off now

  • swlabr · 4 points · 1 day ago

    David Chapman? The guy that ended John Lennon’s career? (jk, looking into this)