More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.

While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous stance on taking a hands-off approach to moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie followed up later with a similar statement to the one today, saying “we don’t like or condone bigotry in any form.”

  • @sc_griffith · 6 months ago

    by causal link, I mean how does banning nazis cause support groups for non-offending pedophiles to get banned. like how does that actually happen. please be as specific as you can be

    • @Sanyanov@lemmy.world · 6 months ago

      I see.

      It’s not that banning Nazis directly causes the banning of non-offending pedophiles; it’s that banning people considered dangerous causes both, with Nazis merely setting the precedent (because they are obviously bad, and there’s little disagreement). The Fediverse is just one example where the banning doesn’t stop at Nazis. Other groups get banned too, sometimes without much consideration, and this happens on many different platforms - Tumblr, Discord, Facebook, and even daddy Elon’s Xitter, to name a few.

      This is part of my argument for why we need spaces with completely free speech. We cannot expect instance admins or even platform owners to be completely objective in their judgments of right and wrong, and we can’t trust them to be unaffected by societal stereotypes.

      Moreover, even in the ideal scenario where they are fully objective, their userbase might think differently, forcing admins to take measures against various marginalized groups anyway.

      At that point, it seems to me the only way out of this conundrum is having some platforms - not mainstream ones, mind you - that allow everything: platforms from which positive but initially rejected ideas can spread.

      • @sc_griffith · 6 months ago

        nobody but nazis wants to be on those lol. go post on gab or whatever if you want that. it’s free. you can do it. you just don’t actually want to

        • @Sanyanov@lemmy.world · 6 months ago

          Why would I want to post anything on Gab, a far-right platform?

          I hoped we’d keep having a sensible conversation.

          Substack, for its part, is used by a wide variety of authors and is absolutely not limited to Nazis.

          • @sc_griffith · 6 months ago

            the site you are imagining, the supposed free speech site? it converges to gab. this dynamic is basic and I can’t take you seriously if you don’t get this.

            • nazis are encouraged to be equal voices on a platform
            • they use the platform’s reach to radicalize fence sitters
            • other users, realizing their digital roommates are Nazis, are alarmed and leave
            • now it’s a nazi site

            what exactly do you think substack will consist of in two years if they don’t do a 180? the entire reason we’re having this conversation right now is that a bunch of substack writers said they would rather leave than hang out with nazis