Appoxo@lemmy.dbzer0.com to Technology@lemmy.world (English) · 1 year ago

Cloudflare plans marketplace to sell permission to scrape websites

techcrunch.com

155 points · 10 comments · cross-posted to: technology@lemmy.ml

Cloudflare’s new marketplace will let websites charge AI bots for scraping | TechCrunch
techcrunch.com
Cloudflare announced plans on Monday to launch a marketplace in the next year where website owners can sell AI model providers access to scrape their […]

  • gaylord_fartmaster@lemmy.world · 48 points · 1 year ago

    They’re already ignoring robots.txt, so I’m not sure why anyone would think they won’t just ignore this too. All they have to do is get a new IP and change their useragent.
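A quick illustration of why that evasion is so cheap: robots.txt is purely advisory, and the User-Agent header is self-reported. A minimal sketch (the URL and browser string are placeholders; `requests` is a third-party library):

```python
# Sketch of why user-agent based rules are weak: robots.txt is purely advisory,
# and the User-Agent header is whatever the client chooses to send.
import requests  # third-party: pip install requests

headers = {
    # Any browser-like string will do; nothing on the wire verifies it.
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:128.0) Gecko/20100101 Firefox/128.0",
}
resp = requests.get("https://example.com/", headers=headers, timeout=10)
print(resp.status_code, len(resp.text))
```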

    • redditReallySucks@lemmy.dbzer0.com · 6 points · 1 year ago

      Cloudflare is protecting a lot of sites from scraping with their PoW (proof-of-work) captchas. They could allow access to the people who pay.
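For context, a proof-of-work challenge just makes every request cost CPU before it is served. A minimal hash-based sketch of the generic idea, not Cloudflare's actual scheme; the difficulty value is illustrative:

```python
# Minimal proof-of-work sketch: the client must find a nonce such that
# sha256(challenge + nonce) has a given number of leading zero bits.
import hashlib
import os
from itertools import count

DIFFICULTY_BITS = 16  # illustrative; higher = more client CPU burned per request

def leading_zero_bits(digest: bytes) -> int:
    bits = 0
    for byte in digest:
        if byte == 0:
            bits += 8
        else:
            bits += 8 - byte.bit_length()
            break
    return bits

def verify(challenge: bytes, nonce: int) -> bool:
    """Server side: one hash to check the client really did the work."""
    digest = hashlib.sha256(challenge + str(nonce).encode()).digest()
    return leading_zero_bits(digest) >= DIFFICULTY_BITS

def solve(challenge: bytes) -> int:
    """Client side: brute-force a nonce that satisfies the difficulty."""
    for nonce in count():
        if verify(challenge, nonce):
            return nonce

challenge = os.urandom(16)  # issued by the server per request
nonce = solve(challenge)    # negligible for one page view, costly at scraping scale
assert verify(challenge, nonce)
```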

  • scarabine@lemmynsfw.com · 25 points · 1 year ago

    I have an idea. Why don’t I put a bunch of my website stuff in one place, say a pdf, and you screw heads just buy that? We’ll call it a “book”

  • umami_wasabi@lemmy.ml · 19 points · 1 year ago

    How can I do this without Cloudflare?

    • Rikudou_Sage@lemmings.world · 22 points · 1 year ago

      Put a page on your website saying that scraping your website costs [insert amount] and block the bots otherwise.
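A low-tech sketch of that idea: match the user agents the AI-crawler vendors publish and answer with a pointer to the pricing page. Flask, the 402 status, and the bot list are assumptions for illustration; any such list goes stale, and impolite bots can simply lie about who they are.

```python
# Sketch: answer known AI-crawler user agents with a pointer to the pricing page.
from flask import Flask, request

app = Flask(__name__)

# Illustrative substrings of published AI-crawler user agents.
AI_BOT_MARKERS = ("GPTBot", "CCBot", "ClaudeBot", "Bytespider")

@app.before_request
def charge_ai_bots():
    ua = request.headers.get("User-Agent", "")
    if any(marker in ua for marker in AI_BOT_MARKERS):
        # 402 Payment Required, pointing at the page that states the price.
        return "Scraping this site costs money, see /scraping-terms\n", 402

@app.route("/")
def index():
    return "Hello, humans and ordinary search engines.\n"

if __name__ == "__main__":
    app.run()
```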

      • gravitas_deficiency@sh.itjust.works · 15 points · 1 year ago

        The hard part is reliably detecting the bots

        • melroy@kbin.melroy.org · 5 points · 1 year ago

          Also you don’t want to block legit search engines that are not scraping your data for AI.

          • gravitas_deficiency@sh.itjust.works · 7 points · 1 year ago

            Again: hard to differentiate all those different bots, because you have to trust that they are what they say they are, and they often are not
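For the big search engines there is at least a documented check: a reverse-then-forward DNS lookup, which Google and Bing describe for their crawlers. A stdlib sketch, with the trusted domain suffixes taken as assumptions to verify against the vendors' current docs:

```python
# Sketch: verify a client claiming to be a major search engine crawler by
# reverse-resolving its IP and then forward-resolving the hostname back.
import socket

TRUSTED_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def is_verified_search_bot(ip: str) -> bool:
    try:
        host, _, _ = socket.gethostbyaddr(ip)            # reverse lookup
        if not host.endswith(TRUSTED_SUFFIXES):
            return False
        forward_ips = socket.gethostbyname_ex(host)[2]   # forward confirmation
        return ip in forward_ips
    except OSError:                                      # no PTR record, DNS failure, ...
        return False

print(is_verified_search_bot("66.249.66.1"))  # an address in Google's published crawler range
```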

            • melroy@kbin.melroy.org · 5 points · 1 year ago

              Instead of blocking bots on user agent… I’m blocking full IP ranges: https://gitlab.melroy.org/-/snippets/619
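The same idea in a few lines of stdlib Python; the CIDR ranges below are documentation placeholders, not the ranges from the linked snippet:

```python
# Sketch: drop requests whose source address falls in a deny-listed CIDR range.
from ipaddress import ip_address, ip_network

BLOCKED_RANGES = [ip_network(cidr) for cidr in ("203.0.113.0/24", "2001:db8::/32")]

def is_blocked(client_ip: str) -> bool:
    addr = ip_address(client_ip)
    return any(addr in net for net in BLOCKED_RANGES)

print(is_blocked("203.0.113.57"))  # True: inside the first range
print(is_blocked("198.51.100.7"))  # False: not listed
```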

              • vinnymac@lemmy.world · 4 points · 1 year ago (edited)

                It certainly can be a cat and mouse game, but scraping at scale tends to be ahead of the curve of the security teams. Some examples:

                https://brightdata.com/

                https://oxylabs.io/

                Preventing access by requiring an account with strict access rules can curb the vast majority of scraping; then your only bad actors are the rich venture capitalists.
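As a sketch of what "strict access rules" can look like once every request carries an account or API key: a tiny in-memory token bucket per key. The rate, burst size, and key name are illustrative.

```python
# Sketch: per-API-key rate limiting once access requires an account.
import time
from collections import defaultdict

RATE = 1.0    # tokens refilled per second, per key
BURST = 10.0  # maximum burst size

_buckets = defaultdict(lambda: (BURST, time.monotonic()))  # key -> (tokens, last_seen)

def allow(api_key: str) -> bool:
    tokens, last = _buckets[api_key]
    now = time.monotonic()
    tokens = min(BURST, tokens + (now - last) * RATE)  # refill since last request
    allowed = tokens >= 1.0
    _buckets[api_key] = (tokens - 1.0 if allowed else tokens, now)
    return allowed

for i in range(12):
    print(i, allow("demo-key"))  # the first ~10 pass, the rest are throttled
```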
