about 50% of traffic to programming.dev is bots who have marked their user-agents as such. I’m pretty confident the actual number is higher, but haven’t spent time validating.
WOW. Source for this?
Snowe is sysadmin of programming.dev…
So source: Snowe
Oh thanks lol
while others could be executing real-time searches when users ask AI assistants for information.
WTF? Is this even considered ai anymore? Sounds more like a Just-In-Time search engine.
The frequency of these crawls is particularly telling. Schubert observed that AI crawlers “don’t just crawl a page once and then move on. Oh, no, they come back every 6 hours because lol why not.” This pattern suggests ongoing data collection rather than one-time training exercises, potentially indicating that companies are using these crawls to keep their models’ knowledge current.
What's telling is that these scrapers aren't just downloading the git repos and parsing those. They aren't targeted in any way. They're probably doing something primitive, like just following every link they see and getting caught in loops. If the labyrinth solution works, that confirms it.
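A toy sketch of the failure mode described here: a crawler that just follows every link with no visited-set gets trapped by any cycle in the link graph (page A links to B, B links back to A), while one line of deduplication stops after every page has been seen once. The link graph below is a made-up stand-in for a real site, not anything programming.dev actually serves.

```python
from collections import deque

# Hypothetical link graph containing a cycle, standing in for real pages.
LINKS = {
    "/": ["/commits", "/issues"],
    "/commits": ["/", "/commits?page=2"],
    "/commits?page=2": ["/commits"],  # cycles back to /commits
    "/issues": ["/"],
}

def crawl(start, max_fetches, dedupe):
    """Breadth-first link-follower; `dedupe` toggles the visited-set."""
    fetched = []
    seen = set()
    queue = deque([start])
    while queue and len(fetched) < max_fetches:
        url = queue.popleft()
        if dedupe:
            if url in seen:
                continue
            seen.add(url)
        fetched.append(url)              # simulate fetching the page
        queue.extend(LINKS.get(url, []))  # follow every link on it
    return fetched

naive = crawl("/", max_fetches=20, dedupe=False)
polite = crawl("/", max_fetches=20, dedupe=True)
print(len(naive))   # 20 — only the cap stops it; it would loop forever
print(len(polite))  # 4  — one fetch per unique page, then done
```

This is also why a tarpit like the labyrinth works: it generates an endless supply of fresh links, so even a crawler *with* deduplication never runs out of "new" pages to queue.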
Lol the article 403s with my VPN on.
you evil AI you! /s