Open source devs are fighting AI crawlers with cleverness and vengeance

AI web-crawling bots are the cockroaches of the internet, many software developers believe. Some devs have started fighting back in ingenious, often humorous ways.

While any website might be targeted by bad crawler behavior -- sometimes taking down the site -- open source developers are “disproportionately” impacted, writes Niccolò Venerandi, developer of a Linux desktop known as Plasma and owner of the blog LibreNews.

By their nature, sites hosting free and open source software (FOSS) projects share more of their infrastructure publicly, and they also tend to have fewer resources than commercial products.

The issue is that many AI bots don’t honor the Robots Exclusion Protocol’s robots.txt file, the tool that tells bots what not to crawl, originally created for search engine bots.
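For context, robots.txt is just a plain-text file of per-crawler allow/deny rules that well-behaved bots voluntarily consult before fetching pages. The sketch below, using Python’s standard-library `urllib.robotparser`, shows how a compliant crawler would check those rules; the bot name `ExampleAIBot` and the URLs are hypothetical.

```python
from urllib import robotparser

# A hypothetical robots.txt: bans a crawler called "ExampleAIBot"
# outright, and keeps everyone else out of /private/.
rules = """\
User-agent: ExampleAIBot
Disallow: /

User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A compliant crawler asks before every fetch:
print(rp.can_fetch("ExampleAIBot", "https://example.com/repo"))       # False
print(rp.can_fetch("Mozilla/5.0", "https://example.com/repo"))        # True
print(rp.can_fetch("Mozilla/5.0", "https://example.com/private/x"))   # False
```

The catch, as the article goes on to describe, is that nothing enforces this check: a crawler that simply never calls the equivalent of `can_fetch`, or that lies about its user-agent string, sails right past it.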

In a “cry for help” blog post in January, FOSS developer Xe Iaso described how AmazonBot relentlessly pounded on a Git server website to the point of causing DDoS outages. Git servers host FOSS projects so that anyone who wants can download the code or contribute to it.

But this bot ignored Iaso’s robots.txt, hid behind other IP addresses, and pretended to be other users, Iaso said.

“It's futile to block AI crawler bots because they lie, change their user agent, use residential IP addresses as proxies, and more,” Iaso lamented.

“They will scrape your site until it falls over, and then they will scrape it some more. They will click every link on every link on every link, viewing the same pages over and over and over and over. Some of them will even click on the same link multiple times in the same second,” the developer wrote in the post.

Enter the god of graves

So Iaso fought back with cleverness, building a tool called Anubis.

Anubis is a reverse proxy proof-of-work check that must be passed before requests are allowed to hit a Git server. It blocks bots but lets through browsers operated by humans.
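The general idea behind a proof-of-work check is that the server hands each client a small puzzle that is cheap to verify but takes real CPU time to solve; a single human browser barely notices the delay, while a crawler hammering thousands of pages pays the cost on every request. The following is a minimal sketch of that idea (not Anubis’s actual implementation): the client must find a nonce whose SHA-256 hash of the server’s challenge plus the nonce starts with a given number of zero hex digits.

```python
import hashlib
import itertools

def solve(challenge: str, difficulty: int = 4) -> int:
    """Brute-force a nonce so that sha256(challenge + nonce) starts
    with `difficulty` zero hex digits -- the client-side work."""
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce

def verify(challenge: str, nonce: int, difficulty: int = 4) -> bool:
    """Check a submitted nonce -- the cheap server-side step."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

# One hash to verify vs. ~16**4 (about 65,000) hashes to solve:
nonce = solve("example-challenge")
print(verify("example-challenge", nonce))  # True
```

In the real deployment the puzzle is typically solved in the visitor’s browser via JavaScript, and the reverse proxy only forwards requests that arrive with a valid solution.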

The funny part: Anubis is the name of a god in Egyptian mythology who leads the dead to judgment.

“Anubis weighed your soul (heart) and if it was heavier than a feather, your heart got eaten and you, like, mega died,” Iaso told TechCrunch. If a web request passes the challenge and is determined to be human, a cute anime picture announces success. The drawing is “my take on anthropomorphizing Anubis,” says Iaso. If it’s a bot, the request gets denied.

The wryly named project has spread like wildfire among the FOSS community. Iaso shared it on GitHub on March 19, and in just a few days, it collected 2,000 stars, 20 contributors, and 39 forks.

Vengeance as defense

The instant popularity of Anubis shows that Iaso’s pain is not unique. In fact, Venerandi shared story after story: