Are AI Crawlers Threatening Website Performance, SEO, and Bandwidth Costs?

AI crawlers are hitting websites in growing numbers, and the extra bot traffic is affecting the speed and search rankings of those sites. These crawlers belong to companies such as Anthropic, OpenAI, and Amazon, and they scrape websites to gather data for AI models. The pressure has grown severe enough that SourceHut has blocked several cloud providers, including Microsoft Azure and Google Cloud, because they were the source of excessive bot traffic.

According to data from Vercel, OpenAI's GPTBot made 569 million requests in a single month, while Anthropic's Claude crawler made 370 million. Taken together, AI crawlers generate roughly 20% of the request volume of Google's search crawler. DoubleVerify found an 86% increase in general invalid traffic (GIVT) in late 2024 attributable to AI crawlers, with 16% of that bot traffic coming from ClaudeBot, GPTBot, and AppleBot.

Chart: DoubleVerify

The Read the Docs project reported cutting its daily traffic from 800GB to 200GB by blocking AI crawlers, saving around $1,500 per month in bandwidth costs. AI crawlers differ from traditional search crawlers in both depth and frequency: they consume more resources by revisiting the same pages every few hours.

SEO professionals and website owners need to manage AI crawlers while keeping their sites visible in search results. Check server logs and bandwidth spikes for unusual activity, and monitor for heavy traffic to resource-intensive pages. Blocking rules in robots.txt and tools such as Cloudflare's AI Labyrinth can also help shut out unauthorized bot traffic.
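As a rough starting point for that kind of log check, the sketch below counts requests per AI-crawler user agent in a combined-format access log. The log path and the list of bot tokens are assumptions to adapt to your own setup.

```python
import re
from collections import Counter

# User-agent substrings for common AI crawlers
# (illustrative list, not exhaustive; adjust as needed).
AI_BOTS = ["GPTBot", "ClaudeBot", "CCBot", "Amazonbot", "Applebot"]

# Hypothetical path; point this at your own access log.
# Combined log format puts the user agent in the last quoted field.
LOG_PATH = "/var/log/nginx/access.log"

ua_pattern = re.compile(r'"([^"]*)"\s*$')  # last quoted field = user agent

counts = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = ua_pattern.search(line)
        if not match:
            continue
        user_agent = match.group(1)
        for bot in AI_BOTS:
            if bot in user_agent:
                counts[bot] += 1
                break

for bot, count in counts.most_common():
    print(f"{bot}: {count} requests")
```

A sudden jump in any of these counts, especially against the same handful of heavy pages, is the kind of anomaly worth investigating.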
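For the robots.txt route, a minimal example follows. The user-agent tokens shown are the ones these crawlers publicly identify with, but keep in mind that robots.txt is advisory: well-behaved bots honor it, while others may ignore it.

```
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Amazonbot
Disallow: /
```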
