In response to a question, Google’s Webmaster Trends Analyst, John Mueller, says that there’s no benchmark for what is considered an optimal crawl budget.
Crawl budget refers to the number of pages Googlebot crawls and indexes on a website within a given timeframe. It’s usually based on the site’s speed, which limits how fast Googlebot can crawl, and on how much demand there is to crawl the site’s content.
But why is it important for search engine optimization (SEO)?
It’s simple: if Google doesn’t index a page, it can’t rank for anything in search. That means if the number of pages on your site exceeds its crawl budget, some of those pages won’t be indexed.
Similarly, a higher crawl budget can keep the popular content on your site fresh while preventing older pages from becoming stale.
This raises an essential question:
Does Google Have an Ideal Crawl Budget?
It all began on a Reddit thread, where an SEO asked whether Googlebot has an ideal percentage of pages it should crawl daily.
Here’s the full question:
“While everyone talks about the crawl budget, I haven’t found a specific cutoff or a range for this. Like what should be the minimum percentage of pages out of total (or total changes made every day) should be crawled by GoogleBot every day to keep the content fresh?”
The SEO acknowledged that the mix of static and frequently changing content on a website could lead to variations, but wanted to know how to benchmark crawl budget.
In response to the question, Mueller wrote: “There’s no number.” In other words, there’s no specific figure to aim for when trying to improve your website’s crawl budget.
Of course, that’s not to say that there’s no benefit to improving your crawl budget. One of the best ways to do this is to limit the number of low-value-add URLs on your website.
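For example, one common way to cut low-value URLs is to disallow crawl traps such as internal search results and cart pages in robots.txt. The sketch below is a minimal illustration using Python’s standard-library robotparser; the rules and URLs are hypothetical placeholders, not recommendations for any particular site.

```python
from urllib import robotparser

# Hypothetical robots.txt rules blocking common low-value sections.
# Note: the standard-library parser only does prefix matching and does
# not understand Google's * and $ wildcards, so keep the rules to plain
# path prefixes when testing this way.
rules = """\
User-agent: *
Disallow: /search
Disallow: /cart
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(rules)

# Hypothetical URLs to check against the rules above.
for url in [
    "https://example.com/search?q=shoes",        # internal search result
    "https://example.com/cart",                  # cart page
    "https://example.com/products/blue-widget",  # real content page
]:
    allowed = parser.can_fetch("Googlebot", url)
    print(("crawlable" if allowed else "blocked").ljust(10), url)
```

Blocking those sections keeps Googlebot’s limited attention on the pages you actually want indexed.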
Here are other best practices to help maximize the number of pages that Googlebot crawls:
- Improving your site’s page speed
- Using internal links
- Using a flat website architecture
- Avoiding pages that have no internal or external links
- Limiting duplicate content (a rough detection sketch follows this list)
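On the last point, one quick way to catch exact duplicates is to fetch pages and compare content hashes. The sketch below is a rough heuristic under obvious assumptions: the URLs are hypothetical placeholders, and hashing only flags byte-identical responses, not near-duplicates.

```python
import hashlib
from urllib.request import urlopen

# Hypothetical URLs to audit; swap in pages from your own site.
urls = [
    "https://example.com/page-a",
    "https://example.com/page-a?ref=footer",  # query-string variant
    "https://example.com/page-b",
]

seen = {}  # maps a content hash to the first URL that produced it
for url in urls:
    body = urlopen(url).read()
    digest = hashlib.sha256(body).hexdigest()
    if digest in seen:
        print(f"Possible duplicate: {url} matches {seen[digest]}")
    else:
        seen[digest] = url
```

Pages flagged this way are candidates for consolidation, or for a canonical tag pointing at the preferred version.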
Also, site owners should monitor the crawl errors reported in Search Console and keep server errors as low as possible.
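Server logs offer a complementary view of those errors. As a rough illustration, the sketch below counts 5xx responses served to Googlebot in an access log; the log path and format are assumptions (a typical nginx/Apache combined log), so adjust both for your server.

```python
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path

# Matches the request and status fields of a combined-format log line,
# e.g.: "GET /page HTTP/1.1" 503
pattern = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')

errors = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = pattern.search(line)
        if match and match.group("status").startswith("5"):
            errors[match.group("path")] += 1

# Show the URLs that most often returned server errors to Googlebot.
for path, count in errors.most_common(10):
    print(f"{count:5d}  {path}")
```

Note that user-agent strings can be spoofed, so for anything beyond a quick check you’d want to verify that the requests really came from Google’s published IP ranges.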