Gary Illyes, of Google's Crawling and Indexing teams, explains Googlebot's crawl budget, elaborating on the differences between crawl rate limit and crawl demand…
On the official Google Webmaster Central Blog, Gary Illyes wrote an article titled "What Crawl Budget Means for Googlebot." The post clarifies several characteristics of crawl budget, going into more depth on the crawl rate limit, crawl demand, and the factors that affect crawl budget.
Googlebot Crawl Budget Clarified by Googler Illyes
“…we’d like to emphasize that crawl budget, as described below, is not something most publishers have to worry about. If new pages tend to be crawled the same day they’re published, crawl budget is not something webmasters need to focus on. Likewise, if a site has fewer than a few thousand URLs, most of the time it will be crawled efficiently,” Illyes explains.
Furthermore, Illyes writes that Googlebot is “designed to be a good citizen of the web,” meaning that although crawling is its top priority, the bot avoids degrading the experience of users visiting a site. Googlebot enforces a maximum fetching rate per site, the “crawl rate limit.” This limit rises if a website responds quickly, but if the site slows down, the rate drops and Googlebot makes fewer fetches.
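The feedback loop Illyes describes can be caricatured in a few lines of Python. This is purely an illustration of the idea, not Google's implementation; the class name, thresholds, and multipliers here are all invented for the sketch:

```python
# Conceptual sketch (NOT Google's code): a crawler that adapts its
# allowed fetch rate to how healthily a site is responding, mirroring
# the "crawl rate limit" behavior described in the post.

class AdaptiveCrawlRate:
    """Raises the allowed fetch rate when responses are fast and
    healthy; cuts it back when responses slow down or fail."""

    def __init__(self, rate=1.0, floor=0.1, ceiling=10.0):
        self.rate = rate        # fetches per second currently allowed
        self.floor = floor      # never stop crawling entirely
        self.ceiling = ceiling  # never hammer the site

    def record_response(self, latency_s, ok=True):
        # A fast, successful response earns a modest rate increase;
        # a slow or failed response halves the rate.
        if ok and latency_s < 0.5:
            self.rate = min(self.ceiling, self.rate * 1.1)
        else:
            self.rate = max(self.floor, self.rate * 0.5)
        return self.rate


limiter = AdaptiveCrawlRate()
limiter.record_response(0.2)        # healthy response: rate climbs
limiter.record_response(3.0, ok=True)  # slow response: rate is halved
```

The key property is the asymmetry: the rate climbs slowly but falls quickly, so a struggling server gets relief almost immediately.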
Crawl demand is a separate factor from the crawl rate limit. A site may sit well under its limit yet still receive few crawls: “…if there’s no demand from indexing, there will be low activity from Googlebot.” Illyes also explains that low-quality content negatively affects crawling and indexing. This affirms a recent Google ranking factors study showing that content is gaining in importance while links are losing ground.
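Put another way, Googlebot's activity is bounded by whichever of the two values is smaller. Treating both quantities as plain numbers, the relationship can be reduced to one line (an illustration only; this is not how Google actually computes crawl budget):

```python
def effective_crawl(rate_limit, crawl_demand):
    """Googlebot activity is capped by BOTH values: a site far under
    its rate limit still sees little crawling if indexing demand
    is low."""
    return min(rate_limit, crawl_demand)


effective_crawl(100, 5)  # returns 5: demand, not the limit, is the bottleneck
```

This is why raising server capacity alone will not increase crawling on a site whose content gives Google little reason to index it.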
Thanks for taking a few moments of your time to read this; if you like this information, please share it with your friends!