Bing's web crawler is becoming more efficient all the time, according to Fabrice Canel, Principal Program Manager for Webmaster Tools…
In a new blog post, Fabrice Canel, Principal Program Manager for Webmaster Tools, writes that his team is focused on improving crawl efficiency.
“To measure how smart our crawler is, we measure bingbot crawl efficiency. The crawl efficiency is how often we crawl and discover new and fresh content per page crawled. Our crawl efficiency north star is to crawl an URL only when the content has been added (URL not crawled before), updated (fresh on-page context or useful outbound links). The more we crawl duplicated, unchanged content, the lower our Crawl Efficiency metric is.”
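Put concretely, the metric Canel describes reduces to a simple ratio: the share of crawled URLs that yielded new or freshly updated content. The sketch below is purely illustrative, not Bing's implementation; the function and field names are hypothetical:

```python
# Hypothetical sketch of the crawl-efficiency metric Canel describes:
# useful fetches (new or updated content) divided by total fetches.
# This illustrates the idea only; it is not Bing's actual code.

def crawl_efficiency(crawl_log):
    """crawl_log: list of dicts with 'url' and 'outcome' keys,
    where 'outcome' is one of 'new', 'updated', or 'unchanged'."""
    if not crawl_log:
        return 0.0
    useful = sum(1 for entry in crawl_log
                 if entry["outcome"] in ("new", "updated"))
    return useful / len(crawl_log)

log = [
    {"url": "https://example.com/a", "outcome": "new"},
    {"url": "https://example.com/b", "outcome": "unchanged"},
    {"url": "https://example.com/c", "outcome": "updated"},
    {"url": "https://example.com/b", "outcome": "unchanged"},
]
print(crawl_efficiency(log))  # 0.5
```

Every re-crawl of unchanged, duplicated content adds to the denominator without adding to the numerator, which is why such fetches drag the metric down.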
For those unfamiliar, Bingbot is a “spider” that crawls the web looking for new and updated content. The bot crawls billions of pages per day, according to Bing, but it must strike a balance between crawling too often and not often enough.
The update serves as a follow-up to a previous talk at SMX Advanced in June. During that presentation, Canel said his team would embark on an 18-month effort to improve Bingbot’s efficiency. To do so, the team must find a way to accommodate both site owners and the search engine:
“The challenge we face, is how to model the bingbot algorithms based on both what a webmaster wants for their specific site, the frequency in which content is added or updated, and how to do this at scale.”
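One common way to balance those constraints, sketched here purely as an illustration and not as a description of bingbot’s actual algorithm, is to adapt each site’s crawl interval to how often its content is observed to change: crawl sooner after a fetch finds fresh content, and back off after an unchanged fetch. The interval bounds and halving/doubling factors below are assumptions for the example:

```python
# Illustrative adaptive-scheduling sketch (not bingbot's actual algorithm):
# shorten the crawl interval when a fetch finds changed content,
# lengthen it when content is unchanged, clamped to fixed bounds.

MIN_INTERVAL_HOURS = 1.0
MAX_INTERVAL_HOURS = 24.0 * 7  # one week

def next_interval(current_hours, content_changed):
    """Return the next crawl interval for a site, in hours."""
    if content_changed:
        proposed = current_hours / 2  # content is fresh: check back sooner
    else:
        proposed = current_hours * 2  # unchanged: back off
    return max(MIN_INTERVAL_HOURS, min(MAX_INTERVAL_HOURS, proposed))

# Example: a site checked daily that sits unchanged twice, then updates.
interval = 24.0
for changed in [False, False, True]:
    interval = next_interval(interval, changed)
print(interval)  # 48.0
```

A scheme like this scales because each site needs only a tiny amount of state (its current interval), while frequently updated sites naturally converge to short intervals and static ones drift toward the maximum.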
Canel says Chen Lu, the engineering lead for the crawl team, will continue to update the public on the team’s progress.