Mountain View, California–Google Webmaster Tools can now offer its users more precise data for site variations, the company announced on its blog Monday. Over the past few months, the search engine has collected user feedback asking for the ability to track indexed URLs for the HTTPS sections of a site.
Until now, Webmaster Tools did not separate such variants; all indexed pages were grouped together. Because of the marked increase in HTTPS pages (pages that transmit data securely), the company responded by releasing a feature that lets users see data for specific variants of a site.
The New Google Webmaster Tools Tracking Feature
Not only can users differentiate data between HTTP and HTTPS pages; subdirectories can also be parsed. For instance, a site with both an HTTP and an HTTPS domain will show separate data for each protocol; however, each variant must be verified separately.
To gain access to these reports, users must verify all variants of their site in Webmaster Tools, including the www and non-www versions, HTTPS, subdirectories, and subdomains. The search engine encourages users to ensure their preferred domains and canonical links are configured correctly.
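As an illustration, a site reachable over both protocols and both hostname forms would be verified as four separate properties (example.com is a placeholder domain):

```
http://example.com/
http://www.example.com/
https://example.com/
https://www.example.com/
```

Each of these counts as a distinct property in Webmaster Tools, and reports are broken out per property.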
Sitemap and Robots.txt Files
Uploaded sitemaps ought to be revisited as well. In the announcement, Google writes, “if you wish to submit a Sitemap, you will need to do so for the preferred variant of your website, using the corresponding URLs. Robots.txt files are also read separately for each protocol and hostname.”
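As a sketch of what that means in practice: if a site's preferred variant is https://www.example.com (a placeholder hostname), the robots.txt served at that variant governs only that protocol and hostname, and the sitemap it references should list URLs using that same variant:

```
# Served at https://www.example.com/robots.txt
# The file at http://www.example.com/robots.txt is fetched separately
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
```

The HTTP variant would need its own robots.txt and, if submitted, its own sitemap with HTTP URLs.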
The ability to parse data more precisely will be of great value to site owners, allowing them to discover visitor behaviors and weigh information about specific pages. The company recently stated it would be revisiting its policy of "keyword not provided" and has warned sites about "thin content."
What do you think of the new tracking feature and how much will you use it?