Mountain View, California—Google Webmaster Tools is becoming a bit more comprehensive, with new features that will greatly expand the scope of the application. The search engine is rolling out more robust tools that will help site owners take greater control over their web properties.
With every future Google Webmaster Tools login, site owners will find they have access to more specific information to help them determine the source of bad or low-quality links, a change that comes right after the search engine enacted new ranking factors.
Changes in the Google Webmaster Tools Dashboard
Now, when the search engine sends a message to a site owner warning that their property has bad links or “defaced content,” example URLs will be included with the warning. This will give users a substantially better idea of where to look in their Google Webmaster Tools sitemap, pinpointing areas of concern.
“It’s much better than it was a few months ago, and we’re really looking for ways to provide even more guidance and a little more transparency so webmasters get a better idea where to look,” Matt Cutts, head of the search engine’s webspam team, said in a recent video message.
The changes were enacted in response to user feedback about the limitations of the Google Webmaster Tools account, which, until now, only stated that it had detected the presence of defaced content and bad inbound links.
Over the last several weeks, Google has been focusing more on user experience on the backend. One such change is the ability to manage settings across all Google properties from the Google+ dashboard.
Limitations to the New Google Webmaster Tools Warnings
Cutts also states in the video update that while messages will cite example URLs, they won’t necessarily include every single instance. The reasons, according to Cutts, are twofold: first, it might inadvertently help spammers devise end-run strategies around the system, and second, it could mean sending site owners messages so large as to be unwieldy.
Google has taken manual actions on sites with literally millions of pages, but only against a few particular pages of those very large sites. The warning messages that followed these manual actions didn’t include much detail, so those sites were left to do exhaustive research to locate the problem areas.
The search engine also recently released an updated version of one of its most powerful algorithms, Penguin 2.0. Google has likewise been busy identifying bad-neighborhood sites and link sellers, taking manual actions against those properties.