Google Starts Telling Website Owners to Remove Noindex Directives from Robots.txt Files

Google is reaching out to webmasters through the Search Console, directing them to remove noindex commands from their robots.txt files…

Search giant Google is emailing website owners and managers about an upcoming change to how it crawls sites.

Google Recommends Site Owners Remove Robots.txt Noindex Directives

The request stems from a longstanding practice, or rather a policy, that has been widely misconstrued. Google announced just weeks ago that it is ending support for the noindex rule in robots.txt, and it is now sending out an email that reads:

“Google has identified that your site’s robots.txt file contains the unsupported rule ‘noindex’. This rule was never officially supported by Google and on September 1, 2019 it will stop working. Please see our help center to learn how to block pages from the Google index.”

In the meantime, Googlebot will continue to obey noindex directives in robots.txt files until September 1st.
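For reference, the unofficial directive typically appeared in robots.txt alongside standard rules, in a form like the sketch below (the paths are placeholders, not recommendations):

```
User-agent: *
Disallow: /admin/

# Unofficial rule that Googlebot will stop honoring on September 1, 2019
Noindex: /private-page/
```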

What’s interesting, as the email states, is that Google never officially supported the command. Instead, the search company informally honored the practice as it grew more and more popular among websites.

Website owners have several alternatives: a noindex robots meta tag placed directly in a page’s HTML, 404 and 410 HTTP status codes, password protection, a Disallow rule in robots.txt, and the Search Console Remove URLs tool. The meta tag approach is sketched below.
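As a minimal sketch of the on-page option, a page that should stay out of Google’s index can carry a robots meta tag in its head (the surrounding markup is purely illustrative):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>Members Area</title>
    <!-- Tells crawlers not to index this page; an equivalent
         X-Robots-Tag: noindex HTTP response header also works -->
    <meta name="robots" content="noindex">
  </head>
  <body>
    ...
  </body>
</html>
```

Note that a Disallow rule in robots.txt only blocks crawling; the meta tag or HTTP header is what keeps an already-crawlable page out of the index.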

Owen E. Richason IV

Covers social media, apps, search, and related news. History buff, movie and theme park lover. Blessed dad and husband. Owen is also a musician and the founder of Groove Modes.