Google is reaching out to webmasters through Search Console, directing them to remove unsupported noindex rules from their robots.txt files…
Google Recommends Site Owners Remove Robots.txt Noindex Directives
The request stems from a longstanding practice, or rather policy, that has been somewhat misconstrued. Google announced just weeks ago that it is ending support for the noindex rule in robots.txt, and it is now sending an email that reads:
“Google has identified that your site’s robots.txt file contains the unsupported rule ‘noindex’. This rule was never officially supported by Google and on September 1, 2019 it will stop working. Please see our help center to learn how to block pages from the Google index.”
Meanwhile, Googlebot will continue to obey noindex directives in robots.txt files until September 1st.
Interestingly, as the email states, Google never officially supported the rule. Instead, the search company tolerated the practice as it became increasingly popular among websites.
Website owners have several alternatives: a noindex meta tag placed in the HTML of a page, 404 and 410 HTTP status codes, password protection, Disallow rules in robots.txt, and the Search Console Remove URLs tool.
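For instance, the meta tag alternative mentioned above is a single line placed in a page's `<head>`; this is an illustrative snippet, not code from the article:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Tells search engine crawlers not to include this page in their index -->
  <meta name="robots" content="noindex">
  <title>Private page</title>
</head>
<body>
  <p>This page will be crawled but not indexed.</p>
</body>
</html>
```

For non-HTML resources such as PDFs, the same effect can be achieved with the `X-Robots-Tag: noindex` HTTP response header. Note that for either method to work, the page must remain crawlable: if robots.txt blocks the URL, Googlebot never sees the directive.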