Google is sending emails to site owners via Search Console asking them to remove noindex directives from their robots.txt files.
The email reads:
"Google has recognized that the robots.txt file in your website incorporates the unsupported rule" noindex. "This rule has by no means been formally supported by Google, and on September 1, 2019, it can now not work. Take a look at our assist middle to learn the way to dam the pages of the Google index. "
These notifications arrive just weeks after Google officially ended support for the noindex rule in robots.txt.
For now, Googlebot still obeys the noindex directive and will continue to do so until September 1st. Site owners will then need to use an alternative.
Technically, as the email states, Google was never obligated to support the noindex directive. It is an unofficial rule that Google began honoring once it was widely adopted by site owners.
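For context, the unofficial directive was written directly in robots.txt alongside the standard rules. The sketch below shows the form site owners typically used (the path is a hypothetical example); after September 1, 2019, Googlebot will simply ignore the `Noindex:` line:

```
User-agent: *
# Unofficial rule, no longer supported by Google after September 1, 2019
Noindex: /example-private-page/
```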
The lack of a standardized set of rules for the robots.txt file is a problem in its own right, and one that Google is actively working to resolve.
Until a standard list of rules is established, it is probably best not to rely on unofficial directives.
Here are other options for preventing a page from being indexed:

- A noindex meta tag placed directly in the page's HTML
- HTTP 404 and 410 status codes
- Password protection
- Disallow rules in the robots.txt file
- The Search Console URL Removal Tool
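The first option is the most direct replacement. A minimal example of the noindex meta tag, placed in the head of the page you want excluded:

```
<!DOCTYPE html>
<html>
<head>
  <!-- Tells crawlers not to include this page in search results -->
  <meta name="robots" content="noindex">
  <title>Example page</title>
</head>
<body>...</body>
</html>
```

For non-HTML resources such as PDFs, the equivalent `X-Robots-Tag: noindex` HTTP response header achieves the same result. Note that for either signal to work, the page must not be blocked in robots.txt, since Googlebot has to crawl the page to see the directive.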