The Fastest Way to Deindex Pages on Google (According to John Mueller)

— January 13, 2017

We recently came across an interesting conversation on the BigSEO subreddit. A user was trying to deindex a large number of product pages on their site, but three weeks later the pages were still showing up in Google’s search results. Even Google Search Console showed no changes after the deindexing attempt.


Here’s the catch: the user had noindexed pages for the same products on another of their sites, and those were deindexed immediately.


“Any ideas why Google has ignored my noindex tags?” the user asked.


Another user suggested that the OP remove the pages from the sitemap and resubmit it. In response to that suggestion, a familiar face intervened…


How to Efficiently Deindex Pages


None other than John Mueller, Webmaster Trends Analyst at Google, showed up to answer the question. Here’s what he had to say:


“Having them in a sitemap with a lastmod date reflecting the change in robots meta tag values would be better than removing them from the sitemap file.”


Boom. If you want to deindex your pages from Google, you’ve got the answer straight from the mouth of Google.



  1. Change each page’s meta robots tag to noindex
  2. Set the lastmod date to reflect that change
  3. Keep those URLs in your XML sitemap and update it
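
For step 1, that just means adding a robots meta tag to the <head> of every page you want removed. A minimal sketch (the page itself is whatever product page you’re deindexing):

  <!-- Tells search engines not to keep this page in their index -->
  <meta name="robots" content="noindex">
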

Usually the fastest way to remove a page from Google’s index is the URL removal tool in Google Search Console. But if you’re dealing with a larger site with lots of URLs, that tool isn’t an efficient solution. In those cases, John Mueller recommends keeping the URLs you’d like to deindex in your XML sitemap, with the lastmod date reflecting the change in your meta robots tag values.
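
Concretely, that means leaving the noindexed URLs in your sitemap and bumping each one’s lastmod to the date you added the noindex tag, which signals to Google that the page has changed and should be recrawled. Here’s a minimal sketch of a sitemap entry; the URL and date are placeholders:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/products/discontinued-widget</loc>
      <!-- lastmod reflects the date the meta robots tag changed to noindex -->
      <lastmod>2017-01-10</lastmod>
    </url>
  </urlset>

Once Google recrawls the URL and sees the noindex tag, the page should drop out of the index.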


Here’s a screenshot of the thread and Mueller’s response:


Google’s John Mueller weighed in on a Reddit thread on how best to deindex pages.


Clearing Up Robots.txt Confusion


John Mueller also weighed in on using both your robots.txt file and meta robots tags to block pages from being indexed. He indicated that in such cases, Google won’t be able to see the noindex tag, because the page is blocked from crawling through the robots.txt file. He also added that using robots.txt alone won’t remove pages that are already in Google’s index.
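
To make the conflict concrete, consider a setup like the following (the /products/ path is hypothetical): the Disallow rule stops Googlebot from crawling anything under /products/, so it never fetches those pages and never sees the noindex tag in their HTML.

  # robots.txt
  # Blocks crawling of /products/, but does NOT remove already-indexed pages
  User-agent: *
  Disallow: /products/

If you want the noindex to take effect, the pages have to stay crawlable so Google can fetch them and see the tag.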


John Mueller from Google discusses the robots.txt file and deindexing pages.


You can read the whole thread on the BigSEO subreddit.
