Why the number of indexed pages decreased for a "site:" search


Solution 1

First, Google Webmaster Tools will tell you how many pages are in the index.

Google may have dropped pages from its index for any of several reasons, which it will NOT tell you, but the most probable is duplicate or very similar content.

If you feel that Google is not indexing enough content, check Webmaster Tools under URL parameters, where you can specify how Google treats each parameter in your URLs. This lets you tell Google which parameters lead to different content and which merely produce different arrangements of the same content. If you have not been there in a while, note that it changed less than a month ago and offers many more options than before.

You can also give Google a text or XML sitemap in case your URLs are too hard to crawl because of complex parameters. If you do, Google will tell you how many URLs are in your sitemap and how many of those it indexed (Google does not always index every URL in a sitemap).
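For reference, a minimal XML sitemap following the Sitemaps protocol looks like this (the domain and paths are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page-1</loc>
    <lastmod>2022-09-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/page-2</loc>
  </url>
</urlset>
```

Submit the file in Webmaster Tools so you can compare the submitted URL count against the indexed count.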

The last possibility is an error on your side, such as incorrect canonical tags, a misconfigured robots.txt, or a stray noindex meta tag.
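One quick way to rule out a robots.txt problem is to test a few of your URLs against your rules. A minimal sketch using Python's standard library (the rules, domain, and paths here are placeholder assumptions, not your actual site):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt rules; in practice, load your site's real file with
# parser.set_url("https://example.com/robots.txt") followed by parser.read()
robots_txt = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether Googlebot is allowed to crawl specific paths
for path in ("/products/widget", "/private/draft"):
    allowed = parser.can_fetch("Googlebot", "https://example.com" + path)
    print(path, "allowed" if allowed else "BLOCKED")
```

If pages you expect in the index come back blocked, the drop is self-inflicted rather than a quality decision by Google.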

Solution 2

The site: operator is not reliable. Google probably hasn't de-indexed your data, but has simply chosen not to show it in the SERPs (for a number of reasons that can't be determined given the vagueness of the question). As I mentioned in my comment, though, with such a large site of 'unique' content, the quality may be fairly low.



Author: Parthipan Paramasivam

Updated on September 18, 2022

Comments

  • Parthipan Paramasivam, over 1 year ago

    Recently I created a website with hundreds of thousands of unique pages. A week ago, Google had indexed about 250,000 pages. It has now decreased to 90k URLs (using a site:domain.com query).

    What should I do to have Google index all the pages again? Why did all these pages get de-indexed from Google?