At the end of July, Google announced that it would no longer identify which Web pages are in the Supplemental Index. Although such pages have reduced visibility in Google's keyword search results, Google felt that the Supplemental label was attracting undue attention. Google also promised greater efforts to reduce the disparity between its regular index and its supplemental index.
Danny Sullivan and many others expressed concern about this move. However, as Danny mentioned, there was still a loophole:
At the moment, if you want to force the labels to show up, doing a search for [site:domain/&] is a tip that came out of WebmasterWorld this week, and that still seems to be working.
(Tip of the Hat to Halfdeck, who mentioned this in a Cre8asite Forums discussion on this topic.)
It now appears that the loophole has been closed. A Google search for site:www.domain.com/& shows exactly the same number of web pages as the recommended Google search for site:www.domain.com/. In other words, each shows the total number of web pages that Google has indexed across both the regular index and the supplemental index.
For the moment, the other trick that Halfdeck mentioned still seems to be working. A search for site:www.domain.com/* still appears to give the number of web pages that are in the regular index alone. Comparing this with the total given by the regular site:www.domain.com/ search indicates what percentage of the domain's indexed pages are in the regular index. For established websites this is often above 75%; for new websites, on the other hand, less than 20% of web pages seem to be in the regular index, if this test is valid.
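To make the arithmetic concrete, here is a minimal sketch in Python of the comparison described above. The two counts would be read off manually from the result totals of the two searches; the function name, the sample numbers, and the printed output are purely illustrative, not part of any Google tool.

```python
def regular_index_share(total_indexed: int, regular_indexed: int) -> float:
    """Estimate what share of a site's indexed pages sit in the regular index.

    total_indexed   -- count reported by the  site:www.domain.com/   search
    regular_indexed -- count reported by the  site:www.domain.com/*  search
    """
    if total_indexed == 0:
        raise ValueError("Google reports no indexed pages for this site")
    return 100.0 * regular_indexed / total_indexed

# Hypothetical example: an established site with 950 of its 1,200
# indexed pages in the regular index.
share = regular_index_share(total_indexed=1200, regular_indexed=950)
print(f"{share:.0f}% of indexed pages are in the regular index")  # -> 79%
```

By the rule of thumb above, a result around 79% would be unremarkable for an established site, while a figure under 20% on a new site would suggest most pages are still supplemental.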
Is this a reasonable estimate? If so, how long will Google leave this peephole open? Time will tell.
Related:
Supplemental Result in Google – Hell or Help
Google Supplemental Label Out, PageRank Next?
That's a really nice find. I don't know if I care much, though, about the supplemental index. I tend to just rely on my logs to determine if my pages are important enough for Google. I mean, honestly, if I am not in the supplemental index but I'm on page 10 of the SERPs, is there really a difference?
Thanks for the comment, webprofessor. It certainly is the final result that counts. However, the percentage of web pages in the regular index gives you a quick alert, relative to similar websites, as to whether or not you have a global problem.
I've been using the site:domain/& and site:domain/* queries with some trepidation, but they appear to be working for me. It may be that, if Google is closing down the reporting, they are doing so on a data-center-by-data-center basis or perhaps through a recrawl of the Web.
Simply doing away with visible indications of their Web Apartheid is not acceptable. Google needs to treat all Supplemental Pages equally with all Main Web Index pages — let them rank competitively for queries, pass link anchor text, and pass PageRank.
Of course, Google is always welcome to stop allowing sites to pass link anchor text. It's a poor measure of relevance that has been way over-abused by the SEO industry and spammers alike.
I agree, Michael, but I don’t think Google sees how its bottom line will be better off if it goes along with our wishes.
Hmmm… For one new site of about 100 pages, I am seeing 35 pages show up with site:http://domain.com. A search for site:http://domain.com/* returns exactly the top eight of those 35. Using site:http://domain.com/& shows 33 pages, in a slightly different order than the other two searches.
I'm not sure whether, as Michael points out, different data centres may produce different results. In my searches, the site:domain.com/& search always produces the same number of web pages as the site:domain.com/ search. I did not look at the ordering of those web pages.