Dogpile, the meta search engine, has come out with a new tool that compares the major search engines. It shows that for most keyword queries the three major search engines, Google, Yahoo! and MSN Search, produce different results with little overlap among them. Dogpile then suggests this is an argument for using a meta search engine like itself, to make sure that important references are not missed.
This has started a lively debate on the Cre8asite Forums under the title "Search Engines really are different". Bill Slawski, an Administrator there, has questioned the size of the samples used to support the Dogpile assertions and whether Dogpile can be unbiased, since it has a stake in the outcome of the analysis. The thread then turns into quite a discussion of how the different search engines handle some popular queries. Does it all confirm the Dogpile view or not?
I see all this in a different light. The search engines reveal little of their methodology apart from saying what they don’t like in their Terms of Service. Each of them is its own ‘black box’ with its own magic. It’s very much like detergents. So it has all become a marketing game. It’s all about perceptions of how each search engine performs relative to your own needs. What counts now is the packaging. You can buy the clean lines of a Google search page. Or you can buy the Swiss Army knife package of a Yahoo! or MSN Search portal, with lots of other ways you may find the information you are looking for. You even have Ask.com now offering itself to a whole series of other portals as their own house brand.
So if it’s all marketing, do you accept the Dogpile way of doing things? It’s almost like the Consumer Reports Best Buy approach. Can we get any closer to what is really going on?
The following picture may help to explain what is really happening. You know the Web, with its billions of web pages that you can explore with your browser. Some of them have fine images or Flash animations that are genuinely informative. Search engines see none of that richness. Imagine a parallel universe that contains a binary file version of every web page on the Web. Whenever you create a web page, a corresponding binary file appears in that parallel universe. Search engines do not look at a web page as a human does; they examine this binary file version of it. In effect, search engines are comparing these binary files.
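If you want to see roughly what that binary file amounts to, here is a minimal Python sketch. The URL, and the requests and BeautifulSoup libraries it leans on, are purely my own illustrative choices, not anything the search engines publish about their spiders.

    # Illustrative only: fetch the raw bytes of a page the way a spider might,
    # then boil them down to the plain text a search engine can actually index.
    # Requires the third-party 'requests' and 'beautifulsoup4' packages.
    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/"                  # hypothetical page
    raw = requests.get(url, timeout=10).content   # the 'binary file' the spider sees

    soup = BeautifulSoup(raw, "html.parser")
    for tag in soup(["script", "style"]):         # markup a human never reads
        tag.decompose()

    text = " ".join(soup.get_text().split())
    print(text[:200])   # the fine images and Flash contribute nothing here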
No one except the search engine techies knows exactly how a spider reads that binary file. However, people skilled in search engine optimization make educated guesses and try to ensure that the binary file version of their web page has its text data prominently included. Even the experts aren’t entirely sure how best to do this. The vast majority of web pages are never viewed in this light at all, so their binary files are whatever they happen to be.
Different search engines handle the binary files in different ways, so it is not surprising that they come up with different web pages for the same keyword searches. Interestingly, in the Cre8asite thread Bill Slawski compared searches for CSS. Since such pages are more likely to be written by technical experts, their authors are more likely to know how to prepare search-engine-friendly binary files for their web pages. He found more overlap among the search engines for these searches.
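The kind of comparison Dogpile and Bill Slawski are making is easy to sketch: take the top results from two engines for the same query and count how many URLs they share. The result lists below are invented purely for illustration.

    # Illustrative only: how much do two engines' top results overlap?
    # These URL lists are made up; a real test would use live query results.
    engine_a = ["w3.org/Style/CSS", "example.com/css-tutorial", "example.org/css-tips"]
    engine_b = ["example.net/css-guide", "w3.org/Style/CSS", "example.com/css-tutorial"]

    shared = set(engine_a) & set(engine_b)
    overlap = len(shared) / len(set(engine_a) | set(engine_b))   # Jaccard overlap
    print(f"{len(shared)} shared results, overlap = {overlap:.0%}")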
So if you’re searching a technical area, where the people constructing web pages know how to ensure the binary file reflects a page’s content, the search engines are more likely to home in on the same ‘relevant’ web pages. Such technical areas are only a minuscule fraction of the total Web.
More often, each search engine is making its own guess at the most relevant binary files (and thus the associated web pages) for a given keyword search. Depending on the area you’re dealing with, one search engine’s way of guessing may work better than another’s. If you don’t have the time to think deeply about this, the Dogpile meta search engine may be a more robust way of ensuring that your keyword searches will usually work well for you. It’s like buying the three top detergents and using some of each in your wash: on average you should be better off than if you’d chosen the worst one. The same logic may work for keyword searches.
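The ‘some of each detergent’ idea can be sketched the same way: interleave each engine’s ranked list and drop the duplicates. This is not Dogpile’s actual blending algorithm, just a toy illustration of why mixing lists hedges your bets.

    # Illustrative only: blend ranked result lists, interleaving engines
    # and skipping URLs already seen, roughly as a meta search engine might.
    from itertools import zip_longest

    def blend(*ranked_lists):
        seen, merged = set(), []
        for tier in zip_longest(*ranked_lists):   # all the rank-1s, then the rank-2s, ...
            for url in tier:
                if url is not None and url not in seen:
                    seen.add(url)
                    merged.append(url)
        return merged

    google = ["a.com", "b.com", "c.com"]
    yahoo  = ["b.com", "d.com", "a.com"]
    msn    = ["e.com", "a.com", "f.com"]
    print(blend(google, yahoo, msn))
    # ['a.com', 'b.com', 'e.com', 'd.com', 'c.com', 'f.com']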
Tags: meta search engines, relevancy