I've been in a few discussions lately on the topic of the Google search algorithm - specifically, with people who suspect that it is geared toward promoting the "big" sites and/or making sure that the smaller sites are hard to find.
While Google keeps its ranking algorithm a trade secret (which is understandable, though it feeds the panic and suspicion), the general notion is that it ranks "big" sites first: a site that's already popular with a lot of people - by virtue of their clicking it in the search results or linking to it from other sites - is presumed to be more relevant to the search phrase than one that no one seems to be paying attention to.
The problem is that this makes it difficult for a new site or page, even one more closely related to a given topic, to overcome the "power" that has been built up by existing pages on the same topic.
Hence, a one-paragraph blurb on Wikipedia far outranks an entire site of comprehensive information on the very same topic - not because it's a better source of information, but because it's been around longer, so more people know about it, have clicked it in the search results, and have linked to it from their own pages.
As such, a search engine that relies upon past popularity as a proxy for relevance will, over time, serve to entrench less informative or authoritative sources of information. I'm stuck for an answer to the problem, short of blowing out the buffers to reset the balance of power from time to time.
On the upside, Google's approach has largely overcome the problem of SEO spamming, which was widespread in the era when search engines used the content of a page itself to determine its meaning - companies that wanted to attract attention would lace their pages with popular search terms to mislead users into visiting their sites when they were looking for something else. Not that the practice is entirely dead, but it's much less prevalent than it once was.