I've used google since it was google.stanford.edu, and it's clear to me the results have suffered. My feeling is that two of the problems are SEO and feedback effects of google's own popularity.
SEO: When you cut through all the BS, the entire goal here is to make a less good match come first. And it works (sorta). Just consider crap sites like Experts Exchange that we've only learned about because they pollute many searches.
Feedback effect: Thanks to google, fewer people collect good links. Why bother when you can google for it? So there's less good information for google to use in ranking links. Bear in mind that when google started, nearly every home page had a long list of links to all the pages that particular user liked and frequently used. I used to have one; I've long since deleted it; my blog has some outgoing links that I like, but relatively few. If I twittered, I'd probably post a lot of outgoing links, but of dubious value; nobody gardens the perfect page of 100 links anymore.
(I think this also partially explains why some results (generally more specialized, so less affected by other factors) feel dated -- legacy links still hanging around from the days when links were used that way.)
Feedback effect: Thanks to google, ten sites tend to be more important than any other sites on any given topic. This results in certain sites becoming increasingly important. Wikipedia is the chief example here. Why is there only one Wikipedia and not a dozen? Chiefly because it's gotten all the google juice. If you want your wiki article on foo to show up in google, you naturally write it on Wikipedia, not Fooipedia. The result here is that all google searches feel increasingly the same -- of course Wikipedia is always in the top ten, or maybe something like Stack Overflow for a technical search.
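That feedback loop can be sketched with a toy PageRank computation (a minimal, illustrative model with made-up pages, not Google's actual ranker): when every small page links only to one hub and nobody curates links among the small pages, the hub absorbs nearly all the rank.

```python
# Toy PageRank via power iteration, sketching the feedback effect:
# page 0 plays the role of the "Wikipedia" hub.

def pagerank(links, n, damping=0.85, iters=50):
    """links: dict mapping page index -> list of pages it links to."""
    rank = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - damping) / n] * n
        for page, outs in links.items():
            if outs:
                share = damping * rank[page] / len(outs)
                for dest in outs:
                    new[dest] += share
        rank = new
    return rank

# Concentrated web: pages 1-4 all link only to the hub (page 0).
concentrated = {0: [1], 1: [0], 2: [0], 3: [0], 4: [0]}
# Curated web: every page links to every other page.
curated = {i: [j for j in range(5) if j != i] for i in range(5)}

print(pagerank(concentrated, 5))  # the hub's score dwarfs the rest
print(pagerank(curated, 5))       # rank spread evenly across pages
```

Run it and the concentrated graph gives the hub most of the total rank while the curated graph keeps everything uniform, which is the "all searches feel the same" effect in miniature.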
----
So, these days, if I don't see something interesting in the top ten, I often click on the link to page 10 (or 20, or 100) of the results. Often more interesting. For example, google for "mashed potatos".
Top 10 results: "Perfect mashed potatoes" (SEO), allrecipes.com (always in the top 10 for any recipe search), foodnetwork.com, Wikipedia, about.com, nytimes, etc. Pictures of mashed potatos. All generic and useless.
Page ten results: Dairy-free mashed potatoes. _Potato_ free mashed potatos! Caramelized Onion Horseradish Red Mashed Potatoes! A poem about eating them. At least marginally more interesting and quirky. What I would have expected out of google circa 1997.
I'm not quite sure what exactly you would want from them. http://www.google.com/search?q=mashed+potatos has a list of recipes for the query. That seems like exactly what you would respond to the question "What do you know about mashed potatos?" If I change it to "mashed potatoes" as suggested, I get the rest of the results you mentioned. Again, this is exactly the kind of stuff you wanted.
Now, if you want something "quirky", why are you searching for a generic term? What kind of "useful" result do you want from a search on mashed potatoes? If you give them a crappy search query, they should give you results that are as generic as possible.
One thing I've found is that if you are looking for something specific, don't search for something generic. If you wanted something "quirky", why didn't you do "mashed potatoes quirky"? Then you get a restaurant that features mashed potatoes heavily in their recipes, a caramelized onion mashed potato recipe, a mashed potatoes festival, several more "interesting" recipes, and a book called "Grinning in His Mashed Potatoes".
It sounds to me like the results have improved, not gotten worse, if you aren't getting a poem about mashed potatoes on the first page of search results for just "mashed potatoes".
I really wish Google had the option to blacklist certain sites from the results, such as EE. Maybe they could even use the data of what people are blacklisting.
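Until Google offers that, part of it can be approximated on your own side by filtering a result list against a personal blocklist. A minimal sketch, with hypothetical URLs and blocklist entries:

```python
from urllib.parse import urlparse

# Personal blocklist (entries here are just examples).
BLACKLIST = {"experts-exchange.com", "spamfarm.example.com"}

def filter_results(results, blacklist=BLACKLIST):
    """Drop any result whose host (or parent domain) is blacklisted."""
    kept = []
    for url in results:
        host = urlparse(url).netloc.lower()
        if not any(host == d or host.endswith("." + d) for d in blacklist):
            kept.append(url)
    return kept

results = [
    "http://www.experts-exchange.com/some-question",
    "http://stackoverflow.com/questions/42",
]
print(filter_results(results))  # only the stackoverflow link survives
```

You can also get part of the way there today with Google's own exclusion operator, e.g. appending `-site:experts-exchange.com` to a query, though that has to be retyped every time.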
A lot of non-technical people lately seem to type whole questions and sentences in the search box. Syntax analysis is hard for them, it seems. But Google now encourages this and levels everything down.
Somewhat perversely, I've specifically tested that syntax on a few occasions, and have had surprisingly good results as compared with a classic search.
So, do you think it's time to start adding top-links pages again and help them out?
I have started tailoring my searches in odd ways to help them out. Ex: adding the year when I want current results. But without useful links it's all GIGO.
I think Google knows about this exact problem. They know that the links people share on Facebook and Twitter have supplanted site-to-site links, and are therefore much more important to search quality. To this end, the agreement to include 'real-time' search data from Twitter is partially a misdirection, since the value of the data about shared links far exceeds the value of someone's 140-character blurb.
A related point: whoever first owns the data from all link aggregators (digg, reddit, mixx, etc) and all URL shorteners (bit.ly, tinyurl, ad nauseam), and weighs those results more heavily is going to have an awesome search engine... albeit better for entertainment than productivity.
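The "weigh aggregator data more heavily" idea can be sketched as a score that blends ordinary query relevance with log-scaled share counts. All the numbers and source names below are invented for illustration:

```python
import math

def blended_score(base_relevance, share_counts, weight=0.5):
    """Mix a query-relevance score with social-share signals.

    share_counts: shares per source, e.g. {"reddit": 120, "bit.ly": 900}.
    log1p damps the counts so one viral source can't swamp everything.
    """
    social = sum(math.log1p(n) for n in share_counts.values())
    return (1 - weight) * base_relevance + weight * social

# Two hypothetical results with equal relevance but different buzz.
quiet = blended_score(3.0, {"reddit": 0, "digg": 1})
buzzy = blended_score(3.0, {"reddit": 500, "digg": 80, "bit.ly": 2000})
print(quiet, buzzy)  # the heavily shared page outranks the quiet one
```

The `weight` knob is exactly the entertainment-vs-productivity trade-off: crank it up and the engine surfaces what's being passed around, crank it down and it falls back to plain relevance.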
whoever first owns the data from all link aggregators (digg, reddit, mixx, etc) and all URL shorteners (bit.ly, tinyurl, ad nauseam), and weighs those results more heavily is going to have an awesome search engine... albeit better for entertainment than productivity.
I will humbly disagree. I think folks who browse the web are different from those who search the web. Search is what gets you the most relevant results, and therefore more opportunity for ad money.
A related point: whoever first owns the data from all link aggregators (digg, reddit, mixx, etc) and all URL shorteners (bit.ly, tinyurl, ad nauseam), and weighs those results more heavily is going to have an awesome search engine... albeit better for entertainment than productivity.
Except I'd never find the obscure stuff that Google helps me find every day. I don't think a lot of links to manuals, mailing lists, etc. show up on any of the sources you mention.
SEO: When you cut through all the BS, the entire goal here is to make a less good match come first.
That is not the entire purpose of SEO. There are good sites out there that don't provide their content in a way that spiders can index, and SEO often solves that. There are certainly plenty of people making bad websites and trying to make them rank; it's Google's job to weed out the useless information.
People still collect links (shameless self promotion: http://internetmindmap.com ), they still have huge blogrolls, there are human powered search engines, a vast amount of directories for every imaginable niche...
Google doesn't need the perfect page of 100 links and I doubt it ever did.
Your mashed potatos example does not make sense. Google gave you generic info for your generic search query. How is that bad?
Now, the fact that certain sites dominate a very wide range of search queries is an interesting point. Personally, I would just add a sidebar or something similar to be occupied by the "staple" sites, such as Wikipedia, about.com, etc.