Following a post from Matt Cutts there has been quite a bit of industry noise along the lines of "Google is cracking down on search results in their index".
Is it as clear cut as that? Sites like Kelkoo and Dealtime perform well in European organic search, yet their pages are, essentially, search results. Does Google wish Kelkoo to remove their category pages from Googlebot’s gaze?
It’s also not uncommon for content management systems (CMSs) to work through hard-coded search results; FAST, Endeca and even some Venda systems can do this. This Internet Retailer article describes the success Wal-Mart has had with Endeca (a product I certainly recommend), but if you pause to examine Wal-Mart’s learning toys page then you’re pausing to look at a set of search results which match the learning toys matrix in Wal-Mart’s database.
These are not the sort of search results that (I imagine) Google want to exclude from their index.
Let’s look for a rule of thumb. I think it’s more likely that the search results Google wishes to exclude are those generated by forms: users enter some free text and are returned a set of matching results. Pages like this form part of “the invisible web” as, safe behind their web forms, they can often be invisible to search engines. If your web form creates a unique URL for each search (?search=key+words), then you can include those URLs in your Sitemap XML.
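As a sketch of that last point, a form-generated URL can be listed in a standard Sitemap file so crawlers can reach it without filling in the form. The domain and the “search” parameter below are illustrative only:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- A form-generated search URL, surfaced to crawlers via the sitemap.
       The domain and query parameter here are hypothetical examples. -->
  <url>
    <loc>http://www.example.com/results?search=learning+toys</loc>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```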
I can see why Google would want to exclude search results from their index. Let’s not drive searchers in circles. I just think this is a greyer area than many people appreciate.
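And for site owners who agree with Google on this, the conventional way to keep form-generated search pages out of the index is a robots.txt rule. A minimal sketch, assuming your search script lives under a /results path (the path is hypothetical; substitute your own):

```
User-agent: *
# Keep form-generated search result pages out of crawlers' indexes.
# The /results path is illustrative only.
Disallow: /results
```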