Why do SEOers hate options?
Why is it that whenever a search engine offers the SEO community a control choice, the SEO community turns around and bites the search engine?
We've seen this already with nofollow, an optional rel attribute value for links. SEOers hate it. They would rather there was no way to signal "vote" or "no vote" when linking to another site.
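For anyone who hasn't met it, nofollow is a single attribute on an ordinary link (the URLs and anchor text here are placeholders):

```html
<!-- A plain link: counts as a "vote" for the target page -->
<a href="http://example.com/">A site I vouch for</a>

<!-- A nofollow link: asks engines not to pass link credit -->
<a href="http://example.com/" rel="nofollow">A site I don't vouch for</a>
```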
I don't understand why. Surely this gives webmasters and SEOers alike more control and choice?
Actually, that's a lie. I do understand why. One day "nofollow" could happen to them. It's like speed cameras: even motorists who insist they never speed and support road safety can be strongly against speed cameras.
We're seeing it again. Yahoo has introduced a class (it's not a tag) called "robots-nocontent" which instructs Yahoo! Slurp to ignore the content it marks. Who is it for? (There's a markup sketch after the list.)
- You're a finance site that has to put a hefty disclaimer on every page but wants to avoid duplicate content - this option is for you
- You have a multi-lingual site with a brand message in English on every page but don't want to lose the language targeting - this option is for you
- You've a hefty cross-network navigation bar which has no relevance to your actual site - this option is for you
- You take in unmoderated user-generated content - this option is for you
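As promised, here's the markup sketch. The class goes on whatever element wraps the content you want Slurp to skip; the copy and disclaimer text are invented for illustration:

```html
<!-- Main copy: indexed by Yahoo! Slurp as normal -->
<p>Our savings account pays 6.1% AER on balances over £1,000.</p>

<!-- Boilerplate: the robots-nocontent class asks Slurp not to
     use this block for indexing or for the page abstract -->
<div class="robots-nocontent">
  Past performance is not a guide to future returns. Your capital
  is at risk. Terms and conditions apply.
</div>
```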
The response from the 'blogosphere' seems to be "isn't this the job of the search engine to work out?" Fair enough. I don't remember anyone saying that when MSN, Yahoo and then Google introduced "NoODP" as a meta tag.
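For reference, NoODP is a robots meta value that asks an engine not to use your Open Directory Project listing as the page description:

```html
<!-- Ask engines not to use the DMOZ/ODP description as the snippet -->
<meta name="robots" content="noodp">
```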
Recently the search engines added sitemap XML auto-discovery support to the Robots Exclusion Protocol. That blows the standard out of the water. Did anyone complain? No.
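That auto-discovery is a single extra line in robots.txt (the URL is a placeholder):

```
# robots.txt
User-agent: *
Disallow:

# Points any crawler that supports auto-discovery at your XML sitemap
Sitemap: http://www.example.com/sitemap.xml
```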
It seems that whenever the search engines make the art-and-science of SEO more widespread, the tight-knit SEO community objects.
I don't see it like that. I think the more options we have, the more need there is for specialised SEO services to help clients weigh them up.
Go on then, in your own words: why is "robots-nocontent" a bad idea?