My oldest blog was once titled “The Rants”. I like to rant. Here’s a rant. This is a personal bugbear rant. We all have bugbears; sometimes they are justified and sometimes they’re not. Let me know how I do.
I argue that cloaking is a very specific activity.

Firstly, cloaking means showing search engine spiders different content from the content human visitors see. Secondly, cloaking is a deliberate attempt to trick the search engines. Cloaking is malicious. It is sneaky. There are techniques that use server-side detection to trigger a redirect or perhaps strip a session ID, but detection by itself does not make them cloaking.
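To make the session-stripping example concrete, here is a minimal Python sketch of that sort of server-side detection. Everything in it (the crawler user-agent fragments, the `sid` parameter name, the URLs) is an assumption for illustration, not a description of any particular site’s setup:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

# Assumed user-agent fragments that identify search engine crawlers.
CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")

def is_crawler(user_agent: str) -> bool:
    """Crude server-side detection based on the User-Agent header."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def canonical_url(url: str, user_agent: str, session_param: str = "sid") -> str:
    """Strip the session ID parameter from a URL, but only for crawlers,
    so spiders index one clean URL instead of endless session variants."""
    if not is_crawler(user_agent):
        return url
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != session_param]
    return urlunparse(parts._replace(query=urlencode(query)))
```

The point of the sketch is that the mechanism is neutral: the same user-agent check could serve a cleaner URL to a spider or serve it deceptive content. Intent, not detection, is what makes it cloaking.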
This year at SES London I flapped my arms around when David Naylor suggested to someone in our Site Clinic audience that she try cloaking. In this instance David was not suggesting a black hat technique; he was advising the woman on one way to run multiple sites for different geographic locations with exactly the same content and still not worry about duplicate content. I also remember disagreeing with Ammon Johns during an NMA roundtable over whether there can be such a thing as “ethical cloaking”. I think Ammon was also citing geographic detection as a valid example. Once again I argued that that is geographic detection, not cloaking.
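For readers unfamiliar with what geographic detection looks like in practice, here is a hypothetical Python sketch: each visitor is pointed at the regional mirror for their country while every mirror carries the same content. The country lookup (normally a GeoIP database) is stubbed with a dictionary, and the domains are invented for illustration:

```python
# Assumed regional mirrors, keyed by ISO country code.
REGIONAL_SITES = {
    "GB": "http://example.co.uk",
    "DE": "http://example.de",
}
DEFAULT_SITE = "http://example.com"

def regional_redirect(country_code: str) -> str:
    """Return the regional site for a visitor's detected country,
    falling back to the default site for everyone else."""
    return REGIONAL_SITES.get(country_code, DEFAULT_SITE)
```

Nothing here looks at whether the visitor is a search engine spider; every visitor from the same country, human or bot, is sent to the same place. That is why I call it geographic detection rather than cloaking.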
There is a debate as to whether Website Optimiser tests might be interpreted as hidden text. Could you use it to “test” an <h1> tag which normally says “Andrew Girdwood” against one that says “ARHG!” and simply never stop the test? Well, yes. You could. I wonder whether Google will receive a flood of “I was multivariate testing!” defences after sites are punished for spam.