Think of the human. Write for humans!

One of our golden rules about SEO is that copy should always read well. When we were a smaller company we would get people from the finance department to proofread copy for us, to avoid any suggestion that the author had been too eager to stuff in keywords. These days each country has a department of Search Copywriters who are capable of writing great copy while also working in an appropriate proportion of keywords.

Google and the other search engines do need to see keywords on a page. They're not psychic. However, one of the biggest crimes that SEO has inflicted on the web is pages which simply churn out a mantra of keywords. They look and read awful. They're written for robots, not people. They drive me mad.

David Cushman and Sean Warwick - both from the publishing industry - are calling SEO spam. Sean thinks writing at a 15% keyword density produces awful copy - and he's not wrong. David hates the way SEO 'tricks' you into clicking through to off-topic content.

Of course, Sean's been given harsh advice. You don't need to write at 15% keyword density! Please. Please don't try to write at 15%.
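To see why 15% is absurd, it helps to look at what the number actually measures. Here's a minimal Python sketch of how keyword density is typically calculated - `keyword_density` is a hypothetical helper for illustration, not part of any SEO tool, and the word-splitting regex is a simplifying assumption:

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Percentage of words in `text` that match `keyword` (single words only)."""
    # Crude tokeniser: lowercase, keep letters and apostrophes.
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    counts = Counter(words)
    return counts[keyword.lower()] / len(words) * 100
```

At 15% density, roughly one word in every seven on the page would have to be the keyword - which is exactly the mantra-of-keywords effect described above.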

David might be being a little harsh himself. Proper SEO (okay, let's use the word 'ethical') is about ensuring the search engine can see what your site is about. In fact, proper SEO includes basic tips like ensuring web page content has a unique URL so communities and forums can link to it (as opposed to an Ajax-style or single Flash URL which is used to display all the content). It tends to be the horrid combination of search spam and poor websites which results in Google searchers clicking themselves into a worthless experience.

Google's increasingly good at analysing copy. Do you want to bet that a website that's full of grammatical errors is a website giving off negative quality signals? Do you want to bet that a webpage with an unusually high density of a particular word is a webpage that's giving off negative quality signals? Pretty safe bets, huh?

Let's have a look at what Google's Adam Lasnik has offered as advice on this topic:
Our algorithms want to see something that's a happy medium cleanly between:

Extreme A: Not listing relevant terms at all on the page.

Extreme B: Focusing on increasing keyword density to the point that your English/Writing teacher would thwap you with a wooden ruler. Hard. Repeatedly.

And I'll let you in on a little algo secret: There is no single magic number. People who say "The guaranteed optimal keyword density is [x]%" would ideally meet the same fate from an angry English teacher. Or Googler or Webmaster.

And lastly, let me respectfully (and pleadingly) reiterate one key point: The fact that you *can* find sites that rank well for a particular keyword engaging in "keyword stuffing" is NOT evidence that such keyword stuffing is an effective SEO tool. I can also show you many sites that use the letter "Q" exactly three times that also rank well. And no, this is not an indication of a secret "jump the 'q' rule."
