Friday, December 30, 2005

2006 Searches

At this time of year everyone is discussing what the next big search stories and innovations might be. It's great fun reading all the speculation. Of course, some of us have a much better idea than others. The search engines themselves are offering little peeks here and there (or not so little in some cases).

It took a visit from Google for me to crawl through the snow, travel on two buses and make it to work today. As is the case with everyone else in such a privileged position, there's nothing more that can legally be said.

I think John Battelle's 2006 predictions have been especially good. John's certainly a man in the know, too.

Thursday, December 29, 2005

Local Live

If Microsoft can expand Local Live's Bird's Eye View to the same coverage as Google Local, or greater, then they'll have a better product.

It's a big if, though. A big if on the day that Galileo launches.

Tuesday, December 13, 2005

Froogle becomes Google Base

Pages on Froogle UK talk about Google Base. The two terms are used synonymously in various help files.

This is not unexpected. Google Base hints heavily at the tie-in with Froogle. It makes sense to use only one feed to supply Google with product information rather than two. On the other hand, Google has invested a little into the Froogle brand and I suspect we'll not see the end of it for a while. The trend is away from sub-brands, though. We lost Urchin to the far less catchy "Google Analytics" (though most people call it Urchin anyway).

I had originally titled this post "Google's left hand versus Google's right hand" in reference to the conflicts in the help files and also because of my geeky amusement at seeing Gmail (sorry, I'm in the UK and should therefore say Google Mail) spam-block a password reset from Orkut. Google should know not to block Google.

Monday, November 28, 2005

WebmasterWorld Grumpy Bear

I'm being a grumpy bear about the WebmasterWorld anti-robots/anti-search situation.

WebmasterWorld is a large and popular forum where webmasters can discuss site issues. Important sections on the site are the SEO and search engine threads.

The forum has recently banned search engines from indexing it and, as a result, it has been dumped from Google and the others.

Why would a forum which needs its web traffic do this? We're told that they had to. I don't like that one bit - it sends an anti-search and anti-SEO message.

WebmasterWorld is highly respected. I would say many web-savvy clients check out WebmasterWorld (and, of course, many small SEO firms camp out there). I personally believe the quality of the forum has nosedived: it's now full of newbies who sound like a broken record and ask the same questions again and again, and it's peppered with veterans who no longer care to respond to the broken record.

This isn't the search engines' problem, though. We're told that WebmasterWorld is plagued by "bad bots". These are automated user agents (like search engine spiders) which crawl the site and take the content. There is so much of this activity that the forum's web servers struggle to keep up.

The webmaster's response to these bad bots at WebmasterWorld has been to block all bots via robots.txt, the file at the heart of the "robots exclusion protocol".

This is foolish. Bad bots do not obey this protocol. The change will only affect good bots.
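For reference, a blanket block like the one in question is just a couple of lines in robots.txt - and, since the robots exclusion protocol is entirely voluntary, only well-behaved crawlers will honour them:

```
# Polite crawlers fetch this file and obey it; bad bots simply ignore it.
User-agent: *
Disallow: /
```

Googlebot, Slurp and the rest will drop the site; a scraper never even requests the file.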

At WebmasterWorld they've made other attempts to block the bad bots - they've changed the site so that you must now log in before you can view any real data. That's the draconian tactic to take if everything else fails. The robots.txt change is just stupid.

Normally, I couldn't care less. WebmasterWorld can shoot themselves in the foot if they wish... except the site has the sort of SEO profile that puts it in the limelight somewhat. I believe the site's actions give out the wrong message.

It is possible to deal with bad bots without blocking your site from search engines. It is. Furthermore, blocking your site from search engines does not stop bad bots (though it does make the site harder to find!)

We're told that WebmasterWorld has tried a whole host of complicated and thorough defences and that they all failed. I don't get it. If the forum had those resources to hand then they certainly have the resources to redesign the site to be less of a bad-bot target. For example, at times of heavy load the site could ask for user input, a captcha or a question. I came up with that crazy theory in about two seconds of thought.
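To make that crazy theory concrete, here's a minimal sketch (my own toy code, nothing WebmasterWorld actually uses) of the sort of per-visitor throttle that could trigger a captcha or question only when a client hammers the site:

```python
import time
from collections import defaultdict, deque

class ChallengeGate:
    """Toy sliding-window hit counter: challenge any client that requests
    pages faster than a human plausibly could."""

    def __init__(self, max_hits=60, window_seconds=60.0):
        self.max_hits = max_hits
        self.window = window_seconds
        self.hits = defaultdict(deque)  # client id -> recent request times

    def should_challenge(self, client, now=None):
        now = time.time() if now is None else now
        q = self.hits[client]
        q.append(now)
        # drop timestamps that have fallen out of the window
        while q and q[0] <= now - self.window:
            q.popleft()
        return len(q) > self.max_hits

# A client making four requests in three seconds against a
# three-per-minute limit gets challenged; a quieter client never does.
gate = ChallengeGate(max_hits=3, window_seconds=60.0)
```

A real deployment would key on IP address plus user agent and serve the captcha from the web server layer, but the shape of the idea is the same.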

Newspapers are hugely scraped by bad bots and they deal with the issues WebmasterWorld has caved in on.

WebmasterWorld have cited impressive figures to support their scraping problem - but I've never found a single scraped entry from WebmasterWorld. Do thousands of people really download the site for their own personal desktop edition? An edition which would require a desktop search (rather than Google) to search and which would need to be kept up to date? Meh.

We're told this is a test. I predict we'll see that WebmasterWorld no longer has the pull to attract the SEO experts (especially after this) so I don't think it'll become an exclusive oasis. I suspect we'll see it back in the search engines when they retract their robots.txt change. I imagine there will be complaints that the robots.txt rules should be changed (all a red herring since bad bots ignore it).

Saturday, November 26, 2005

Advertise on this Site

Last night I mucked around with the new "Advertise on this Site" option on AdSense. I did so while wearing the webmaster hat for my main hobby site. The "Advertise on this Site" link doesn't ever really appear there since I've one publisher who already cleverly targets image banners there for me.

As already noted by the AdSense community, the "Advertise on this Site" option is horribly bare-bones. We're encouraged not to set up separate AdSense accounts for every site but to keep them all together under one (this blog, for example, is part of the same account).

The "this Site" concept is broken. Most of us have a number of sites. If you click on the "Advertise on this Site" option on Search Commands then you get the page I designed for GameWyrd. It's odd Google let the feature go live when it's so badly targeted.

I was frustrated by the lack of space too. I wanted to explain how GameWyrd uses AdSense on my page. I prefer to sell cheap banners to community publishers as that's what I want to appear on the site. Google AdSense fills any gaps I have in the banner rotation. There's simply not enough character space to say this.

I'm really far from impressed with the feature so far.

Friday, November 25, 2005

Custom error pages, 404 header response and Google sitemap XML

A custom 404 page is important. You don't want to lose visitors off your site, and a user-friendly error page, one which quickly re-empowers the user, is key here.

It's also key to ensure that your "page not found" message actually returns the 404 header response. At the end of 2004 in Vegas, Yahoo banged on and on about this in a presentation to SEOs and webmasters.

The 404 response is also key if you wish to make use of the reporting capabilities of Google's sitemap XML project.

The issue is that if you use Apache and point an ErrorDocument command at a full URL then you'll wind up with a 302 response - which might actually be correct, as the server is redirecting the user agent to the custom 404 page.

An .htaccess file might look like:

ErrorDocument 404 http://www.example.com/404e.php

(example.com standing in for your own domain, with 404e.php as the custom error page). The killer catch is that, set up this way, there's no tweak to Apache or PHP that gets requests to non-pages to return the 404 header directly - the first response is always that 302.

The good news is that there is a compromise which Google accepts - and this compromise is good enough to get your sitemap XML verification file accepted.

With PHP you can have your custom error page issue a 404 header before sending any HTML. That's fairly easy: put

<?php header("HTTP/1.0 404 Page not found"); ?>

at the very start of the page, before any other output, in a chunk of PHP.

If you examine the header of the custom error page (404e.php in my example) then that does correctly return the 404 header.

Google is following the 302 to the custom error page and then checking the header off that error page. This is really what you'd expect Google to do as it's the only way to deal with two or more redirects in a chain.
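That checking behaviour - follow the 302 to the error page, then read the status off whatever finally comes back - can be sketched as a hypothetical helper (my own illustrative code, not Google's):

```python
def effective_status(responses):
    """responses: ordered HTTP status codes seen while fetching a URL,
    e.g. [302, 404] for a redirect to an error page that sends the
    404 header itself. Returns the status of the final destination."""
    for status in responses:
        if status not in (301, 302):
            return status  # first non-redirect response is the real answer
    return responses[-1]  # nothing but redirects

# A missing page that 302s to 404e.php, which sends the 404 header:
effective_status([302, 404])  # -> 404: good enough for sitemap verification
# A "soft 404" that redirects to a page returning 200:
effective_status([302, 200])  # -> 200: the engine sees a live page
```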

The good news is that wikis and sites optimised to have SEO friendly URL structures can take part in the rather handy sitemap XML.
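For anyone yet to try the project, a sitemap file itself is tiny. A minimal example looks something like this (example.com is a placeholder, and the schema reference is the 0.84 one Google documents at the time of writing):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-11-25</lastmod>
  </url>
</urlset>
```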

Now if Google would just set up the vanity URL for the lazybones among us - I'd be happy.

Thursday, November 24, 2005


It's all Google, isn't it?

Google's just about noticed this new domain. The home page URL and the main blog index URL are in the index but the pages themselves have not been spidered.

Google Analytics
I was lucky enough to have a play with Google Analytics before it was released to the public. I thought at the time that they had under-estimated how huge the impact would be. Google have just killed the cream in a previously cash-rich industry. Who will pay for unsupported web analysis now?

Google Analytics' weakness is that it's entirely off the shelf and there's no support at all. If you're the type of webmaster or marketing manager who needs to have tracking set up "just so" then, even though it's free, Google Analytics may not be for you.

Google Analytics doesn't offer any bid management either. The system can take a peek at your AdWords costs and help calculate ROI but that's it.

Although the likes of WebTrends and Hitbox / WebSideStory are in big trouble, the 'best of breed' Coremetrics are fine, and bidding gurus like HitDynamics will be safe if they can push their product on.

Google Base
I like the idea of Google Base and I see its potential. The carrot being waved at us is that what goes into Google Base might end up in Froogle or Google's main index. That's a pretty cheap and transparent carrot.

What Google Base lacks is the sense of community. I remember when Ciao and Dooyoo added the community features for the first time to their sites; it all took off then. I don't mean the advanced community features; just the ability to comment on other people's reviews.

I suspect we'll see more updates for Google Base. The URL is currently a blank page (rather than an error) so I suspect we'll see something there too. The original guidelines in Google Base insisted that any products added to the database must be for sale in the States and priced in dollars. That's just insane. That's anti-Google Base, in fact. We have Froogle US for that.

Click to Call
I think this is the most exciting PPC twist in a while (including the new contextual bidding rules). I pity those SEO agencies who are contracted to long-term deals with the minnows of pay-per-call technology, as they'll be fairly hamstrung now.

Google takes on the cost of the call. This could be interesting. This is also why, I guess, Google's been so interested in buying up dark fibres.

Monday, November 14, 2005

Jagger Summary

I've not seen anyone provide much of a summary of the Jagger changes. I'll cut all the interesting bits out of a summary I produced for a client and offer up bare bullet points here.


  • Anti-spam improvements: Google’s sought to reduce the number of spammy sites in search results by formulating more thorough and more accurate automated means of spotting hidden text (text hidden in layers was particularly targeted). Google also improved its ability to spot doorway pages (pages full of keywords which serve no other purpose than to link to another page, often on a different domain).
  • PageRank update: The PageRank metre on the Google toolbar was updated.
  • Links update: The number of backlinks shown by the “link:” search command was updated.
  • Changes to Chinese and European search results: Google tweaked its geographical filtering. Most of the sites we work with and which are hosted in Europe saw fairly substantial gains on Google’s .com results. This suggests that the weight against sites hosted outside the US was reduced. Google now has more faith in other algorithmic means to determine a web site’s target audience.
  • In this period we saw high-traffic sites which have been online for four or more years improve their search results over younger or lower-traffic sites.


  • Ranking algorithm change – deep links increase in importance: Links from external web sites to pages beyond your home page are now more important. In this time our new monitor stations gained PageRank 1 or 2 on their home pages, and those we test with deep links earned PageRank 4 or 5 on their popular deep-linked pages.
  • Ranking algorithm change – keywords in domain names: Having a keyword rich domain name became more beneficial as the weight against keywords in domains was reduced.
  • Ranking algorithm change – hyphens in domain names: The weight against multiple hyphens in domain names increased.
  • Ranking algorithm change – measured traffic: Google has an idea of the traffic a web site has by monitoring click throughs from Google’s own search results and through users with the Google toolbar and other Google software. Pages (not sites) with particularly low internal traffic (from users already on the site) dropped in rankings in favour of pages which users visit and find more often.
  • The Jagger2 update included the results of the first sweep of spam reports which followed Jagger1.


  • Index change – Canonical corrections: Google improved its understanding of which is the most appropriate URL for any given web page and of when two or more URLs may actually refer to a single site (for example, the www and non-www forms of the same domain).
  • Ranking algorithm change – Freshness: Google changed the importance of a page being either fresh or stale. Certain keyword searches suit fresh pages more than stale pages (based on user behaviour) while other searches favour stale (not recently changed and old) pages over fresher (recently changed or added) pages.

Tuesday, November 08, 2005

Yahoo Vs Google Vs Your Tastes

Let there be no doubt that Microsoft is not the only company determined to take on Google; Yahoo certainly is too. This plaque for the Yahoo Mail team is controversial not only because it makes quite explicit who Yahoo has in their sights and that they're very serious about winning, but also because it draws a parallel between Google and the Nazis. I don't think this was intentional. In fact, as a Brit I'm mildly surprised that the Americans used a British success as an inspirational model. I do think the photograph of the plaque shows just how "dangerous" search engines can be. Information takes on a very different meaning when it leaves a private space, loses its tone of voice and becomes public domain.

A lot of the debate the plaque has kicked up centres around the actual mail offerings from the two providers. Which is better? Right now I use Gmail because I can procmail most of my mail to it and have Google spam-filter for me before popping it back into Outlook. Bliss. Even with POPFile (and a well-trained orange octopus at that) my mail sorting took ages. I've a paid-for Hotmail account too. Mind you, I was among the first (thousand) onto Hotmail and will keep the account (despite numerous spam backlashes). By many accounts the new Yahoo mail will be a strong contender too - but how many email addresses do I need?

(And there's a Google tracking code in my link to POPFile because I was lazy: I searched for popfile and copied the link location straight from a right-click. You just have to assume that Google filters unexpected referrer information out of its analysis.)

One of the wins I believe the search engines can gain from offering great (therefore popular) mail systems is that of language and response analysis. It might be evil for Google to learn about you from the contents of your email, and I suspect they'll not be that direct. However, what I believe Google will do is learn which email messages you mark as spam. That's a good way of finding rogue IP addresses or spam-advertised URLs. Google must measure the clickthrough and keyword matches for the AdWords which line the side of Gmail's web interface. Whereas Google may not read my email to learn that I spend my free time roleplaying, Google is sure to notice that my Gmail account generates clicks for roleplaying-related keywords.

I welcome personalisation. I really don't mind if Google's machines (Gmachines™) scan my email or watch my surfing habits. I don't do anything with the RPG programming language, for example, nor do I have an interest in Rocket Propelled Grenades. I want my RPG searches to turn up Role-Playing Games.

I suspect too many SEOrs are avoiding personalisation issues because it could be a pain for them. I believe personalisation is another offering for a good SEM firm to make. I do think the larger search marketing firms are at an advantage here. We can do the proper demographic research (and we do here). Crude reporting tools like WebPosition become pointless (or, at least, much less effective). I would expect more conversation on the forums and in the community about it, but I wonder if the "sandbox" of 2006 will be the one in which many SEOrs bury their heads and hide from personalisation. It's easier to talk about funky new email systems and PageRank updates.

Thursday, November 03, 2005

Google Trivia #439

I have an SEO blog, ergo I must post Google trivia. It's the law.

Google has expanded the range of options for the customised homepage (a carrot by which to lure people into Google Accounts and personalised searches). Finally those of us outside the US have a little more to play with.

I added Edinburgh's weather. Yeah. I must be a masochist. What's the difference between Edinburgh's weather in the UK and Edinburgh's weather in the US? The units change. We also seem to get an extra day of forecast in the UK.

There are some slight oddnesses too. The UK uses Celsius for temperature, true, but we're a mixed-up lot and talk about miles per hour, not kilometres per hour. Google gives wind speed in km/h. You have some degree of customisation in which Google domain you pick as your home page, but minute customisation isn't there. Yet.

Tuesday, November 01, 2005

Googling the Future

It was during a Samhain party that I bumped into a world expert on machine translation and machine learning. It'll come as little surprise to know that he's off Googlewards.

I don't feel I should blog all the insights I gleaned from this teacher-of-PhDs but, as we discussed at the time, information will be key to everything. Information is currency. I was cautioned against doing all those five-minute surveys for 50p and the like, as all this information can be, will be, and may already be being carefully analysed - and, if things go wrong, could be used against me.

Analysing lots of information is something that computers do well already, and Google especially. Google has access to such a wealth of information it could perform a trend analysis on virtually anything. People were worrying about AI long before we realised that so much information could be processed and weighed at once.

This particular soon-to-be-Googler had another challenge and another point of interest. He was interested in being able to pick up a seismically important piece of news at the first hint of its happening. By the time a piece of news is in the headlines it's old news. If Google News has stories that Tesco is in trouble (fat chance) then it's too late to sell your shares. The key is being able to spot the information the very first time it appears on the Web (or, say, in Gmail or Google Talk) and become aware of the likely consequences. A plunge in the stock markets is one such example (combine world-beating stock exchange technology with Google's $6bn war chest) but the list goes on all the way up to the political rumour which sparks a war. My "Google contact" (not yet; must apply more beer) has done machine translation work for the military before. Translating from Chinese or Arabic pays well, but the real treasure is being able to translate on the fly and then sound the alarm when one call mentions "we'll cut the grass tonight" or some other obscure code phrase which the computer calculates is likely to be hugely significant.

Monday, October 31, 2005


I've been on SEO forums and newsgroups since the day dot. I remember the glory days and, although I find most forums to be clique-filled speculation pits of ego-boosting trash, I don't regret the time I spent on them in the early days.

I am glad that I've kept, more or less, undercover. I give up a forum alias if it becomes too widely known and move on. I just like to keep my head down - which is why it's taken so long for me to start this blog, why I'm not promoting it and why it's covered with disclaimers and caveats.

I had to abandon one of my aliases at searchenginewatch's forums (which are, I think, probably the best forums today) as it picked up a whole whack of reputation. I'd been posting with this alias for months, generally seeking out those posts with no or only a few replies and providing as helpful an answer as I could. You don't win reputation for anything like that. I made the mistake of linking to an updated page within Google's guidelines and that was enough to spark a round of votes and "keep up the good work" comments (which grates against my arrogance, as they sound patronising).

My current alias is about as easy to guess as could be, so I imagine I'll abandon it in a little while - or perhaps put it out to stud for return visits when I want to make a post in a semi-anonymous way.

On the topic of undercover forum use we have Matt Cutts and GoogleGuy. For the longest time the widespread belief was that Matt Cutts was GoogleGuy. This is still likely to be the case. However, Cutts has begun to cite GoogleGuy posts in "almost" the third person over on his blog. He'll say something like, "Over at forum X, GoogleGuy has posted to say...". Of course, this does not rule out that Matt Cutts is GoogleGuy. He could be referring to the 'forum alias GoogleGuy has posted' and still be grammatically correct. Is it 'evil' to lead us to believe something by implication? A little. Whereas Matt might have been safer to leave well alone, I can see that he might also want to draw as much attention to GoogleGuy posts as possible in those situations where he wants people to read them, and his blog is a good vehicle for that.

Saturday, October 29, 2005


Arhg! It's articles like this one at WebProNews which really annoy me. In some ways it's better than many WebProNews articles because it's not full of grammatical and spelling errors (which are fine in a blog but not in a professional publication). That said, this is still a dangerous piece of twaddle; rumour and speculation are presented as absolute truth and fact.

In this article the author states that Google punishes you if you buy text link links. Your competitors can go out and buy text links "on your behalf" to get you punished.

I wonder how he squares that against Matt Cutts's own explanation of Google's views and actions. It's the link-selling site which is affected once Google cracks down: Google removes that site's ability to pass PageRank on. Sure, if you had text links from a site which suddenly lost its ability to vote for you then you will take a dip in the SERPs. That's not quite the same thing as your own site being punished.

That said, there is a time when a competitor could cause you trouble by buying you links: when your site is brand new. That is likely to cause your site to stand out (it'll be an outlier) and your site is much more likely to suffer what the SEO industry has coined the sandbox effect.

Arhg! Do you know what currently comes top for that sandbox effect Google search? You guessed it - some useless speculation about what the Sandbox Effect is on WebProNews.

I have a small confession to make though. The other weekend I wrote an article for WebProNews about Search Commands. Nothing dramatic. The old joke about the Med, the broken Convert command, etc... it's not appeared. Guess I wasn't sexy or speculative enough to make it past the harsh and very strict WebProNews editorial rules!

Friday, October 28, 2005

Yahoo! Search blog: Video Searching: Now Easier Than Ever!

Yahoo does seem to have the lead in video search. Yahoo's video search makes it easy to have your QuickTime and media files included and it covers all sorts of movie media on the web. Google's Video search, on the other hand, seems only really to bother with TV media and capturing it. I suspect Google could extend the sitemap XML to include specialities like video or audio. The tweaks to Yahoo Video are responsible for a big announcement over at the Yahoo! Search blog but they are just tweaks. The API news is bigger and better.

Just wait a second. The other, earlier, entry into the Yahoo! Search blog is, I think, much much better. They've finally added a "Save to My Web" button and even gone a step further to let users search their tag clouds.

I think this really will help to promote Yahoo's Social Search activities. It was the orange RSS and "Add to MyWeb" buttons spreading throughout the blogosphere which really pushed RSS from fringe to mainstream. I suspect the same will happen here again, even though the My Web button is quite bulky. You'd have to be a die-hard user to put the tag cloud on your web site, but only last night I was toying with parsing the RSS feeds from my Social Search at Yahoo as a form of "recommended links" page for ARHG net. I may just slap on the tag cloud instead.

Thursday, October 27, 2005

Google Travel

Yahoo is all over travel; they have the new Trip Planner in Yahoo! Travel and they bought Farechase. Since Yahoo! announced Trip Planner today the naughty among us might have expected Google to do something to divert some attention back their way.

It so happens eagle eyed Google users caught the search engine testing a new travel GUI. I've stolen this image from Search Engine Lowdown.

So what's going on here? Google's spotted a trend in the keyword search - that someone is considering going from A (Atlanta) to B (Madrid) - and responded by inserting a specialised search box above the SERPs. We've the choice of Expedia, Hotwire and Orbitz. This is controversial - imagine you're a big travel company and you're not there. This is also very American; over here we don't have Hotwire or Orbitz, and the likes of Lastminute and Thomson aren't represented at all. The lack of Thomson is the more significant, as they're a huge travel company and do not operate through the likes of Expedia. Another tricky example would be something like easyGroup, which has a number of companies but not in one site; there's easyJet, easyCar and easyCruise to name but a few.

Google has a mixed track record here. Let's leap in and talk about toolbar three, as it introduced the auto-mapping and auto-ISBN features.

Auto-mapping allowed users to press a button on the toolbar and get links to maps automatically added, in suitable places, to the web page they're currently on. Google had Google Maps at the time and has Google Local (the two merged), so they could have exclusively used Google's own maps, but as it stands you can pick a different provider if you want. Sure, the selection is not huge, but despite Google having a commercial incentive to push their own maps there is some wriggle room here.

The auto-linking of ISBN numbers is not so good. Press the button on the toolbar and you'll have any ISBN numbers on the page linked to Amazon. It's a monopoly despite other online bookshops being available; certainly here in the UK, Tesco Books have as wide and often cheaper a selection.

There might be a circular situation too. If Google continues to improve its results through personalisation and, at the same time, promotes the current market leaders, then Google's users are going to show a "personal bias" towards these market leaders and so Google's personalisation will promote them further. If, for example, this travel GUI links to British Airways Holidays for UK users (if they make the distinction) then a lot of Google users are going to search and then click through to BA Holidays. This could bias their search results, through personalisation, in favour of BA Holidays.

Wednesday, October 26, 2005


Is a Google domain? It shows a copy of Google currently.

Big deal. Lots of sites soft redirect to Google or just clone a few pages.

ping and ping - most of the time you'll get

Pinging [] with 32 bytes of data:
Reply from bytes=32 time=122ms TTL=241
Pinging [] with 32 bytes of data:
Reply from bytes=32 time=142ms TTL=241

Even so, that could just be an Akamai thing. However, a tracert does firmly put the IP address in Google's hands.

Google Base

There is no such thing as Google Base. Yet.

We do have a rather compelling screen shot and a Google Blog post about the latest rumours.

I think it is likely that Google Base (and the name could change) is real. This is something Google would like to do. Google wants to index the world's information, and a large community database is one good way to do it.

There are pros and cons with a community database. As Wikipedia knows, a successful community database expands rapidly with a wealth of information, but there is an issue as to whether the data is accurate. A wiki is good here, as anyone can correct wrong information; a database with a single point of entry for each field is not likely to have that advantage. Google already has Google Answers, where experts are paid to answer questions. That system has the benefit that the "good enough for the customer" answer is highlighted as accepted, so other viewers can see the data had some worth.

The screenshot of Google Base suggests you could add a "Database of protein structures" (this is a typically Googlesque example) and it certainly leaves me wondering how I can tell an accurate protein structure from one made up. There's the same problem on the main Google index, some could argue, but on the Web no page is in isolation. A page claiming to discuss protein structures which is linked to by trusted authorities is more likely to be correct than a mess of a page which no one links to (that's how PageRank works, after all).
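That last observation - links from trusted pages confer trust - is PageRank in miniature. A toy version of the calculation (my own illustrative code; Google's real system is vastly larger) fits in a few lines:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    nodes = list(links)
    n = len(nodes)
    rank = {p: 1.0 / n for p in nodes}
    for _ in range(iterations):
        # every page keeps a small baseline share of rank
        new = {p: (1 - damping) / n for p in nodes}
        for p, outs in links.items():
            if not outs:
                # a dangling page shares its rank with everyone
                for q in nodes:
                    new[q] += damping * rank[p] / n
            else:
                # a page's rank is split evenly among its outbound links
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# On a tiny three-page web where a and b both link to c,
# the much-linked-to page c ends up with the highest score.
web = {"a": ["c"], "b": ["c"], "c": ["a"]}
scores = pagerank(web)
```

Run it and c outranks a, which outranks the unlinked-to b - exactly the "linked to by trusted authorities" effect described above.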

Classic. Here we find a blog post which goes from discussing whether there might be - could be - will be a Google Base to concluding by discussing the pros and cons. We're still to see whether there will be a Google Base and we're certainly still to see how it works.


Having said that I doubted whether I had time to update this blog often enough - I seem to be writing another entry.

I've sorted out the atom.xml feed. It's being published to ARHG Net correctly now. Phew.

Atom is an interesting one. It's there because RSS is too 'owned'. Atom's certainly more powerful but it's also a faff. Even when Google seemed so reluctant to go down the RSS route, RSS feeds were clearly the syndication option worth recommending. Yahoo had embraced RSS and most of the traffic-driving syndication software had too. Furthermore, RSS is easier for affiliates to get their heads around and therefore easier for them to hook into and use. Now Google seems to have some more time for RSS; you can add it to your version of the Google home page and you can get RSS content out of Google via Google News. The .rss file extension is still shown as unrecognised, though, and that's just Google being slow on purpose.
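Part of why affiliates find RSS so approachable is just how little there is to it. A minimal RSS 2.0 feed (the URLs here are placeholders) is only a handful of elements:

```xml
<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example feed</title>
    <link>http://www.example.com/</link>
    <description>Search engine news and views</description>
    <item>
      <title>An entry</title>
      <link>http://www.example.com/entry</link>
      <description>A short summary of the entry</description>
    </item>
  </channel>
</rss>
```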

Although RSS is ahead of Atom in popularity, this position is not sacrosanct. If RSS's 'owners' do try and throw their weight around then I suspect Atom would simply scoop defectors up by the bucketload. I remember how many webmasters dumped their .gifs in favour of .jpg when Unisys started enforcing the patent on GIF's compression.

Tuesday, October 25, 2005

There we have it

I thought I'd give UKShells a go, though I am yet to muck around and see whether I have command-line access to my web areas (for mysqldump commands or wiki installs, say). I actually tried to do this last night but it proved impossible to register new domains at sign-up because of a bug in their system. I rang sales today and was impressed: no hard sell, just friendly and fast action. I even picked up a token discount for my bug-spotting.

So, we have with no content yet and for the blog. I know. How original. ARHG may be my initials but the temptation to type ARGH is still there.

Hosting companies are a strange breed. They're an example of one of those internet companies which require a mass of customers to turn a tidy profit but which begin to haemorrhage profit if they get bogged down in support. I have accounts (my first) at UK Linux and at Fasthosts. I asked both support teams the same question at the same time; Fasthosts replied overnight and I'm still waiting on the other. Fasthosts support do get knocked awfully, but the company does look after an incredible number of web sites. One day I may even have an account at RackSpace UK, and I suspect the site that ends up there is most likely to be GameWyrd, although I'll be pleased, in a way, if it turns out to be one of the new ones.

A start

You have to make a start somewhere. For me this start was October the 25th, 2005. It was a day when I dashed back and forth between Hillside and Portobello as I attempted to move, piece by piece, the contents of my flat.

But wait. No. This is not that sort of blog. I have one of those already, it lives on LiveJournal where I think blogs like that belong.

This blog should be a different animal. I work in search engine optimisation, a form of internet marketing - search engine marketing, if you would like (and some people do prefer the term). Each day I try to find enough time to check what the popular search blogs are saying, looking for official news from the search engines who maintain blogs themselves or for what the industry gurus think. I suppose we're lucky that we have an industry where the gurus are so loud on their blogs; in this case blogging and opinion go hand in hand with the industry landscape. We still influence each other.

I am lucky. I work for one of the best SEO firms in the world. No doubt of that. At times I find it ironic that I'm so busy I rarely get to read what other blogs say. I might be being a bit daft in thinking that I have the time to write a blog of my own. I'll try. I want to. From time to time I just feel that an industry or search engine development needs a comment. Whereas this blog is not the place I spleen daily foo, it will be the personal-professional vent for search engine and search engine optimisation news and views.

Just as I find it ironic that some people seem to have 16 hours a day to crawl the forums and update their blogs I also find it worrying at how often I disagree with what's been said and how often cliques seem to perpetuate status and standing. I suspect I'll skirt with controversy here.

That said, let's get this important disclaimer in: this blog represents my views and not the views of the company I work for. Personal views, such as those you'll find in this blog, change, evolve, twist and change again. I reserve the right to flip-flop, u-turn, reverse or bathe in sea changes.

I'm also new to Blogger. Here's my first discovery: the Firefox version of Google's toolbar does not spellcheck properly in the "Edit HTML" view of the Posting screen. The input window turns blue but no errors are found, and you need to click the ABC button to disable the effect (rather than being able to mouse-click on the form and select "Stop" from there).

Here's the next discovery: if you start spellchecking in "Compose" view, then flick to "Edit HTML" without turning the spellcheck off, the HTML inherits all the style commands from the toolbar, and going back to "Compose" hardwires these changes into the actual blog post.

The next step for me is to move the blog to a new domain.