Cloudflare's outages will harm your SEO
I really like Cloudflare and I recommend the service.
They had a spot of bad luck last week, though. First there was a router failure in London which impacted some sites.
@andrewgirdwood bandwidth provider had a router fail in London. We've routed around until they fix. Sorry for the trouble.
— Matthew Prince (@eastdakota) September 10, 2012
Then there was a bigger issue on Saturday which hit many more sites and made the news.
There will be an SEO burp for some sites as a result, but hopefully a short one.
If Google can't access your site's robots.txt then it's cautious about crawling your site at all. As it happens, some sites will have been wrestling with the Cloudflare outages just when Google was trying to access their robots.txt. Some of these sites will now be on Google's watch list. One of my blogs, Geek Native, just got the bad news email from Google.
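To illustrate why an outage hurts, here's a minimal sketch of the decision a cautious crawler faces when it fetches robots.txt. The classification mirrors widely documented search-engine behaviour (a server error on robots.txt means the crawler can't tell what's disallowed, so it backs off), but the function name and return strings are my own illustration, not Google's actual implementation.

```python
def robots_fetch_policy(status_code):
    """Map the HTTP status of a robots.txt fetch to a crawl decision."""
    if 200 <= status_code < 300:
        # Normal case: the file was served, so obey its rules.
        return "parse robots.txt and crawl what it allows"
    if status_code in (404, 410):
        # No robots.txt at all is treated as "nothing is disallowed".
        return "no robots.txt: crawl without restrictions"
    if 500 <= status_code < 600:
        # The outage case: the crawler can't tell whether it's allowed
        # in, so it postpones crawling rather than risk fetching pages
        # the site owner may have disallowed.
        return "temporary error: postpone crawling the whole site"
    return "other response: retry later"
```

So a Cloudflare outage returning 5xx errors at the wrong moment lands a site in the "postpone crawling" branch, which is exactly the caution the warning email describes.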
There's not much that can be done about it other than wait. I've used Fetch as Googlebot on the robots.txt to test and confirm Google is happy now, but I doubt that'll hasten the recrawl. Interestingly, Google's email uses a double slash between the domain and the robots.txt file name. A typo, perhaps?
Note: There could be some other weird reason why Google failed to access robots.txt on the site this week, but there have been no other outages.