This session was moderated expertly by Greg Jarboe. It featured three newspapers: the Telegraph, the Guardian and the Chicago Tribune.
Paul Roach, Technical Lead for SEO, guardian.co.uk
In a presentation entitled "Pushing the Crawlers Around", Paul opens by checking whether the "woman from Google" is in the room. No? Good, he says, we'll get the full story!
We begin by looking at the Guardian's CMS, which was designed for users but also for search. The CMS makes use of "keyword pages". These aren't pages of automated content; they are pages built automatically from editorial content. How do they fare in Google's crawling? These pages are crawled by Google every 30 minutes, compared with every 3 minutes for the section pages.
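The keyword-page idea can be sketched very simply: each article carries editorial tags, and a tag's page is just the collection of articles sharing that tag. This is a minimal sketch, not the Guardian's actual CMS code; the data structures and field names are invented.

```python
from collections import defaultdict

def build_keyword_pages(articles):
    """Group articles by tag, so each tag becomes a 'keyword page'.

    Each article is a dict with a "title" and a list of "tags";
    the return value maps tag -> list of article titles.
    """
    pages = defaultdict(list)
    for article in articles:
        for tag in article["tags"]:
            pages[tag].append(article["title"])
    return dict(pages)
```

In a real CMS the page would of course hold links and metadata rather than bare titles, but the principle is the same: no hand-curation is needed once articles are tagged.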
Articles ping Google immediately on publication. These keyword pages help build links to these pages; in fact, hot topic pages, network front pages, section pages and a lot more all point back to article pages.
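The "ping Google on publication" step can be sketched using Google's public sitemap ping endpoint (Google has since retired it, but it was the standard mechanism at the time). The sitemap URL below is hypothetical; the Guardian's actual implementation isn't described in the session.

```python
from urllib.parse import quote
from urllib.request import urlopen

PING_ENDPOINT = "https://www.google.com/ping?sitemap="

def build_ping_url(sitemap_url: str) -> str:
    # URL-encode the sitemap address so it survives as a query parameter
    return PING_ENDPOINT + quote(sitemap_url, safe="")

def ping_google(sitemap_url: str) -> int:
    # Fire the ping; Google returned 200 for a well-formed request
    with urlopen(build_ping_url(sitemap_url)) as resp:
        return resp.status

# Example (a live network call, so commented out):
# ping_google("https://example.com/news-sitemap.xml")
```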
The goal isn't just to get the article pages indexed but to create pages that match searcher intent.
During the Mumbai terror attacks the Guardian created a keyword page, rose to the top of the onebox and outranked the BBC and Wikipedia for related search terms. Once created, these keyword pages need no more human intervention.
Paul suggests the Huffington Post is another example of a publisher who uses this technique.
A tip for publishers: establish your crawl rate, create tagging between these pages and then automate this process.
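The "establish your crawl rate" step can be approximated by counting Googlebot hits per URL in a standard combined-format access log. This is a rough sketch under that assumption; the log format, regex and identification by user-agent string (rather than reverse DNS, which a production setup should use) are simplifications.

```python
import re
from collections import Counter

# Matches the request path and trailing user-agent of a combined-format log line
LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" .* "(?P<agent>[^"]*)"$')

def googlebot_hits(log_lines):
    """Count hits per path where the user-agent claims to be Googlebot."""
    hits = Counter()
    for line in log_lines:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("path")] += 1
    return hits
```

Dividing these counts by the time window the log covers gives a per-page crawl rate, which is the baseline to watch as you add tagging between pages.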
Julian Sambles, Head of Audience Development, Telegraph
When it comes to the Telegraph's priorities: number one is the core content, number two is the staff and number three is adaptability.
Preaching SEO as a "hygiene" factor, the Telegraph have made the effort to teach their technical staff and journalists about search, social and real-time. The core message is that online is very different from offline. There's a finite amount of space and a controlled flow in a printed newspaper. That's not the case online: the web is infinite and the points of entry are different.
People are driven to search by other channels. This could be a TV program, breaking news or an advert. The Telegraph understand this. It means that entry to the website isn't always by the "front door".
The Telegraph have invested in making their stories stand out. "Up Yours Delors" and "Gotcha" make for great offline headlines; that isn't the case online. An important rule for the Telegraph is to avoid keyword stuffing. They won't, for example, mix Kate Winslet keywords with a stock market story. The journalists are trained on SEO and then they're in charge. It's the journalists, not the online marketing team, who get to headline the stories.
The best way to deal with real-time search is to train the newspaper's staff. This means they don't need to ask for help; they can just get on with publishing the story.
Brent Payne, SEO Director, Tribune
The Tribune optimises for Google, Google and Google. Perhaps a little bit of Bing. The rest are not worth it.
There are now more ways for content to get into Google: images, music, maps, etc.
There are lots of specialised Googlebots, says Brent, and these bots pick up small changes on a page. There's one, for example, that picks up title changes.
There are factors other than PageRank and links that help determine ranking. If you're a local site you may rank better for local stories, for example, or a link-free citation may also boost your position.
He discusses some of the recent Google News changes; re-visiting pages, coping with new URLs for updates, working out the originator of the story and expecting stories to be updated. Are Google actually living up to these changes, though? Brent's research shows that between 35% and 45% of Tribune content is re-crawled. Hot topics, based on searches, are more likely to be re-crawled by Google.
Google has slowed down how quickly a 301 redirect will pass the relevance from a changed news URL. Brent suspects this is to combat Google Trends spamming. This hinders the ability to publish stories and then 301 to an updated story later on.
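The "publish, then 301 to the updated story" pattern Brent describes can be sketched as a minimal WSGI redirect map. The URLs and application shape here are invented for illustration; any web server or framework can issue the same header.

```python
# Hypothetical map from the breaking-news URL to its updated replacement
REDIRECTS = {
    "/news/story-breaking": "/news/story-updated",
}

def app(environ, start_response):
    target = REDIRECTS.get(environ["PATH_INFO"])
    if target:
        # A 301 tells crawlers the story has moved permanently; Brent's
        # point is that Google now passes relevance through this more slowly
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<h1>Story</h1>"]
```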
The Tribune has also seen poor performance in Google's ability to find the original source of a story. Across the Tribune's newspapers, when they mention the paper that published the story first, they rarely see Google acting on this.
Publishers without a Google News Sitemap are at a huge disadvantage. It's all about speed: getting Google News to detect the new story quickly. The Tribune spends time watching Google Trends to see what people are searching for.
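A Google News Sitemap is a small XML extension to the standard sitemap protocol. The sketch below generates one following the public news-sitemap schema; the article data and publication name are invented, and a real feed would include only recent articles.

```python
from xml.sax.saxutils import escape

TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
{entries}
</urlset>"""

ENTRY = """  <url>
    <loc>{loc}</loc>
    <news:news>
      <news:publication>
        <news:name>{pub}</news:name>
        <news:language>{lang}</news:language>
      </news:publication>
      <news:publication_date>{date}</news:publication_date>
      <news:title>{title}</news:title>
    </news:news>
  </url>"""

def news_sitemap(articles, pub, lang="en"):
    """Render a Google News sitemap for a list of article dicts."""
    entries = "\n".join(
        ENTRY.format(loc=escape(a["url"]), pub=escape(pub), lang=lang,
                     date=a["date"], title=escape(a["title"]))
        for a in articles
    )
    return TEMPLATE.format(entries=entries)
```

Regenerating this file on every publish, combined with the ping described earlier, is what gets a story in front of Google News fast.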
For any celebrity news, Microsoft's xRank is recommended.
Social and search are moving closer together. Brent suggests Google's found a crazy amount of secondary connections for social relevance. This raises privacy concerns but it is the future.