
James Rutherford

Web, design, Newcastle, games and fun!


Dom encouraged a simple approach to SEO. His points included:

  • Pick your niche: Look at what people are searching for in your market
  • Basic structure: Get decent URLs, title and heading tags
  • Content: Produce something worthwhile
  • Be realistic: High search placements are slow to gain and easy to lose
  • Remove as much duplicated content as possible
  • Judge your sources: Don’t read any SEO blogs!
  • If you’re outsourcing updates (as with any other outsourcing), check everything
  • Don’t do everything at once: Test the impact and improve incrementally
  • Be savvy: Fact-check any lines an SEO agency may be feeding you
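Dom's "basic structure" point can be made concrete with a minimal page skeleton (illustrative markup of my own, not from the talk): a descriptive URL, a single title tag, and a sensible heading hierarchy.

```html
<!-- Served at a descriptive URL, e.g. /newcastle-web-design rather than /page?id=42 -->
<html>
  <head>
    <title>Web Design in Newcastle – Example Studio</title>
  </head>
  <body>
    <h1>Web design in Newcastle</h1>
    <h2>Recent projects</h2>
    <p>Something worthwhile goes here.</p>
  </body>
</html>
```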

Google-related areas worth looking at now or in the near future:

  • Google product search
  • Google local
  • Universal Search (blogs, tweets, video, etc.)
  • Microformats
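As a concrete instance of the microformats item, an hCard wraps contact details in agreed class names so that parsers can extract them (a generic example, not one from the talk):

```html
<div class="vcard">
  <a class="url fn" href="http://example.com">James Rutherford</a>
  <span class="org">Example Studio</span>
</div>
```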

Dom presented a few Twitter anecdotes in his unique style, which illustrated:

  • The importance of context – information which you may not be communicating
  • That without context, meaning may be warped or amplified
  • Anyone may be listening (e.g. potential future employers)
  • Strangers may go to great lengths to pull a prank (well, Dom and Tim may, at least)
  • People will interact and respond in unpredictable ways

References: Dominic Hodgson (@thehodge)

Related: BBC News – Be careful what you tweet

Andrew introduced a free SEO analytics tool released by Microsoft.

The tool is distributed via the Microsoft Web Platform and runs on Vista and Windows 7. [The SEO Toolkit is independent of other components – no IIS server install is required.]

The tool parses pages as a search engine would, so it can be used to examine sites built on any underlying platform (e.g. PHP, JSP, ASP, flat HTML, etc.) and will highlight a range of potential problems. It will analyse both locally hosted and publicly released sites, so it is useful for both development and auditing.
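To illustrate the kind of crawler-style checks such an analyser performs (a hypothetical sketch of my own, not the toolkit itself), here is a minimal Python pass over a page's markup flagging a missing title or heading:

```python
from html.parser import HTMLParser

class BasicSEOChecker(HTMLParser):
    """Collects <title> and <h1> text, the sort of markup a crawler-style analyser inspects."""
    def __init__(self):
        super().__init__()
        self._current = None   # tag whose text we are currently collecting
        self.title = ""
        self.h1s = []

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1s.append(data)

def report(html):
    """Return a list of basic structural problems found in the page."""
    checker = BasicSEOChecker()
    checker.feed(html)
    problems = []
    if not checker.title.strip():
        problems.append("missing <title>")
    if not checker.h1s:
        problems.append("no <h1> heading")
    return problems

page = "<html><head><title></title></head><body><p>Hi</p></body></html>"
print(report(page))  # both checks fail for this page
```

A real analyser would of course fetch pages over HTTP and check far more (redirects, duplicate content, broken links), but the principle is the same: it sees only what a search engine sees.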

It has three main features: site analysis, a robots editor and a sitemap editor.
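The latter two edit the standard robots.txt and XML sitemap files, which look like this (generic examples, not toolkit output):

```text
# robots.txt – tells crawlers which paths not to index
User-agent: *
Disallow: /admin/

Sitemap: https://example.com/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2010-01-01</lastmod>
  </url>
</urlset>
```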

It includes a powerful query engine and is extensible (in VB.NET or C#).

A very useful talk for me – this is a tool I hadn't encountered, but I could see it fitting neatly into my web development workflow.

Slides and talk are now up on the SuperMondays blog.

References: Andrew Westgarth (@apwestgarth), CarlosAg blog

Tim isolated Google as the only worthwhile ‘optimisation’ target, and introduced some common fallacies. His notable points were:

  • Google does not crawl meta data
  • Keyword density isn’t important. Mentioning a term once on a page should be adequate
  • Content is not rated by semantic relevancy
  • Buying links is officially frowned upon, though realistically, links are very commonly ‘bought’ in one way or another – even from Google themselves
  • Markup structures are important, but not as important as most SEOs say
  • Google does analyse page structure to discount footers, advertising blocks, etc. (see ‘page segmentation analysis’).
  • SEO advice is predominantly [erm...] cobblers.
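The "meta data" point mostly concerns tags like the keywords meta tag below, which Google ignores for ranking purposes (the example markup is my own):

```html
<meta name="keywords" content="seo, newcastle, web design">
```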

Both Tim and Dominic (speaking later) recommend SEO Dojo as a (rare) worthwhile source for SEO information.

Tim accepted a few questions, outlining how you might become relisted if Google have penalised you [with a 'reconsideration request']; how problematic penalisation can be [usually 'not very']; and some potentially hostile SEO tactics and how you might try to recover from them [perhaps by employing a 'Reputation Management Consultant', though Tim's general feeling on those in that role is less than glowing...].

References: Tim Nash (@tnash)