Ensure your site is search engine-friendly and you’ll rocket to the top of Google. Margaret Manning, CEO of Reading Room, outlines the fundamentals of SEO and how to apply them in the real world, including a look at the latest trends and emerging search technologies
Search Engine Optimisation (SEO) is becoming increasingly important to every site. With more than 113 billion searches conducted in July 2009 alone, the volumes speak for themselves. Some of the techniques used have given SEO a bad name, but while many techniques are less than honest, there are several simple ways to ensure your site benefits from natural search results.
Search can be classified as either organic (natural) or paid. Organic results are those that occur naturally in search engine results pages (SERPs), and high rankings depend on both the technical construction of your site and the content within it. Paid results, often referred to as Pay Per Click or PPC, are the results site owners pay for and which usually surround the organic listings.
Research shows web users prefer organic listings to paid listings, considering them more relevant and trustworthy. The goal of SEO, then, is to improve your organic listings performance, which in turn should boost traffic to your site.
Search engines index the web using large clusters of computers running automated programs, known as bots or spiders, which crawl the web by following links found on web pages. The URLs they discover are added to the search engine’s index, and it’s this index that’s queried every time a user performs a search.
Search engines employ complex mathematical equations, known as ranking algorithms, to order search results. Google’s algorithm alone relies on more than 200 individual factors to decide which result, in which order, to return to its web searchers. Organic SEO can be further split into two categories:
- On-page: The code and content you use to manage and deliver your web pages.
- Off-page: External factors affecting SEO. This is primarily focused around link building – getting other websites to link to your content.
Here we’ll focus on on-page optimisation methods, which are all under your control. The most important thing is to maximise accessibility to ensure search engines can find all your content.
There are two ways to get discovered by search engines. One is to submit your site directly to their index (Google; Yahoo; Bing). The other is to wait for them to find it through links to you from other sites during their crawling process.
To make sure your website is accessible to search engine spiders, follow these simple steps:
- Ensure you’re not preventing the search engines indexing your site via the Robots Exclusion Protocol. A robots.txt file placed at the root of your site is used to give instructions to search engine bots.
- Ensure your content is machine-readable. Avoid using Flash, video or imagery to exclusively house your content. Remember, search spiders cannot see images or video: they can only read written text on a web page.
- Ensure you have a clear internal linking architecture. Promote important content to the homepage and link to key site sections via dedicated navigation. Group content into clear site sections reflected in your site navigation to aid both users and search engines.
- Eliminate duplicate content. This could be caused by the way your server is set up or how your CMS serves up content. Either way, this needs to be addressed. We’ll cover how to fix the most common duplicate content issues later on.
- Ensure you’re targeting the appropriate keywords for your business objectives. Just as successful advertising campaigns contain content that appeals to a target demographic, successful websites need to focus on keywords that have the highest relevance to their target audience. Researching keywords for your website is integral to your internet marketing and SEO strategy and should provide you with: insight into popular search terms relevant to your brand/site; the ability to target valuable search traffic to your website; and an improved user experience by providing relevant content. Some key metrics to investigate are:
- Keyword popularity: This details monthly search volumes for each of your keywords, providing a great indicator of the size of the potential market.
- Keyword competition: Analysis of the level of competition for your keywords. Generally the more competition there is, the harder it will be to perform for this keyword, both in terms of SEO and PPC (pay per click).
- Trend analysis: Find out when the best time of year is for pushing a particular keyword. This is a simple and great way to help choose your content and budget your online ad spend.
- Keyword targeting: Ensure your content pages are optimised for a chosen few keywords. The key here is to understand your objectives and therefore what words relevant visitors will use to find you. Sounds simple, but requires some thought!
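As an example of the robots.txt point above, a minimal file might look like the following sketch (the paths and sitemap URL are purely illustrative):

```
# robots.txt - placed at the root of your site, eg www.example.com/robots.txt
User-agent: *      # these rules apply to all search engine bots
Disallow: /admin/  # keep private sections out of the index
Sitemap: http://www.example.com/sitemap.xml
```

Beware that a single line reading `Disallow: /` blocks your entire site from being indexed, so check this file carefully before launch.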
The bad and the ugly
There are many techniques used to achieve high search results and the industry roughly splits these into two categories: Black Hat and White Hat. Black Hat SEO refers to the practice of deceiving the search engines to artificially boost your rankings. It’s always been popular as a seemingly fast and easy way of achieving high results, not on the merit of your content but by hoodwinking the search engines into the belief that your content is relevant to that term. The good news for users is that as soon as Black Hat techniques are devised, the search engines find ways of spotting them and penalising sites that use them. Some examples of Black Hat techniques include:
- Gateway pages: Creating highly optimised pages targeting a particular search term, only to redirect the user to a completely different web page. Most search engines consider this practice highly dishonest and penalise offenders accordingly.
- Keyword stuffing: Overuse of keywords in an attempt to trick search engines into thinking your content is more relevant than it is.
- Hidden content: Positioning large chunks of text off-screen or hidden using the same text colour as the background.
BMW’s death penalty
In 2006 Google delisted BMW from its index for breaching the search engine’s terms and conditions by adopting Black Hat gateway pages to rank for the key phrase ‘used car’. The gateway pages were optimised for the term but when a user clicked on the results they were redirected to BMW’s regular German homepage.
This was a huge turning point, highlighting the dangers of Black Hat techniques and proving that it isn’t just the smaller players that can get penalised. Top tip: if it sounds like Black Hat – or even a bit grey – avoid!
A recent development that’s causing much excitement is the concept of real-time search. “As social networks have evolved, the idea of searching what people are writing on the internet right now has become a reality,” said Larry Page, speaking at Google Zeitgeist Europe in May. “At first, my team laughed and didn’t believe me. Now they know they have to do it. Not everybody needs sub-second indexing but people are getting pretty excited about real-time.”
The surge in popularity of Twitter is the core driver behind the wider acceptance of real-time search. Twitter’s search enables anyone to search what users are talking about on the web as it happens. As posts are made, they’re instantly indexed and included in Twitter’s search.
Initially, Twitter hid the search facility at search.twitter.com, but in an indication of how the importance of real-time search has increased, it’s now available directly from the Twitter homepage. This is almost certainly in response to Facebook releasing a much-improved real-time search engine, alongside other real-time search engines such as Scoopler.
The power of real-time search is clear. Results in real-time are more relevant. News stories in particular can lose impact very quickly over time. Within a few hours, the news can have spread from the original source or ceased to be of any relevance. To be in the know, more and more people are turning to real-time search resources.
So you’ve achieved a high ranking in the search engines. But your work isn’t over yet. A successful SEO strategy also needs to consider how your visitors will be treated once they arrive. That’s because a good SEO strategy isn’t just about visitor numbers but about visitor conversions. The way conversions are measured will vary from site to site – it may be a sale made online, or an email address collected, for example. A suggested formula for measuring the success of transactional sites such as these is: Business (B) = Visitors (V) × Conversions (C) × Loyalty (L). This process is called Landing Page Optimisation and is all about understanding your users and optimising your web pages to boost the number of conversions.
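The B = V × C × L formula can be sketched with some illustrative numbers (all figures below are hypothetical, chosen purely to show the arithmetic):

```python
# Business value = Visitors x Conversions x Loyalty (B = V x C x L).
# All numbers are hypothetical, for illustration only.

def business_value(visitors, conversion_rate, loyalty):
    """Estimate business value as visitors x conversion rate x
    average repeat purchases per customer (loyalty)."""
    return visitors * conversion_rate * loyalty

# 10,000 monthly visitors, 2% converting, each customer buying 1.5 times
before = business_value(10000, 0.02, 1.5)  # 300 transactions

# Landing Page Optimisation lifts the conversion rate to 3%
after = business_value(10000, 0.03, 1.5)   # 450 transactions

print(before, after)
```

The point of the formula is that a modest lift in any one factor multiplies through: here a one-point rise in conversion rate adds 50 per cent to the bottom line without a single extra visitor.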
While this topic is worthy of an article in itself, the main elements are focused around a principle developed in 1898 by Elias St Elmo Lewis, who mapped out the stages involved in a consumer completing a purchase. Today this process translates perfectly onto the web and is another fundamental element of a successful optimisation strategy. The decision process is made up of four key stages, which can be remembered using the acronym AIDA.
- Awareness (Attention): If the visitor can’t find something on your web page, it may as well not exist. Think about the main goal of the particular page: are you trying to get a user to complete a form or buy a product? Whatever it is, ensure that you have a clear call to action.
- Interest: Now you have the visitor’s attention, you’ve got a split second to interest them before losing them to a click of the mouse. The key to creating interesting pages is to understand your visitors and tailor pages to meet their needs.
- Desire: Now that the visitor is interested, do you have what they want? Aim to make them feel safe, appreciated and under control. Client testimonials, video demos and facts and figures are great ways to convince a user your product is right for them.
- Action: At this point we’ve convinced the user to take action and complete our goal. The key is to make sure that the steps leading up to completing the goal are as streamlined as possible. Reduce the number of required fields in a form, remove steps from the sign-up process and aim to make the process as easy as possible.
The virtuous circle
Don’t be lulled into the false sense of security that once a site is live, your work is over. Constant monitoring of your site in order to continually improve performance is a must. And remember, the real experts on the design of your landing pages are your website visitors.
Recording information about your visitors is essential, both in terms of understanding how people are finding your website and what they’re doing there, as well as helping you plan for the future. In addition to visitor numbers, locality and pages per visit, analytics packages can provide invaluable insight into how your website is performing. Equipped with this information, you’re able to base further improvements on solid data. Some examples of interesting reports to run on your websites are:
- Top landing pages vs bounce rate: Discover the top landing pages (where users enter your site) and highlight any high bounce rates (when a user instantly leaves your site). Use these stats to identify pages that need improving.
- Most valuable users: Search traffic, returning visitors, UK visitors – segment user types and identify which are converting the most.
- Goals tracking: Identify and track goals to calculate your site’s conversion rate – the percentage of visitors who complete a goal. Correctly tracked, you’ll be able to compare conversion rates from paid search traffic, natural search traffic and display advertising.
Analysing visitors and adapting your site to improve user experience can yield fantastic results, build advocacy and help your site grow. A perfect example is the story of how by simply changing a button on its site, a leading online retailer increased revenues by $300m.
Writing for SEO
When writing content for the web, it’s important to think about your audience. Every website is different, but by following a few simple guidelines you can produce content that’s friendly to both users and search engines.
Before you start writing, ask yourself: who is my audience for this content? Why should they read this? What keywords should I target? Apply this thinking to the following areas of content:
- Friendly URLs: Boost SEO and make your URL easier to remember, improving user experience: eg www.mysite.com/recipe/rich-tea-biscuit.
- META description: Use the META description to provide a concise and accurate description of your content to entice visitors from the SERPS. Eg: ‘Make the best tasting rich tea biscuits to enjoy with your afternoon cup of tea using our favourite recipe’.
- Page title: Choose a title that effectively communicates the topic. Eg: ‘Rich tea biscuit recipe – Make the best rich tea biscuits’.
- Headings: HTML heading tags are used to structure your content. H1 is the most important tag and should be similar if not identical to your page title. Less important headings can use H2-H6.
- Page content: The first 100 words of content carry more weight and should be used to provide a brief description of your content.
- Use of images: Choose a descriptive filename eg rich-tea-biscuit.jpg. Use the ALT attribute to describe images: for example, ‘The perfect rich tea biscuit’. Provide an image caption to add context.
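Putting the guidelines above together, the rich tea biscuit example might be marked up something like this (a sketch only; the URL, filenames and copy are illustrative):

```html
<!-- Served from a friendly URL, eg www.mysite.com/recipe/rich-tea-biscuit -->
<html>
<head>
  <title>Rich tea biscuit recipe - Make the best rich tea biscuits</title>
  <meta name="description"
        content="Make the best tasting rich tea biscuits to enjoy with
                 your afternoon cup of tea using our favourite recipe">
</head>
<body>
  <!-- H1 echoes the page title; H2 structures the rest of the content -->
  <h1>Rich tea biscuit recipe</h1>
  <p>Our favourite rich tea biscuit recipe, ready in under an hour.</p>

  <h2>Ingredients</h2>
  <!-- Descriptive filename and ALT attribute, plus a caption for context -->
  <img src="/images/rich-tea-biscuit.jpg" alt="The perfect rich tea biscuit">
  <p class="caption">The perfect rich tea biscuit</p>
</body>
</html>
```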
Duplicate content is a big problem for both webmasters and search engines. It stems from duplicate versions of the same content being accessible via different URLs.
Duplicate content is widely used in Black Hat SEO to deceive search engines into indexing extra non-original content in an attempt to gain more exposure. But search engines may also penalise your site for unintentional duplicate content.
An example of duplicate content issues can be seen in URL variations such as these (typical examples):
- http://example.com
- http://www.example.com
- http://www.example.com/index.html
- http://www.example.com/?sessionid=1234
All of the above URLs would load the same content, which makes it hard for a search engine to determine which is your ‘master’ URL and also means that users may be linking to your content via multiple URLs. There are different ways to fix the issue, including updating your CMS or creating an XML sitemap of all your website URLs, but the three we’re going to cover here are:
- 301 redirects: This method permanently redirects a particular URL to another. It’s typically used when a website has moved server or the URL structure has changed. Configuring 301 redirects depends on the website/web server set-up.
- Canonical tag: In February 2009, Google, Yahoo and Microsoft announced support for a new HTML link element to combat duplicate content. The canonical link element tells search engines which version of the content is to be considered the master. To implement it, place the following code in the head section of your page template: <link rel="canonical" href="http://example.com/page.html"/>. This tells search engines only to index the master URL.
- Pre-built plug-ins: A number of pre-built plug-ins are available for popular CMSs, including WordPress, Drupal and Magento, which assist in remedying duplicate content issues.
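As an example of the first fix, on an Apache server with mod_rewrite enabled, a 301 redirect consolidating the non-www and www versions of a site might look like this sketch (swap in your own domain):

```
# .htaccess - permanently redirect example.com/... to www.example.com/...
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag marks the redirect as permanent, so search engines transfer the old URL’s ranking signals to the master URL rather than indexing both.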
The future of SEO
The search engine landscape continues to become more complex and intriguing. The launch of new search engines such as Wolfram Alpha and Bing; the partnership agreement between Microsoft and Yahoo; Google’s ‘Caffeine’ update; Facebook’s purchase of FriendFeed…
But what does all this mean for SEO? Do we need to develop advanced techniques to keep up with the advances in search technology? In a nutshell … no. Stick to the basics, create useful content, build on an accessible and adaptable platform, follow common sense and best practice when creating content and you’ll fare well.
Essentially, search engines are trying to behave naturally. If your site is more useful than your competitors’ and answers your users’ queries better, then it’s in the search engine’s best interest to promote your site.