August 14th 2012

This post is a case study of our experience pioneering the implementation of hreflang sitemap solutions back in 2012.

When hreflang first launched in December 2011, there was one method for implementing it on your website. This involved adding the appropriate mark-up to each page of your website, telling search engines that page X was for one particular language and/or region/country, and that pages Y, Z, etc. were for other respective locations.

Google gave us guidelines on how to plan this, and essentially you would add something like the following code to the <head> of every page of every one of the websites:

<link rel="alternate" hreflang="en" href="" />
<link rel="alternate" hreflang="en-gb" href="" />
<link rel="alternate" hreflang="en-us" href="" />
<link rel="alternate" hreflang="de" href="" />

Example taken from Google's rel="alternate" hreflang="x" guidelines.

Google's examples show a sub-domain being deployed, but the beauty of this is that it works across different domains and subdomains, and could even be used on a sub-folder international SEO setup.

Whilst this is a good way to tell Google about what content is for which location, it is not without issues.

You have to produce a map in advance so that you can plan which pages are linked together in this way, you have to determine which region and/or language you are targeting with each version of each page, and you have to plan the mark-up that is going to be deployed.

On a medium to large scale ecommerce site, this mapping could involve a hell of a lot of work, let alone working out the deployment of the code. If you are doing this for several languages, for several thousand pages, it just seems overwhelming.
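To make that mapping step concrete, here is a minimal sketch of how a hand-built map of page variants could be turned into the repeated block of <link> tags that every variant page carries in its <head>. The domains and the helper name here are hypothetical placeholders, not from any site discussed in this post:

```python
# Minimal sketch: given a hand-built map of hreflang code -> URL for one
# page's variants, emit the identical block of <link> tags that would go
# in the <head> of every variant. All domains are hypothetical.
page_variants = {
    "en": "http://www.example.com/",
    "en-gb": "http://www.example.co.uk/",
    "en-us": "http://www.example.com/us/",
    "de": "http://de.example.com/",
}

def hreflang_tags(variants):
    """Build the block of alternate tags shared by all variant pages."""
    return "\n".join(
        '<link rel="alternate" hreflang="%s" href="%s" />' % (code, url)
        for code, url in sorted(variants.items())
    )

print(hreflang_tags(page_variants))
```

The key point the sketch illustrates is why the workload balloons: every page's variant group needs its own complete map, and every page in that group carries the full set of tags.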

rel="alternate" hreflang="x" Sitemaps

Around May of this year, Google announced another way of doing this, through the use of XML sitemaps, and not too long after, we started working with a customer for whom this was a perfect fit.

The XML sitemap approach takes the largest element of the work out of the equation: you essentially just have to map the variations, work out the language and country code for each region, and then use that map to create the appropriate XML sitemap.

I spent a lot of time talking to a lot of SEOs, and it seemed that everyone wanted to implement this but had no idea whether it would actually work as intended. Many were also encountering nervousness from clients about implementing it, presumably because there was already some visibility in all regions and no desire to rock the boat unnecessarily.

For our client, it made perfect sense – they operate in 3 regions, the UK, US and Australia, with top level domains for each market. However, the .com website was ranking in all regions, causing all sorts of billing headaches and the like.

A few methods had been tried before our involvement, such as IP delivery and currency selection, but none of these fully resolved the core issue: users were reaching a website that wasn't appropriately priced for them.

We agreed that we would proceed with implementing this sitemap approach after tidying up a number of other necessary changes across the group of websites.

The approach we implemented led us to create an XML sitemap along the lines of the following:

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc></loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="" />
    <xhtml:link rel="alternate" hreflang="en-gb" href="" />
    <xhtml:link rel="alternate" hreflang="en-au" href="" />
  </url>
</urlset>
It was, of course, quite a bit bigger than this – this mini sitemap gives the variations of the home page that we used for each region. For each additional <url>, you would set the main <loc>, and then give details of the alternate used in each region.
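If you were generating a sitemap like this programmatically, the structure above falls out of the same variant map quite naturally. Here is a sketch using Python's standard `xml.etree.ElementTree`; the domains are hypothetical placeholders, and the choice of which variant becomes the <loc> (here, arbitrarily, the en-us one) would depend on which site the sitemap is being written for:

```python
# Sketch: generate an hreflang XML sitemap from a list of variant maps.
# One sitemap is written from one site's point of view, so each <loc> is
# that site's own URL for the page. Domains are hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML_NS = "http://www.w3.org/1999/xhtml"

def build_sitemap(pages):
    """pages: list of dicts mapping hreflang code -> URL."""
    ET.register_namespace("", SITEMAP_NS)
    ET.register_namespace("xhtml", XHTML_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for variants in pages:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        loc = ET.SubElement(url, "{%s}loc" % SITEMAP_NS)
        loc.text = variants["en-us"]  # this sitemap's own site (assumed .com)
        for code, href in sorted(variants.items()):
            ET.SubElement(url, "{%s}link" % XHTML_NS,
                          rel="alternate", hreflang=code, href=href)
    return ET.tostring(urlset, encoding="unicode")

home = {
    "en-us": "http://www.example.com/",
    "en-gb": "http://www.example.co.uk/",
    "en-au": "http://www.example.com.au/",
}
print(build_sitemap([home]))
```

In a real deployment you would append one <url> entry per page of the site, all driven from the same map used to plan the project.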

There are also further details from Google about the implementation of XML sitemaps.

For starters, as it was the .com URL that was ranking in all regions, we decided to only implement this for the .com website (something I'd probably think differently about, and produce extra sitemaps for, if there were lots of sites ranking in the various regions).

If you did want to implement this on each site, as I understand it, you’d need to create a variation of this for each specific website, as the main <loc> would vary from site to site – but this assumes top level domains and is something I’d want to examine on a case by case basis.

The Results

After a week of impatiently waiting, and checking on a near-hourly basis, something began to change. UK rankings, where the .com had always appeared despite being the US-centric website, started to show the local domain instead.

I quickly checked visibility for a number of keywords we'd seen driving traffic to the website and, searching from the UK, everything had switched domains. I rushed off to check US and Australian visibility, and in every instance the correct website was showing in the correct region.

So, in short, it worked perfectly! I’ve come back to it today, and can see marked increases in traffic from search for both regional domains since this has started to kick in, and the visibility in each region is still leading users to their correct version of the website.

I should note that, as I had expected, ranking positions didn't change at all as a result of this. That really wasn't the goal – it was all about making sure that the content being served to users was correctly localised for them.

I think this is something to think about and plan very carefully – get the implementation of this technique wrong, and it could cause all sorts of havoc to visibility (if you say that the UK site is Australian, and Google pays attention to this, it's clear to see what could go wrong).

That said, in the right circumstances, this is clearly a very powerful tactic to serve up the correct localised results in each region that you are targeting. When coupled with other international SEO work to increase visibility of those websites in each region, you can really give international targeting a good kick in the right direction, particularly when starting out in a new region for the first time. It seems that this has effectively allowed us to piggyback off the visibility one of the existing web properties already had.