June 16th 2022

How would any of us successfully travel without maps?

We wouldn’t.

Maps are an important tool we use to travel from point A to point B. And just like in real life, maps help search engines navigate through your website from one page to another.

So, what exactly does an XML Sitemap look like, and how can you create one to give search engine crawlers an easy way to navigate your website?

In this article I will not only explain what an XML Sitemap is, but, most importantly, give you some tips to maximise its potential SEO benefits!

What is an XML Sitemap?

An XML Sitemap is a list of URLs or pages accessible on your website. You can think of it as a map – or list of addresses – for your website that helps search engines like Google or Bing better understand and efficiently navigate your site without wasting crawl resources.

In addition to informing crawlers of your website’s URLs, an XML sitemap can also provide valuable information about the pages of your website; for example, when a page was last updated, how often it changes, and whether there are any alternate language versions of the page.
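
For illustration, a very small sitemap might look something like the sketch below; the URL and date are placeholders, and the <lastmod> and <changefreq> tags are optional:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://mywebsitedomain.com/example-page/</loc> <!-- placeholder URL: the page you want indexed -->
      <lastmod>2022-06-16</lastmod> <!-- placeholder date: when the page was last updated -->
      <changefreq>monthly</changefreq> <!-- optional hint: how often the page tends to change -->
    </url>
  </urlset>

Each additional page gets its own <url> entry in the same file.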

You can read more about how XML Sitemaps work in this article: What is an XML Sitemap?

What are the SEO benefits of having an XML Sitemap?

Sitemaps are useful for technical SEO because they help search engines crawl and index your website more efficiently. Technical SEO is the process of improving the technical aspects of your website to increase your ranking in SERPs (search engine results pages).

XML sitemaps tell Google and other search engines:

  • What to crawl and index – Google, Bing and other search engines will be notified of new pages much faster, which results in a faster indexation process and quicker display of the website in search results.
  • The architecture of your website – When crawling a website, search engines can find pages that are buried deep in your website architecture and hadn’t been discovered before. There is no guarantee that all of them will be indexed, however.
  • When your content was last updated – Using the <lastmod> tag in your XML Sitemap you can notify crawlers when a page was last updated. This helps search engines reindex your page and keep up with the changes.
  • Translated versions of content – Defining “alternate” versions of your pages in different languages within your XML Sitemap file helps search engines “group” pages together and serve the correct language version to users, as shown in the example below.
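
To illustrate that last point, alternate language versions can be declared with the xhtml:link extension of the sitemap protocol, roughly like this (the URLs are placeholders):

  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
          xmlns:xhtml="http://www.w3.org/1999/xhtml">
    <url>
      <loc>https://mywebsitedomain.com/en/page/</loc>
      <!-- placeholder URLs: list every language version of the page, including the page itself -->
      <xhtml:link rel="alternate" hreflang="en" href="https://mywebsitedomain.com/en/page/"/>
      <xhtml:link rel="alternate" hreflang="de" href="https://mywebsitedomain.com/de/page/"/>
    </url>
  </urlset>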

Best sitemap practices to follow to boost your SEO

  1. Generate sitemaps automatically – With the right tools, such as a built-in XML Sitemap generator or popular plugins for platforms like WordPress and Magento, you can easily create a dynamic sitemap without much manual work. Dynamic sitemaps are those that automatically update, based on a set of rules, when pages are added or removed.
    As a result, your sitemap stays up to date, which is another sitemap best practice you should follow.
  2. Place the sitemap in the root directory – Placing the sitemap in the root directory of your web server makes it easier for search engines to discover it.
    Example: https://mywebsitedomain.com/sitemap.xml
  3. Include only canonical URLs in the sitemap – The XML Sitemap should only contain URLs that you wish to be indexed by search engines. A canonicalised URL (one whose canonical tag points to a different URL) is an explicit statement to search engines that you do not wish for that URL to be indexed, and instead want the canonical URL to consolidate indexing signals. Therefore, including a canonicalised URL in a sitemap sends conflicting signals to search engines, may influence what they consider to be the canonical URL, and may result in unintended URLs being indexed.
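    For example, if https://mywebsitedomain.com/some-page/?sort=price contains the (placeholder) canonical tag below in its <head>, that URL is canonicalised and should be left out of the sitemap:
    <link rel="canonical" href="https://mywebsitedomain.com/some-page/" />
    Only the canonical URL itself, https://mywebsitedomain.com/some-page/, belongs in the sitemap.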
  4. Remove pages with errors – Before submitting your sitemap to Google, a good SEO practice is to use a tool like XML Sitemap Validator or Screaming Frog to quickly weed out errors in the sitemap XML files. Errors in an XML Sitemap can be caused by pages that search engines can’t crawl, 4XX errors, too many URLs in the sitemap, or non-canonical URLs. Search engines might not be able to index every page in your sitemap if it contains any of these errors. Reading XML files manually to find issues can take a long time, and you would also need to test each URL, so a sitemap validator lets you skip that work and fix the errors quickly.
  5. Avoid including pages with 3XX redirections – As John Mueller, Webmaster Trends Analyst at Google, states on Twitter, your XML Sitemap should contain only URLs that you want indexed. So avoid including URLs that respond with a redirect.

  6. Include only the canonical URL of paginated pages – When executed properly, pagination helps search engines understand the relation between a series of pages. By doing so, we prevent possible duplicate content caused by thematic similarities, and ensure that bots index the relevant URL. That’s why a good practice is to include only the canonical URL of paginated pages in the XML Sitemap.

  7. Submit your sitemap(s) to Google – After you’ve created the sitemap and uploaded it to your website, you should then submit it to Google through Google Search Console. If you have used a plug-in or extension to automatically generate the sitemap, there is no need to upload it to your website; these tools will do that for you. Google Search Console will also show you if any pages with errors are left.
  8. Create multiple sitemaps for bigger sites – The official protocol for XML sitemaps, sitemaps.org, states that a sitemap should contain no more than 50,000 URLs and be no larger than 50MB uncompressed. The most popular search engines (Google, Bing, Yahoo) seem to follow these size recommendations. If your site has more pages than this, you will need to use multiple sitemaps, tied together with a sitemap index file. The sitemap index file identifies each of the sitemap files for the search engines, allowing for more efficient indexing.
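    The index file is itself a small XML document that simply points to the individual sitemaps, along the lines of this sketch (the file names and dates are placeholders):
    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://mywebsitedomain.com/sitemap-posts.xml</loc> <!-- placeholder file name -->
        <lastmod>2022-06-16</lastmod>
      </sitemap>
      <sitemap>
        <loc>https://mywebsitedomain.com/sitemap-products.xml</loc> <!-- placeholder file name -->
        <lastmod>2022-06-16</lastmod>
      </sitemap>
    </sitemapindex>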
  9. Deliver the appropriate sitemap to crawlers – International sites with multiple versions of pages in different languages or for different geo-targeted locations can also benefit from correlating their sitemaps. Serving the appropriate version of your website to users not only ensures the best user experience, but also helps search engines understand your website. Our free SEO Hreflang Tool can help you better map your international sitemaps and deliver the appropriate one to users.
  10. Include your sitemap URL in the robots.txt file – A robots.txt file is a simple text file located in the root directory of your website. Using the robots.txt file you can “tell” search engines which pages to crawl (and which pages not to crawl) and also which crawlers have permission to access your site. Adding the URL of your XML Sitemap to the robots.txt file is always a good SEO practice.
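    For example, a minimal robots.txt file that allows crawling and points search engines to the sitemap could look like this (the domain is a placeholder):
    User-agent: *
    Disallow:
    Sitemap: https://mywebsitedomain.com/sitemap.xml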

Conclusion

With just a little time and effort, an XML Sitemap can ensure that your website is rapidly crawled and indexed. These SEO best practices will help you generate sitemaps the right way and get the maximum benefit from them.

If you have any questions about XML Sitemaps, please feel free to contact us.