HTML and XML Sitemap Creation Process

Sitemaps can help with the crawling and indexation of a website. Below are the two types of sitemaps we currently use, the process for creating an HTML and an XML sitemap, and how to keep them up to date.

Companies often ask, “Who should own the sitemap creation process?” In our opinion, HTML and XML sitemap creation should be owned by the SEO team, which should work closely with the content and development teams to define sitemap strategy and keep the HTML and XML sitemaps up to date.

XML Sitemap


An XML sitemap is created according to a specific protocol and helps the search engines discover pages they might not otherwise find because of poor website architecture, or because the site is built in Flash or another technology that is difficult to crawl. These sitemaps do not directly help with rankings; they simply give the search engines a list of the URLs on a website. Sitemaps should not contain pages that redirect or pages that return a 404.
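
As a point of reference, a sitemap file that follows the Sitemaps.org protocol is a <urlset> of <url> entries, each with at least a <loc>. The short Python sketch below builds one with the standard library; the domain and page URLs are placeholders, not part of any specific site.

# Sketch: build a minimal Sitemaps.org-style XML sitemap with the standard library.
# The domain and page URLs below are placeholders.
import xml.etree.ElementTree as ET
from datetime import date

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(page_urls, out_path="sitemap.xml"):
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for page_url in page_urls:
        # Each page gets a <url> entry with its address and last-modified date.
        url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = page_url
        ET.SubElement(url_el, f"{{{SITEMAP_NS}}}lastmod").text = date.today().isoformat()
    ET.ElementTree(urlset).write(out_path, encoding="utf-8", xml_declaration=True)

build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/products/widget-a",
])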

If a website has many different sections with multiple pieces of content per section, it might make sense to create a unique XML sitemap file for each section. This helps with monitoring indexation rate per section, and can help identify areas of the website that, for whatever reason, are not being indexed.
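
One way to implement that split, sketched below in Python, is to keep one sitemap file per section and reference them all from a sitemap index file (the <sitemapindex> element defined by the same protocol). The section file names and domain are placeholders.

# Sketch: a sitemap index that points at one sitemap file per site section,
# assuming the per-section files have already been generated and uploaded.
# Section file names and the domain are placeholders.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", SITEMAP_NS)

section_files = ["sitemap-products.xml", "sitemap-blog.xml", "sitemap-support.xml"]

index = ET.Element(f"{{{SITEMAP_NS}}}sitemapindex")
for filename in section_files:
    # One <sitemap> entry per section makes it easier to watch indexation per section.
    entry = ET.SubElement(index, f"{{{SITEMAP_NS}}}sitemap")
    ET.SubElement(entry, f"{{{SITEMAP_NS}}}loc").text = f"https://www.example.com/{filename}"

ET.ElementTree(index).write("sitemap_index.xml", encoding="utf-8", xml_declaration=True)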

XML Sitemap Creation and Editing

  • New page is created or deleted
  • Page is added to or removed from the XML sitemap
  • New sitemap file is sent for upload
  • QA is done to ensure proper deployment (see the sketch after this list)
  • Google Webmaster Tools is notified of the changes
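
To cover the QA step above, one simple check is to re-fetch the deployed sitemap and confirm that every <loc> URL answers with a 200 rather than a redirect or a 404. The sketch below assumes the third-party requests library and a placeholder sitemap URL.

# QA sketch: confirm every URL in a deployed XML sitemap returns 200
# (no redirects, no 404s). Assumes the third-party `requests` library;
# the sitemap URL is a placeholder.
import xml.etree.ElementTree as ET
import requests

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
SITEMAP_URL = "https://www.example.com/sitemap.xml"

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.iter(f"{NS}loc"):
    resp = requests.head(loc.text, allow_redirects=False, timeout=10)
    if resp.status_code != 200:
        print(f"Check {loc.text}: returned {resp.status_code}")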

For specifics about creating an XML sitemap, please visit Sitemaps.org.

HTML Sitemap


An onsite HTML sitemap can be helpful to the search engines because it gives them a path to crawl content, but it is not an optimal way to pass many of the algorithmic metrics that increase rankings or traffic.
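
In practice an HTML sitemap is just an ordinary page of crawlable links, often grouped by section. The sketch below generates such a page from a hard-coded list; the sections, URLs, and anchor text are placeholders, and a real build would pull them from the CMS.

# Sketch: generate a simple HTML sitemap page of plain crawlable links,
# grouped by section. All sections, URLs, and titles are placeholders.
from html import escape

sections = {
    "Products": [("https://www.example.com/products/widget-a", "Widget A")],
    "Blog": [("https://www.example.com/blog/launch-post", "Launch Post")],
}

parts = ["<h1>Sitemap</h1>"]
for section, links in sections.items():
    parts.append(f"<h2>{escape(section)}</h2>")
    parts.append("<ul>")
    for url, title in links:
        # Plain <a href> links keep the page crawlable without JavaScript.
        parts.append(f'  <li><a href="{escape(url)}">{escape(title)}</a></li>')
    parts.append("</ul>")

with open("sitemap.html", "w", encoding="utf-8") as f:
    f.write("\n".join(parts))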

Editing or Creating an HTML Sitemap

When a new page is created or deleted, the HTML sitemap should be edited. For sites built in Flash or with JavaScript navigation, this page is one of the only ways the search engines can crawl the site and apply metrics, which makes it vital to keep it up to date.

  • New page is created or deleted
  • A new, direct link is added to or removed from the sitemap
  • The SEO and QA teams validate the link by checking that it resolves to the correct page (see the sketch after this list)
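
One way to support that validation step is to pull every link off the live HTML sitemap page and confirm each one resolves directly to a 200. As in the earlier sketch, the requests dependency and the page URL are assumptions.

# Sketch: collect every absolute link on the HTML sitemap page and confirm
# each one resolves directly to a 200 (no redirect, no 404). Assumes the
# third-party `requests` library; the page URL is a placeholder.
from html.parser import HTMLParser
import requests

SITEMAP_PAGE = "https://www.example.com/sitemap.html"

class LinkCollector(HTMLParser):
    """Collects the href of every absolute <a> link on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href.startswith("http"):
                self.links.append(href)

collector = LinkCollector()
collector.feed(requests.get(SITEMAP_PAGE, timeout=10).text)
for link in collector.links:
    status = requests.head(link, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"Check {link}: returned {status}")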