The SEO Health of Denver’s Government Website (@MayorHancock)


We are always interested in how SEO for government websites is done, and whether it’s done well. For this website audit we looked at some high-level SEO health variables (a small portion of the more than 250 that Google uses) on Denver’s government website, to determine how well they’re doing across some key onsite and technical SEO variables.

We also offered some free SEO advice and insights to help them improve their website’s SEO.

Common SEO Challenges of Government Websites

Whether you run a large federal website or manage a local municipality’s digital presence, governments and municipalities face very similar challenges when it comes to digital strategies such as SEO, social media (Twitter, Facebook, Pinterest, Instagram, Tumblr, etc.), and content strategy.

  • Multiple Microsites Instead of a Unified Focus: This can cause duplicate content and user confusion.
  • Budget and Funding Challenges
  • Content Regulations: These can limit what topics the content can target.
  • Reputation Management: There are always unhappy people within a community who, with the openness of the web, can write anything they want about a local or state government.
  • Websites Lack In-depth Content: This can cause gaps in keyword targeting and limit traffic growth.
  • Long Review Processes: These can inhibit an agile approach to optimizing a website.

What We Did

We crawled a large sample of the website to get a better idea of the SEO health of the website.

Our sample consisted of about 77,000 pages (yes, this is a lot for a government website), which in itself leads us to think there are website architecture issues creating duplicate pages – which means the Google crawler (Googlebot) could have trouble understanding the website.

Internal URL Type Breakdown

Let’s first look at the types of pages that our sample returned. As you can see from the data below, the majority of pages were HTML – but there were also a large number of PDFs on the website.

What We Found on Denver’s Government Website
internal content types
HTML: 70,000
JavaScript: 200
CSS: 223
Images: 2,421
PDFs: 4,200
Flash: 11
Other: 311


Technical SEO Breakdown

The technical breakdown analyzes variables that reflect how easy the website is to crawl and understand. These variables also affect Google’s ability to find all the pages on the website for inclusion in its index.

Response Codes

Response codes are the responses that the web server gives to the search engine crawler. A response code indicates whether the page is live (200), permanently redirected to a new page (301 redirect), temporarily redirected to a new page (302 redirect), not found (404), or whether the server failed to respond.
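The buckets used in this section can be sketched in a few lines of Python. The `classify_status` and `summarize` helpers below are hypothetical, not part of any crawler’s API:

```python
from collections import Counter

def classify_status(code: int) -> str:
    """Map an HTTP status code to the buckets used in this audit."""
    if 200 <= code < 300:
        return "ok"
    if code in (301, 302):
        return "redirect"
    if code == 404:
        return "not found"
    if 500 <= code < 600:
        return "server error"
    return "other"

def summarize(codes):
    """Count how many crawled URLs fall into each bucket."""
    return Counter(classify_status(c) for c in codes)
```

Feeding the status codes from a crawl into `summarize` produces the kind of breakdown shown below.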

What We Found on Denver’s Government Website
seo response codes
200 Response – webpage found and is ok: 24,000
301/302 Redirect Response – webpage is redirected to another page: 45,000
404 Response – webpage is not found: 10,461
500 Response – server had a problem returning the webpage: 7

SEO Insight
The most troubling part of this section of the analysis is the large number of redirected pages. These redirects not only cause ranking value to be lost; they also suggest that a large number of pages have had their URLs changed without the internal links pointing to them being updated.

I would update as many of the internal links as possible to help crawlability and preserve ranking value.

URL Structure

A consistent URL structure that follows basic URL rules helps Google (and users) understand the organizational structure of the website. URLs should be all lowercase, use dashes for spaces, avoid parameters, and stay under 115 characters.
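These rules are easy to check programmatically. Below is a rough sketch of a URL validator using only the standard library; a real crawler would apply more nuanced checks, and `url_issues` is an illustrative name, not an existing tool:

```python
from urllib.parse import urlparse

def url_issues(url: str, max_len: int = 115) -> list[str]:
    """Flag a URL that breaks the basic rules described above."""
    issues = []
    path = urlparse(url).path
    if any(ch.isupper() for ch in path):
        issues.append("uppercase")
    if "_" in path:
        issues.append("underscores")
    if urlparse(url).query:
        issues.append("parameters")
    if len(url) > max_len:
        issues.append("over 115 characters")
    return issues
```

Running this over a crawl’s URL list yields counts like the ones reported below.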

What We Found on Denver’s Government Website
URL structure
Underscores: 1,300
Uppercase: 12,300
Duplicate: 169
Parameters: 1,455
Over 115 Characters: 66,484

SEO Insight
What we found was that there are major concerns with URL structure. This can cause unwanted SEO results and confuse both Google and users.

While I would like to see consistency in the URL structure, given the size of the website it might make more sense to put a governance policy in place for any new pages that are created and leave the current URLs intact.


Directives

Directives help the search engines better understand what you want them to do with your site, and how you want them to crawl and index it.

  • Canonical Tags tell the search engines to consolidate duplicate pages towards one canonical version that the website owner wants to be the primary page for the group of duplicated content.
  • Index Tags tell the search engines to index the page – although it is not necessary to tell them to, they do this by default.
  • NoIndex Tags tell the search engines to omit this page from their index.
  • Follow Tags tell the search engines to follow the links that are on the page – again, this is not necessary, as they do this by default.
  • Nofollow Tags tell the search engines to not follow the links on a page.
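As a sketch of how these directives can be pulled out of a page, here is a small extractor built on Python’s standard-library HTML parser. The class name and sample markup are made up for illustration:

```python
from html.parser import HTMLParser

class DirectiveParser(HTMLParser):
    """Collect robots meta directives and the canonical URL from one page."""
    def __init__(self):
        super().__init__()
        self.robots = []       # e.g. ["index", "follow"] or ["noindex"]
        self.canonical = None  # href of <link rel="canonical">, if any

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots = [d.strip().lower() for d in a.get("content", "").split(",")]
        elif tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

parser = DirectiveParser()
parser.feed('<head><meta name="robots" content="index, follow">'
            '<link rel="canonical" href="https://example.com/page"></head>')
```

Running a parser like this across the crawl is how directive counts like the ones below are tallied.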

What We Found on Denver’s Government Website
search engine directives
Canonical: 2,380
Index: 15,180
Follow: 15,180
Nofollow: 5

SEO Insight
A lack of consistent use of Canonical Tags can be an issue for sites that have a large number of pages.

Not only should Canonical Tags be used to consolidate ranking value for duplicated pages; pages that are not duplicated should also carry a self-referencing Canonical Tag, to help manage duplication that is outside the website owners’ control.

We also see that they are using the Meta Index, Follow tag – which is not necessary, since Google does both by default.

Response Times

Slow response times can hinder Google’s ability to crawl a website, and negatively affect the user experience. In extreme cases, Google can lower the ranking of a page with a very slow response time.

What We Found on Denver’s Government Website
website response times

SEO Insight
We didn’t find any concerns with response times for the website.


Page Titles

Page Titles (Title Tags) are one of the primary on-page signals that Google and other search engines use to understand the topic relevancy and target of a web page. Title Tags should be keyword focused, entice the user to click on them when shown in the search results, and be under 70 characters.
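Once titles are collected from a crawl, flagging the problem categories is straightforward. This is a hypothetical helper, with thresholds matching the ranges this audit reports:

```python
from collections import Counter

def title_report(titles):
    """Summarize title-tag issues across a crawl sample.

    `titles` is a list of title strings, one per page; an empty
    string or None means the page had no title tag.
    """
    present = [t for t in titles if t]
    dupes = Counter(present)
    return {
        "missing": len(titles) - len(present),
        "duplicate": sum(n for n in dupes.values() if n > 1),
        "over_65": sum(1 for t in present if len(t) > 65),
        "below_30": sum(1 for t in present if len(t) < 30),
    }
```

The same pattern works for meta descriptions and H1 tags, with the length thresholds adjusted.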

What We Found on Denver’s Government Website
seo page titles

Missing: 16
Duplicate: 14,019
Over 65 Characters: 1,180
Below 30 Characters: 10,459
Same as H1 Tag: 71

SEO Insight
The first concern we notice is the large number of duplicate title tags. This can cause Google to filter many of these pages out of the results for relevant search terms – a lost opportunity to rank for the topics these pages target.

If these pages are found to be valuable from a user or SEO standpoint, I would rewrite the Title Tags for these pages to help with indexing and relevancy matching.

Meta Description

Meta Description Tags don’t have a direct impact on search rankings, but they can help improve click-through rates from the SERPs by giving the user more information about the topic of the page. Description tags should include the primary and secondary topics of the page (in a conversational sentence), be unique for each page, and be around 160 characters.

What We Found on Denver’s Government Website
seo meta description
Missing: 254
Duplicate: 14,990
Over 156 characters: 377
Below 70 Characters: 14,330

SEO Insight
What we found was a large number of duplicate description tags, as well as a large number of under-utilized tags.

These should be rewritten – or, in the case of the under-utilized tags, expanded – to give a more focused outline of what each page is about. Depending on the database structure of the website, this can usually be done dynamically for each page.

Meta Keywords

Meta Keyword Tags are not used to help in ranking a webpage, and can actually cause harm if the page over-uses them, or a website duplicates them across a large number of pages.

What We Found on Denver’s Government Website
seo meta keywords
Missing: 254
Duplicate: 14,379

SEO Insight
What we found was a lot of duplication of these tags, which can be sending negative signals to the search engines.

A simple way to create these tags (if you need to use them at all) would be to take the title tag, description tag, H1 and all H2 tags, strip out stop words (and, the, it, etc.), and create a list of keywords from the remaining words.
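That recipe can be sketched directly in Python. The stop-word set here is a tiny illustrative sample, not a complete list:

```python
import re

# Tiny illustrative stop-word set; a real implementation would use a fuller list.
STOP_WORDS = {"and", "the", "it", "a", "an", "of", "to", "in", "for", "on", "etc"}

def meta_keywords(title, description, h1, h2s):
    """Build a keyword list as described above: pool the title, description,
    H1 and H2 text, strip stop words, and keep the rest in order, deduped."""
    text = " ".join([title, description, h1, *h2s]).lower()
    words = re.findall(r"[a-z0-9]+", text)
    seen, keywords = set(), []
    for w in words:
        if w not in STOP_WORDS and w not in seen:
            seen.add(w)
            keywords.append(w)
    return ", ".join(keywords)
```

For example, a parks page whose title, description, and headings repeat "Denver Parks" would yield a short deduplicated keyword list rather than repeating the phrase.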

H1 Tags

Google uses the H1 tag to help understand the primary topic of a page, thus H1 tags should represent the primary topic of each page.

They should be short and concise, unique per page, and there should only be one H1 tag for each page on a website.
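The one-H1-per-page rule can be spot-checked with a quick regex sketch; a real audit would use an HTML parser rather than regexes, and `h1_issues` is an illustrative name:

```python
import re

def h1_issues(page_html: str) -> dict:
    """Count H1 tags on a page and flag the missing/multiple cases."""
    h1s = re.findall(r"<h1[^>]*>(.*?)</h1>", page_html, re.I | re.S)
    return {"count": len(h1s), "missing": not h1s, "multiple": len(h1s) > 1}
```

Applied page by page, this produces the "Missing" and "Multiple" counts reported below.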

What We Found on Denver’s Government Website
h1 tags
Missing: 1,150
Duplicate: 14,270
Over 70 Characters: 132
Multiple: 1,305

SEO Insight
What we found was a large number of duplicate H1 tags, and even some pages that were missing them altogether. In the case of duplication, this can cause user and search engine confusion; and in the case of missing tags, it represents a missed opportunity to increase ranking value.

These should be rewritten to better represent the topics of each page.

H2 Tags

H2 tags represent the sub-topics of the H1 tag. Whereas there should only be one H1 tag, there can be multiple H2 tags (sub-topics) under the primary topic of the page.

What We Found on Denver’s Government Website
h2 tags
Missing: 1,123
Duplicate: 12,990
Over 70 Characters: 421
Multiple: 12,320

SEO Insight
The major concern we found with this variable is that there were a large number of duplicate H2 tags, which might be stemming from the issue with the duplicate H1 tags.

If they track the duplicate H1 tag concern, I would first fix the H1 tags, and then rewrite the H2 tags to better represent the sub-topics of each page.


Images

The final onsite SEO variable we looked at was image use on the site. Images play an important role for both users and search engines, and should be optimized for both.

What We Found on Denver’s Government Website
seo for images
All Images: 2,421
Over 100kb: 371
Missing Alt Text: 1,631
Alt Text over 100 characters: 3

SEO Insight
Of the images we found, a large portion were missing alt text. Alt text not only helps users; it also gives Google another signal that helps it determine topic relevance for a web page.

Each image should carry a short description of the image as its alt text. If that description can be related to the page topic, that’s great; if not, I would not force keywords into it.
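A missing- and over-long-alt-text check like the one behind these numbers can be sketched with the standard-library HTML parser (the class name and sample markup are illustrative):

```python
from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Flag <img> tags with missing or over-long alt text."""
    def __init__(self, max_len=100):
        super().__init__()
        self.max_len = max_len
        self.missing, self.too_long = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        alt = (a.get("alt") or "").strip()
        if not alt:
            self.missing.append(a.get("src", ""))
        elif len(alt) > self.max_len:
            self.too_long.append(a.get("src", ""))

audit = AltTextAudit()
audit.feed('<img src="/park.jpg" alt="Denver city park"><img src="/logo.png">')
```

Here the second image, which has no alt attribute, lands in the `missing` list.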
