Technical SEO Services

“59% of search engine optimization specialists say technical SEO is the most effective on-site strategy.”

Ranking high in search engine results is the goal of any business. As a site grows from an idea into a live website, it’s not uncommon to invest in advertising, PPC, backlink acquisition and off-site optimization. Over the years, as more content is added and the site’s structure changes, small technical issues creep in.


A lot of marketing teams will install Yoast SEO, optimize their titles, meta descriptions and content, and ignore every technical SEO issue aside from site speed.

On-site optimization is fully within your control, letting you address ranking factors such as site architecture, structured data, duplicate content, XML sitemaps, crawl errors and more.

When we provide full, hands-off Technical SEO services to our clients, they often ask: What is it, and what’s involved?

What is Technical SEO?

A site’s on-page factors and server configuration are the elements of technical optimization. Google and other search engines send crawlers to a website to index its pages, so this form of optimization focuses on:

  • Indexing
  • Crawlability

Technical SEO is broad, covering a variety of different aspects, from issues with an XML sitemap to canonical tag issues, rich snippet usage, schema markup, meta description issues and more. 

With 29% of websites having a duplicate content issue, it’s important to conduct a thorough SEO audit to uncover the underlying problems that can otherwise make ranking at the top of search results difficult.

Why is Technical SEO Important?

If your site has been added to Google Search Console, it will send alerts about these technical issues. The crawlers that scour your site need to be able to navigate it easily; when issues arise, warnings go out because those issues can affect both how your website ranks and how it behaves for users.

As a hypothetical, let’s assume a duplicate content issue was found and an expert fixed it with redirects.

A simple error can send the redirects into a loop: a user clicks a broken link or backlink to Page 1, is sent to Page 2 and is immediately sent back to Page 1.

From a user’s standpoint, this is a bad experience: the site errors out and they can never reach the part of the site they want. Web crawlers follow these same links and paths, which can leave the page unindexed.
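To make the failure concrete, here is a minimal sketch of how two well-intentioned Apache rewrite rules can trap each other; the page paths are hypothetical:

    # .htaccess – each rule "fixes" the other's page, creating a loop
    RewriteEngine On
    RewriteRule ^page-1$ /page-2 [R=301,L]   # "page-1 duplicates page-2"
    RewriteRule ^page-2$ /page-1 [R=301,L]   # added later, sends visitors straight back

A browser gives up after a handful of hops and shows a “too many redirects” error; a crawler simply abandons the path.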

Running a tool like Screaming Frog can help locate redirect errors like these, which manual quality testing often misses. Correcting a technical SEO issue can mean:

  • Better user experience
  • Easier site crawling
  • Higher indexing rates
  • Fewer Google Search Console errors
  • Optimal search engine ranking

User experience is paramount – you want users to purchase your products or services – and on-site technical issues can lead to lost customers. An issue impacting page speed can reduce conversion rates by 12% for each additional second it takes to load a site.

Elements of a Technical SEO Audit

A technical SEO checklist covers many different items. For some of them, we use tools like Screaming Frog to speed things up, but many require a manual approach. While our specialists look at a long list of elements, some of the most common include:

JavaScript

Making pages interactive is great when the right JavaScript is in place. The code runs client-side, executed in the browser, so little to no page speed is lost in the process.

Search engines have gotten a lot better at rendering JavaScript in recent years, but it can still have negative ramifications.

Google outlines best practices for JavaScript SEO, but the common issues we come across are:

  • Improper HTTP status codes
  • Issues with pagination 
  • Blocking JS file crawling
  • Hashes (#) being used improperly or with the old hashbang (#!) scheme

JavaScript is now very search engine-friendly, but a simple error or usage mistake can cause issues across a site.
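Blocking JS file crawling, for instance, is often a single careless line in robots.txt; the /assets/js/ path below is invented for illustration:

    # robots.txt
    User-agent: *
    Disallow: /assets/js/

Googlebot can still fetch the HTML, but it can’t render the page, so any content those scripts inject is never seen or indexed.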

XML sitemap

A sitemap helps crawlers by providing a roadmap to follow. Pages that aren’t in your navigation bar or linked internally can still be found if they’re in your XML sitemap. Errors can also arise at the server level, such as on Apache or NGINX servers where the rewrite rules serving the sitemap URL aren’t in place.

Some sites don’t even have a sitemap.

Search engines will look for a sitemap. If one isn’t present or there’s an error, you’re reducing your chances of key pages being indexed. Google Search Console will alert you to issues when accessing your sitemap.
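A minimal, valid sitemap follows the sitemaps.org protocol; the URLs and dates below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2021-06-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/services/technical-seo/</loc>
        <lastmod>2021-06-15</lastmod>
      </url>
    </urlset>

Submitting the file in Google Search Console surfaces parsing errors and reports how many of the listed URLs Google has actually discovered.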

Site architecture

Three clicks is all it should take for anyone visiting your site to reach any page. Architecture issues impact usability and can make it more difficult to reach vital portions of your website. 

A site audit will help identify key issues users may experience when navigating your website.

URL structure

SEO-friendly URL structures allow users and search engines to quickly know what a page is about by looking at the URL. Errors during implementation can lead to:

  • Multiple pages of the same content
  • Numerous versions of the same site
  • Missed optimization opportunities
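As a quick illustration, compare the two invented URLs below; only the second tells users and search engines what the page is about at a glance:

    https://www.example.com/index.php?id=412&cat=7
    https://www.example.com/services/technical-seo/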

Structured data

The use of structured data markup lets search engines understand more about the content on a page. Rich snippets, for example, are a great use of this information and something we try to implement for all of our technical SEO clients, because these snippets increase clicks.

Validation errors or not using appropriate data can lead to lost traffic and revenue.
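Most structured data today is added as JSON-LD in the page’s head. A sketch using schema.org’s Product type, with invented values:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "description": "A placeholder product used for illustration.",
      "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "USD"
      }
    }
    </script>

Google’s Rich Results Test flags validation errors, such as a missing required field, before they cost you a snippet.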

Thin content

Content should be on every technical SEO checklist. Thin pages have little to no value. Many of these pieces were created with the sole intention of ranking for a specific keyword. 

Identifying, updating or removing these posts is recommended.

Google is known to manually apply a thin content penalty that can impact your organic traffic. If a thin page has a strong backlink portfolio, it may be worthwhile to update the content, optimize the title and meta description, and see whether it starts ranking higher in the search results.

Duplicate content

Creating informative articles and posts is a good thing, but a duplicate content issue isn’t beneficial to your site or its visitors. Simple mistakes can creep in, such as letting both “page/post” and “page/post/” serve the same content at two separate URLs.

Cleaning this content up leads to a better user experience and should be part of every site audit.
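The trailing-slash case above, for instance, is often fixed with a single rewrite rule. An Apache sketch, to be adapted to your own server:

    # .htaccess – 301 any URL ending in a slash to its slash-less twin
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-d    # leave real directories alone
    RewriteRule ^(.+)/$ /$1 [R=301,L]

A canonical tag (covered below) can achieve the same consolidation when a redirect isn’t practical.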

Hreflang

Localized searches matter, especially when key pages exist in more than one language. Hreflang tags tell search engines which language a page targets and how translated pages relate to one another. For example, if a person searches from an IP address in Spain, the page tagged hreflang=”es” is the version shown on the search engine results page (SERP).
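In practice, every language version of the page lists itself and its alternates; the URLs here are hypothetical:

    <link rel="alternate" hreflang="en" href="https://www.example.com/services/" />
    <link rel="alternate" hreflang="es" href="https://www.example.com/es/servicios/" />
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/services/" />

Each version must carry the full set, including a reference to itself; a single missing return link is enough for Google to ignore the annotations.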

Errors with these tags can mean the wrong content is shown to your audience, and the resulting higher bounce rates and lower dwell time can negatively impact your SEO success.

Canonical tags

A technical audit will look for issues with indexing and markup, and it will also look for proper canonical tag usage. This tag is designed to help correct duplicate content issues. For example, when two products on an e-commerce site are similar, the product description may be almost or exactly the same.

This can occur when new versions of a product are released.

Proper usage of this tag lets Google choose and crawl a single canonical version of the page, which can then be the focus of your SEO strategy.
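The tag itself is one line in the head of each duplicate, pointing at the preferred version; the URL is invented:

    <link rel="canonical" href="https://www.example.com/products/widget" />

It’s a hint rather than a directive: Google can pick a different canonical if signals such as internal links or sitemap entries disagree with the tag.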

404 pages

Crawl errors should be corrected to ensure search crawlers are able to index as much of your site content as possible. When 404 errors occur, important pages may go unindexed, the user experience suffers and every link pointing at the dead URL becomes a broken link.

A site audit will identify pages that produce 404 errors.

When an error is found, a simple link removal may be warranted, or if the page holds any SEO ranking or authority, we may use redirects to correct the problem.

Redirects

As technical SEO specialists, one of the biggest issues we come across is redirects. Somewhere along the line, internal linking goes awry and visitors are sent to a page that no longer exists, or never existed at all.

A quick fix for this, and one that works well in terms of SEO performance, is to create a redirect. 

Multiple errors can occur in the process:

  • Loops
  • Chains
  • Use of the wrong redirect type 

Because a variety of redirects are available, it’s important to use the right type:

  • 301 – the most common choice, which indicates that a site or page has moved permanently
  • 302 – a “temporary” redirect that indicates the original page will return
  • 307 – a “temporary” option, similar to a 302, that also guarantees the request method won’t change (a POST stays a POST)
  • 308 – the permanent counterpart to a 307, treated much like a 301 but preserving the request method
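Choosing the right type usually comes down to one line of server configuration. An Apache sketch with made-up paths:

    # Permanently moved – link equity consolidates on the new URL
    Redirect 301 /old-services /services
    # Coming back – search engines keep the original URL indexed
    Redirect 302 /shop /holiday-shop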

You can also create redirects with JavaScript, HTTP headers and meta refreshes. It’s easy to mistype a character or create a redirect loop that goes unnoticed until a technical SEO audit uncovers it.

Meta robots and X-Robots-Tag

Technical optimization has to look beyond a title tag or alt text; it also needs to consider crawling directives. One error commonly found during site audits is a meta robots tag or X-Robots-Tag header set to noindex, nofollow.

When these directives are used improperly, they impact search engine rankings because you’re explicitly telling search engines not to index a page or follow its internal links.
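The two forms are equivalent: one lives in the page’s HTML, the other in the HTTP response. Both are shown here with the problematic values:

    <meta name="robots" content="noindex, nofollow">

    X-Robots-Tag: noindex, nofollow

The header form is the only option for non-HTML files such as PDFs, which is one reason an audit checks both.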

If important pages aren’t indexed, every other SEO effort made on their behalf is wasted. Correcting these easy-to-overlook issues can help increase organic traffic.

Robots.txt

A robots.txt file works alongside meta robots directives. In this file, you can allow or disallow access to pages and directories, giving you control over which parts of the site crawlers and specific user-agents are able to reach.

You can also set a crawl-delay, which some crawlers honor, to ensure your site speed remains high and your servers don’t get bogged down while the site is being crawled. A technical SEO audit can identify common robots.txt issues and maximize indexing.
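A short example pulling these pieces together; the paths and values are placeholders:

    # robots.txt at https://www.example.com/robots.txt
    User-agent: *
    Disallow: /admin/          # keep crawlers out of back-office pages
    Crawl-delay: 10            # honored by some crawlers; Googlebot ignores it
    Sitemap: https://www.example.com/sitemap.xml

A stray “Disallow: /” here is one of the most damaging mistakes an audit can catch, since that single line blocks crawlers from the entire site.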

Are on-site issues impacting your search results? 

Contact us today to speak to a technical SEO specialist who can provide a full audit of your site.