Top Tips For Improving Your Technical SEO

Technical SEO is the real nitty-gritty part of organic search engine optimisation, and it is often the aspect that sets you apart from your competitors and gives you an edge. But it can be easy to get lost in the inner workings of your website and see little return on your investment of time and effort. It’s important, then, to prioritise your tasks and work through them methodically.

We’ve broken down Technical SEO into manageable chunks for you, so that you can get into the routine of best technical SEO practice. However, don’t forget to have a solid on-page and off-page SEO strategy in place as well, as focusing purely on technical SEO won’t net you many results either. With that said, let’s dive right in:

1. Indexing and Crawlability

In a nutshell, if your pages can’t be crawled, they can’t be indexed and if they can’t be indexed, they will never be found on Google. It’s important to remember that the Google bots are exactly that – robots. They have a sole purpose and little intelligence outside of crawling a website and reporting findings back to the mothership. You will want to give them as much assistance as possible and ensure that your website structure is easy to navigate and your content is easy to digest.

Make sure all of your essential pages can be indexed

This is a no-brainer. You want to make sure your most important pages can be crawled and indexed by Google. You can check the indexation status of your website by entering site:domain.com into a Google search, by using an external SEO tool (such as Screaming Frog), or via the Coverage section of Google Search Console.
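
Before checking indexation, it’s worth confirming that nothing in your robots.txt is accidentally blocking your essential pages. As a minimal sketch, Python’s built-in urllib.robotparser can test URLs against a robots.txt; the rules and URLs below are placeholders for illustration:

```python
# Check whether key URLs are crawlable under a robots.txt, using the
# standard library's robots.txt parser. Rules and URLs are invented
# examples; substitute your own.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /checkout/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

essential_pages = [
    "https://example.com/",
    "https://example.com/products/widget",
    "https://example.com/checkout/basket",  # blocked by the Disallow rule
]

for url in essential_pages:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url} -> {'crawlable' if allowed else 'BLOCKED'}")
```

Remember that robots.txt controls crawling, not indexing; a page you never want in the index also needs a noindex directive.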

Optimise your Crawl Budget

Google bots only spend a set amount of time on your website and crawl a set number of pages. This is known as a ‘crawl budget’. You can view this in Google Search Console under the Crawl Stats section.

We’ve come across many circumstances where important pages aren’t being crawled or indexed because crawl budget is being used up on less important pages and resources. To rectify crawl budget issues, you can start by:

  • Eliminating duplicate content and pages.
  • Restricting indexation of pages such as terms and conditions, privacy policies, and outdated promotions (in other words, pages with no SEO value).
  • Fixing broken links and redirect chains.
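
A quick first pass at the duplicate-content point above is to hash page bodies and group URLs that serve identical content. This sketch uses invented page data for illustration; in practice you’d feed in crawled HTML:

```python
# Group URLs that serve byte-identical content by hashing each body.
# The pages dict is a hypothetical example of crawl output.
import hashlib
from collections import defaultdict

pages = {
    "/shoes": "<html>Buy our shoes</html>",
    "/shoes?ref=footer": "<html>Buy our shoes</html>",  # duplicate
    "/about": "<html>About us</html>",
}

by_hash = defaultdict(list)
for url, body in pages.items():
    digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
    by_hash[digest].append(url)

duplicates = [urls for urls in by_hash.values() if len(urls) > 1]
print(duplicates)  # groups of URLs serving identical content
```

Exact-hash matching only catches perfect duplicates; near-duplicates (boilerplate variations, tracking parameters) need fuzzier comparison or canonical tags.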

You will also want to invest time and effort into building your backlink profile, as sites with stronger link profiles tend to be crawled more often.

2. Structure and Navigation

Good site navigation pleases both bots and real users, and the latter is a huge factor in succeeding in the Google search results. A clean sitemap, solid site architecture and clear pagination all help here, improving both user experience and crawlability.

Assess your Sitemap

It’s easy for a sitemap to go out of date and be left untouched for quite some time. But a sitemap is what assists search engines in viewing the structure of your site and helps them discover fresh content. You will want to keep your sitemap clean and up to date, with no broken pages, redirects or no-index URLs in it.
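
If your CMS doesn’t maintain the sitemap for you, it can be generated from a list of live pages. As a minimal sketch with Python’s standard library (URLs and dates are placeholders):

```python
# Build a minimal XML sitemap following the sitemaps.org protocol.
# The pages list is illustrative; a real sitemap should contain only
# live, indexable, non-redirecting URLs.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
pages = [
    ("https://example.com/", "2024-01-15"),
    ("https://example.com/blog/technical-seo", "2024-01-10"),
]

urlset = ET.Element("urlset", xmlns=NS)
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

sitemap_xml = ET.tostring(urlset, encoding="unicode")
print(sitemap_xml)
```

Regenerating the sitemap as part of your publishing process keeps it from drifting out of date in the first place.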

Audit your internal links

It’s vitally important to keep your click depth shallow and to give each internal link relevant, accurate anchor text that clearly indicates where it will send users. You will also want to fix any broken links and keep redirects to a minimum so that search engines can understand your website’s context easily.
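
Click depth can be measured with a breadth-first search over your internal link graph. This sketch uses a toy link graph; pages more than three clicks from the homepage are flagged for attention:

```python
# Compute each page's click depth from the homepage via BFS over a
# (hypothetical) internal link graph, then flag anything too deep.
from collections import deque

links = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/products/widget"],
    "/blog/post-1": ["/blog/post-1/archive"],
    "/blog/post-1/archive": ["/blog/post-1/archive/2019"],
    "/products/widget": [],
    "/blog/post-1/archive/2019": [],
}

depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

too_deep = [p for p, d in depth.items() if d > 3]
print(depth)
print("Deeper than 3 clicks:", too_deep)
```

Pages that never appear in the depth map at all are orphans with no internal links pointing at them, which is its own problem worth fixing.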

3. Site Speed

This is generally an umbrella term for the various technical changes that improve the overall speed of your website and how quickly assets load on the page. Unfortunately, there is no one-click solution, but site speed has become vitally important to address. There really isn’t any downside to improving it, and whilst it can get quite complicated, there are a few basics that you should definitely cover first:

Limit redirects

When you must use a redirect, use a 301 for permanent moves and a 302 for temporary ones. No URL should sit more than one redirect away from its final destination, and you should use redirects sparingly regardless.
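
Redirect chains can be found and flattened programmatically. This sketch works over a hypothetical source-to-target mapping; in practice the data would come from a crawl or your server configuration:

```python
# Detect redirect chains and compute the final destination each source
# should point to directly. The mapping below is an invented example.
redirects = {
    "/old-page": "/new-page",
    "/new-page": "/final-page",   # chain: /old-page -> /new-page -> /final-page
    "/promo-2019": "/promo",
}

def final_destination(url, redirects):
    """Follow redirects to the end, guarding against loops."""
    seen = set()
    while url in redirects and url not in seen:
        seen.add(url)
        url = redirects[url]
    return url

flattened = {src: final_destination(src, redirects) for src in redirects}
print(flattened)  # point every source straight at its final URL
```

Updating each rule to point straight at its final URL removes the extra round trips that chains cost both users and crawlers.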

Utilise Compression, Minify your Resources and Optimise your Images

Enable gzip or Brotli compression to reduce the size of the files your server sends; exactly how depends on your website and server setup. You can also minify your CSS, JavaScript and HTML, and compress and resize your images and videos. Platforms like WordPress have plenty of plugins that will do all of this for you.
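
To get a feel for what compression buys you, here is a small sketch using Python’s built-in gzip module on a stand-in payload; real savings depend on the asset, but repetitive HTML, CSS and JavaScript typically compress very well:

```python
# Measure how much gzip shrinks a text asset. The payload is a
# deliberately repetitive stand-in for real HTML.
import gzip

html = "<div class='row'>" * 500
raw = html.encode("utf-8")
compressed = gzip.compress(raw)

ratio = len(compressed) / len(raw)
print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes ({ratio:.1%})")
```

In production this is handled by the web server or CDN (for example via a gzip/Brotli setting), not in application code; the point here is just to show the order of magnitude involved.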

Reduce server response time to less than 200ms

Using HTTP/2 can give your site a performance boost, and enabling OCSP stapling can speed up your TLS handshakes. You can also improve site speed by leveraging resource hints and by supporting both IPv6 and IPv4.
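
To track the 200 ms target, it helps to look at percentiles of sampled response times rather than a single measurement. A minimal sketch with Python’s statistics module, using invented TTFB samples (in practice you would collect them with curl, your monitoring stack, or a tool like WebPageTest):

```python
# Summarise sampled time-to-first-byte values against a 200 ms budget.
# The samples are hypothetical illustration data.
import statistics

ttfb_ms = [120, 145, 180, 210, 95, 160, 175, 250, 130, 140]

median = statistics.median(ttfb_ms)
p95 = statistics.quantiles(ttfb_ms, n=20)[-1]  # approximate 95th percentile

print(f"median TTFB: {median:.0f} ms, p95: {p95:.0f} ms")
print("within 200 ms budget" if median < 200 else "over budget")
```

Watching the 95th percentile as well as the median catches the slow tail of responses that a single average would hide.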