Demystifying Technical SEO: Your Geek-Speak-Free Guide to the Unsexy Side of SEO!

Apr 15, 2023

Technical SEO is one of the major pillars of Search Engine Optimisation, and as unsexy and downright unattractive as it might sound, it really is something you need to get familiar with.

Technical SEO is all about getting your website crawled and indexed and involves getting to grips with things like site speed, mobile friendliness, response codes, site security, canonical tags, and structured data. 

But these terms can sound confusing and just too technical to contend with, so today, I want to break down some of the most important technical SEO jargon so you can understand what’s going on and why it’s essential for you to get it working.

Crawling and Indexing

So let’s start off with crawling and indexing. Crawling and indexing are the two primary functions of search engines. Crawling refers to the process of search engines scanning a website’s pages and following its links; indexing is what happens when they store what they’ve found in their database.

Crawling

So search engines like Google use web crawlers, or bots, to analyse the content of your website. 

Google sends its Googlebot out into the ether, and when it comes across your website, it heads on in and has a browse around. Googlebot looks at all your website’s pages, links and metadata (your page titles, descriptions, keywords and so on) and builds up a picture of your website and what it’s all about.

It’s really important to ensure your website is easy for the crawlers to navigate. You can do this by, you guessed it, getting your technical SEO in good shape! 

Amongst other things, you need to ensure your site has a nice, straightforward structure, isn’t filled with confusing duplicate content, has no broken links and has a sitemap.

But more on that later.

Indexing

Once your website has been crawled, the bots then take their haul of information back to their search engine’s HQ, where the information is indexed.

The indexing process involves taking the information the crawlers have found about your website and organising and storing it in their vast database.

When someone comes searching for an answer to their most burning question, Google goes to its index, pulls out the content that best matches what it’s being asked, and serves it up in the search results.

But not all pages make it into this index. And if your pages aren’t making it into Google’s index, they’ll never be served up in the search results.

So what stops pages from being indexed? 

Well, you can stop crawlers from visiting pages you don’t want showing up in the search engines by editing your robots.txt file (and if a page absolutely must stay out of the index, a noindex meta tag is the more reliable tool).

But your pages can also be blocked from being indexed due to errors, issues with canonical tags, duplicate content, and even insufficient content.

Again, we’ll talk more about these in a minute.

XML Sitemap

I mentioned that it’s important to have a sitemap when I was talking about getting your website crawled, but what is a sitemap, and how do you get one?

Well, a sitemap is a file that lists all the important pages on your website, and it helps the crawlers work out which pages on your website should be included in their index. 

While we’re in technical SEO mode, we’re talking about XML sitemaps. XML is a markup language the crawlers understand, so it’s not particularly pretty to look at, but the bots get it.

An XML sitemap is written using XML code, set out in a specific format, and lists all the URLs on your website that you want to be indexed.

Your sitemap is usually generated automatically by your website platform or SEO plugin, and it’s really easy to find.

Just type your domain followed by /sitemap.xml into your browser’s address bar, e.g. https://domain.com/sitemap.xml, and your sitemap will pop up.
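
To give you an idea of what the bots actually read, here’s a minimal sketch of an XML sitemap with a single made-up URL:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want indexed (this URL is an example) -->
      <url>
        <loc>https://domain.com/about</loc>
        <lastmod>2023-04-15</lastmod>
      </url>
    </urlset>

A real sitemap just repeats that <url> block for every page you want in the index.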

Robots.txt

I’ve also mentioned robots.txt files, but again, you might be wondering what they are and how you find yours.

So, the first thing a crawler looks for when it lands on your site is your robots.txt file. It’s the rule book for your site as far as crawling goes.

This rule book tells the bots whether they can crawl your site or not, and which parts they should stay out of, using “Allow” and “Disallow” instructions.

It’s great for making sure pages that aren’t meant for public view don’t get crawled, which also saves your crawl budget.

You can find your robots.txt file by entering your website followed by /robots.txt, so: https://domain.com/robots.txt
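
And here’s a hedged sketch of what a simple robots.txt might look like; the /private-stuff/ folder is made up for illustration:

    # Rules that apply to every crawler
    User-agent: *

    # Keep bots out of this example folder
    Disallow: /private-stuff/

    # Everything else is fair game
    Allow: /

    # It's also good practice to point crawlers at your sitemap
    Sitemap: https://domain.com/sitemap.xml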

Site Speed

Site speed is a crucial factor in technical SEO. 

Google says that site speed is one of the signals used by its algorithm to rank pages. A slow-loading website can negatively affect user experience, leading to higher bounce rates and lower conversion rates.

If you want to test your site speed, you can use Google’s PageSpeed Insights tool.

You can improve your site speed in several ways, such as optimising your images, using a content delivery network (CDN), minifying HTML, CSS, and JavaScript, and leveraging browser caching. 

If all that sounds like a load of gobbledygook, make a start at improving your site speed by optimising your images.
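
If you’re comfortable editing your page templates, a quick image win is serving compressed files at their real dimensions and lazy-loading anything below the fold. A minimal sketch (hero.webp is an example filename):

    <!-- A compressed image with fixed dimensions that only loads
         when the visitor scrolls it into view -->
    <img src="hero.webp" width="800" height="450" loading="lazy"
         alt="A description of the image">

Setting width and height stops the page jumping about while it loads, and loading="lazy" holds the image back until it’s actually needed.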

Mobile Friendliness

Mobile friendliness is another important factor in technical SEO. 

With almost 60% of online searches done on our mobiles these days, it’s essential our websites perform well when our visitors are on their phones. 

In fact, Google has a mobile-first indexing policy, which means it uses the mobile version of your website for indexing and ranking.

To make sure your website is mobile-friendly, you need to use responsive web design, which automatically adjusts your website's layout to fit different screen sizes. 

You can test to see if your website is mobile-friendly by using Google's Mobile-Friendly Test tool.
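
If you’re curious what “responsive” actually involves, it mostly comes down to a viewport meta tag plus CSS media queries. Here’s a minimal sketch; the 600px breakpoint and .content class are just examples:

    <!-- Tell mobile browsers to use the device's real screen width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* Collapse to a single full-width column on small screens */
      @media (max-width: 600px) {
        .content { width: 100%; }
      }
    </style>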

Response Codes

Response codes are messages from servers that show the status of a webpage. Basically, they show you whether or not your page is working properly.

Screaming Frog is a great place to see the response codes for each of your website’s pages.
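
If you fancy a quick spot check without any tools, a few lines of Python will fetch the response code for a single page. The URL is an example, and note that redirects are followed automatically, so a 301 will report its destination’s final code:

    import urllib.error
    import urllib.request

    # Ask for the page's headers only; we just want the status code
    req = urllib.request.Request("https://domain.com/some-page", method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            print(resp.status)   # 200 means the page is fit and healthy
    except urllib.error.HTTPError as err:
        print(err.code)          # e.g. 404 for a missing page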

200 Response Code

If your pages show a 200 response code, they are all fit and healthy and need nothing from you. You want as many 200 response codes for your site as possible.

301 Response Code

301s are good too. 

You’ll see a 301 code when a page on your website has been moved permanently and has a 301 redirect status.

If you ever want to replace a URL with another URL, you’ll want to set up a 301 redirect. This tells the bots the move is permanent, so they’ll swap the old URL out of their index and crawl and index the new one in its place.

The really important thing about setting up a permanent redirect, i.e. a 301 redirect, is that any link juice you had on the old URL will get passed to the new one.  
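
How you actually set one up depends on your platform; most website builders and SEO plugins have a redirects screen. If your site runs on an Apache server, though, it can be a one-liner in your .htaccess file (both paths below are examples):

    # Permanently send visitors and bots from the old URL to the new one
    Redirect 301 /old-page/ https://domain.com/new-page/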

302 Response Code

302s, on the other hand, aren’t so good for permanent moves. Mostly because your link juice may not get passed on.

They are temporary redirects, and the crawlers don’t particularly like them.

If you’ve redirected a page or moved things around, you really should stick with 301s.

404 Response Code

We’ve all seen a 404 page. They show up when the link you click takes you to a page that’s not there any more or the link to that page is broken.

You’ll need to fix those broken links asap, as too many 404s can damage your rankings.

It’s also worthwhile setting up a custom 404 page, so if visitors do hit a brick wall when trying to find content on your site, they can click a button to contact you or go back to your homepage and try again.

Site Security

Another critical thing for technical SEO is your site security.

It’s really important these days that your site is secure, and you can do that by making sure your site uses the encrypted version of HTTP.

Basically, this means your website’s addresses need to start with https:// rather than http://. Google treats HTTPS as a ranking signal, and browsers warn visitors away from sites that aren’t secure.

You get your website to use HTTPS by making sure you have an SSL certificate. Most domains come with an SSL certificate these days, but you can check to make sure yours is all up to date in a couple of ways.

First, you can click on the little padlock to the left of your domain in the browser’s address bar and check that your certificate is valid.

Or you could use an SSL certificate checker.

Either way, make sure yours is all up to date.

Canonical Tags

You’re nearly there! Just a couple more things to cover.

First, let’s look at canonical tags.

Canonical tags are tiny bits of HTML code that tell the crawlers which version of a webpage is the one they should take back to HQ to index.

Canonical tags are super helpful in preventing duplicate content issues from negatively impacting your rankings.

If you have multiple pages with similar content, use a canonical tag to indicate which one you want to be crawled and indexed. 

Canonical tags are super important for e-commerce sites, where product pages often vary only very slightly (think the same t-shirt in five colours, each with its own URL).
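
The tag itself is a single line in the <head> of each duplicate or variant page. A minimal sketch, with a made-up URL:

    <!-- Point the crawlers at the preferred version of this page -->
    <link rel="canonical" href="https://domain.com/t-shirt/">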

Structured Data

And finally, structured data.

Structured data is a way of setting out your website so it’s easy for the crawlers to understand.

You can describe your site using a special vocabulary called Schema.org that the big search engines understand; it basically translates the content of your site into code the search engines can easily process.

Making your content easy for the search engines to understand makes it more likely they’ll show it, and it helps you qualify for something called “rich snippets”.

You’ve seen them before; they’re the search results with extras attached, like star ratings, recipe photos, prices and stock availability.

Structured data is really useful if you’ve got lots of recipes, reviews, product descriptions and videos on your website. 
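
To make that concrete, here’s a hedged sketch of Schema.org markup for a product, written in the JSON-LD format Google recommends. Every name and value below is made up:

    <!-- Example Product markup; this sits anywhere in the page's HTML -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example T-Shirt",
      "image": "https://domain.com/t-shirt.jpg",
      "offers": {
        "@type": "Offer",
        "price": "19.99",
        "priceCurrency": "GBP"
      }
    }
    </script>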

If you want to explore it more, there’ll most probably be a plugin or an integration you can use on your website to make things easier.

Demystifying Technical SEO - You’re All Done!

So, there you go! You’ve been on a tour of technical SEO, and you’ve survived.

Well done!

The next thing for you to do is to head over to an SEO tool like SEMrush or Ubersuggest and do a technical SEO audit on your site.

You’ll get a list of issues you can fix up on your website to get your technical SEO working well so your website gets crawled and indexed and you show up in the search results!

If you have any questions or need any help, please drop me a comment or ping me an email.
