What Is Technical SEO? The Ultimate Guide to Maintaining Your Site's Health

technical seo picture
Just as every human requires regular check-ups and medication to maintain good health and carry out daily activities, a website needs regular health check-ups to perform well on search engines. This is where technical SEO comes into play.

Technical SEO involves optimizing various technical elements of a website, including site structure, page speed, mobile-friendliness, crawling and indexing, schema markup, and more, to improve its search engine visibility and ranking.

Why Technical Inspection?

When a website is newly created with a correct structure, it often starts without any defects. Over time, however, factors such as oversized images, broken links, and other site elements can gradually diminish its effectiveness, leading to sluggish loading times, content shifting, and a poorer user experience. It is therefore crucial for website owners to regularly monitor and maintain their sites: address issues promptly, optimize content, and keep the site's structure up to date to prevent deterioration over time. By staying vigilant and proactive with maintenance, website owners can mitigate potential problems and maintain a strong, reliable online presence.

Please take a moment to review these two screenshots from an audit I conducted using PageSpeed Insights to evaluate the performance of this website on both mobile and desktop views.

website performance on PageSpeed Insights

website performance on PageSpeed Insights
Based on these images, what performance issues did you notice?

Those are the four major components that indicate whether your site is doing well or not.

Although the website is partially satisfactory, there is still room for improvement. All pages of the site need to perform well with an audit score of at least 80%.

In this article, we will explain how to address these issues and optimize websites for better performance. Sit back, relax, and enjoy.

As a website or online store owner, you don't want your site to go unnoticed on search engines like Google. To ensure proper crawling, indexing, and understanding of your site by search engines, you need to maintain its health according to acceptable standards. 

This article aims to explain the importance of technical SEO, the necessary steps to make your site user-friendly and crawlable, and the tools you can use to achieve these goals.

The article is a bit lengthy, so I have used a table of contents to categorize each section. You can use it to jump to whichever part you want to know more about. If you are new to all of this, I recommend reading through it carefully. Enjoy reading!

Table of Contents

Importance of technical SEO

1. The ultimate goal of technical SEO is to enhance the user experience and search engine visibility of a website, resulting in increased traffic, engagement, and conversion rates. 

2. By optimizing the technical aspects of your site, you can help search engines crawl and index your pages more effectively, leading to higher rankings and improved organic traffic. 

3. Technical SEO also helps businesses achieve their digital marketing goals by attracting more organic traffic and improving the user experience.

Best Practices for Technical SEO Optimization

Optimize page load speed and site performance

page speed performance


If websites have slow-loading pages, it can negatively impact user engagement, ultimately leading to lower search engine rankings. Google uses user experience as a ranking factor, so if your website has a high bounce rate, it may be assumed to be unhelpful to users, which could result in a decline in page rank.

Tips to improve page load speed and achieve optimal site performance

1. Optimize images: Serve images that are appropriately sized to save cellular data and improve load time. Ensure that the image you serve on a page is not larger than the version that is rendered on the user's screen; anything larger wastes bandwidth and slows down your page. Compress and resize images to reduce their file size without compromising quality.

Another thing to note is that images served in next-generation formats are likely to perform better. Image formats like WebP and AVIF often provide better compression than PNG or JPEG, which means faster downloads and less data consumption.

recommended way to add image on your website
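
To illustrate, here is a minimal sketch of how an image could be served in a next-generation format with explicit dimensions and responsive sizes (the file names and pixel sizes are hypothetical):

  <picture>
    <!-- Modern formats first; the browser picks the first one it supports -->
    <source type="image/avif" srcset="product-480.avif 480w, product-960.avif 960w">
    <source type="image/webp" srcset="product-480.webp 480w, product-960.webp 960w">
    <!-- JPEG fallback with explicit width/height to avoid layout shifts -->
    <img src="product-960.jpg"
         srcset="product-480.jpg 480w, product-960.jpg 960w"
         sizes="(max-width: 600px) 480px, 960px"
         width="960" height="640"
         loading="lazy"
         alt="Red running shoe, side view">
  </picture>

The width and height attributes let the browser reserve space before the image loads, which also helps prevent the content shifting mentioned earlier.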



2. Optimize and remove unused code: Unused CSS and JavaScript can slow down your page load speed. By removing unused code, you can improve your website's Core Web Vitals, such as Largest Contentful Paint (LCP). Unused code can introduce unnecessary delays in resource load time, memory usage, and main-thread activity, all of which contribute to poor page responsiveness. DevTools makes it easy to see the size of all network requests.

To analyze the bundle:

  • Press Control+Shift+J (or Command+Option+J on Mac) to open DevTools.
  • Click the Network tab.
  • Select the Disable cache checkbox.
  • Reload the page.
  • After the reload, click the Coverage tab to see which code is being used and which is not.

For example, take a look at the image below.

how to check unused code in Google DevTools
From this image, it is clear that the unused code is far larger than the code actually being used, which may make this page slow to load.

Need help removing unused code? Check this article: How to remove unused code, by the web.dev community.
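
Once you know which scripts are mostly unused on first load, one common mitigation (among others covered in that article) is to defer non-critical JavaScript or load it only when it is actually needed. A minimal sketch, assuming a hypothetical chat widget script:

  <!-- Load non-critical scripts without blocking rendering -->
  <script src="/js/analytics.js" defer></script>

  <!-- Load a heavy, rarely used script only when the user asks for it -->
  <button id="open-chat">Chat with us</button>
  <script>
    document.getElementById('open-chat').addEventListener('click', () => {
      // chat-widget.js is a hypothetical module that exports openChat();
      // the dynamic import fetches it only on the first click
      import('/js/chat-widget.js').then((mod) => mod.openChat());
    });
  </script>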

3. Reduce server response times (TTFB): Slow server response times can have negative effects on your website's performance and online presence. This can result in frustrated visitors and a higher bounce rate.

To ensure that your website is performing optimally, it's important to monitor server response times. Ideally, your server should respond within 600ms (0.6s) to provide a smooth user experience. Anything longer than this may lead to slower page loading times and a negative impact on your website's performance.
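
If you want to check your own TTFB quickly, you can paste a small snippet like this into the DevTools console (a sketch using the standard Navigation Timing API):

  // responseStart is the time in milliseconds from the start of navigation
  // until the first byte of the response arrives, i.e. the page's TTFB
  const [nav] = performance.getEntriesByType('navigation');
  console.log('TTFB:', Math.round(nav.responseStart), 'ms');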

Possible fixes for a long server response time

  • Optimize the server's application logic to prepare pages faster.
  • Upgrade your hosting plan or server to a faster one if necessary.
  • Upgrade your server hardware to have more memory or CPU.
  • Block crawling of duplicate content on your site, or of unimportant resources (such as small, frequently used graphics like icons or logos) that might overload your server with requests.

Use structured data markup and navigation on your site

example of site structure data

Structured data is a foundational way of informing search engines about how your website and its content are organized. Google and other search engines use it to better understand and represent your pages. Without structured data (or when it is not used correctly), Google might find it difficult to interpret your website accurately.

What then is structured data?

Structured data is information that has been arranged in a specific format, which facilitates easy searching, processing, and analysis.

The way your website is structured is very important. Please take a moment to review the recommended article structured data format provided by Google below.

Google structured data format


What did you notice? Isn't the page properly organized? Yes, it is.

Adding structured data can enable search results that are more engaging to users, known as rich results, and might encourage them to interact more with your website.


For example, an eCommerce website that is structured correctly will display like this when customers enter a search query on Google Search:

how breadcrumb work on google search


The rich result helps them discover relevant information about the page, like breadcrumbs, price, ratings, and so on, which helps them quickly find what they want.
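
As an illustration, here is a minimal sketch of the kind of Product structured data (JSON-LD) that can power a rich result like the one above; the product name, price, and rating values are made up:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Running Shoe",
    "image": "https://www.example.com/images/shoe.jpg",
    "aggregateRating": {
      "@type": "AggregateRating",
      "ratingValue": "4.5",
      "reviewCount": "120"
    },
    "offers": {
      "@type": "Offer",
      "price": "59.99",
      "priceCurrency": "USD",
      "availability": "https://schema.org/InStock"
    }
  }
  </script>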

Make use of breadcrumbs 

how breadcrumbs work in websites


Breadcrumbs not only assist users in navigating a website by displaying their location, but they also provide internal linking to categories and subpages on the site. This internal linking helps reinforce the site's architecture and improve its indexability by search engines.

example of how google uses URL as breadcrumbs in searches


Additionally, Google has recently introduced breadcrumb-style navigation in search engine results pages (SERPs). This means that the URL structure of a website can be displayed in breadcrumb format within the search results, making it easier for users to understand the context of the page and how it fits into the site's overall structure. This can lead to higher click-through rates and better user engagement.
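
Breadcrumbs can also be described explicitly with structured data so that search engines understand the trail. A minimal BreadcrumbList sketch with hypothetical pages:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
      { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://www.example.com/" },
      { "@type": "ListItem", "position": 2, "name": "Shoes", "item": "https://www.example.com/shoes/" },
      { "@type": "ListItem", "position": 3, "name": "Running Shoes" }
    ]
  }
  </script>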

Adding sitemaps to Search Console

Although search crawlers can easily crawl web pages that are internally linked together, it is recommended to have a site map for your website. 

A sitemap is a file that provides information about the pages, videos, and other files on your site, as well as the relationships between them. Site maps help search engines like Google understand the important pages you would like them to crawl. They are especially useful for websites that have many pages, such as shopping websites or newly created websites.

According to Google Developers Docs, a website with 500 or fewer pages may not require a sitemap, especially if it doesn't have many media files like images, videos, or music. A site that is properly linked together may also not need a sitemap, as Google can easily crawl it. However, if you feel that some pages on your site are not being reached by Google crawlers, you may add a sitemap to your site.

Creating a sitemap

There are many free sitemap generators you can use to create sitemaps; you may need to do a Google search to find the one that works best for you. Ensure that you choose an XML sitemap generator because that's what Google accepts. Also, choose one that can generate all the links on your site.

For developers who may want to create it from scratch, please, refer to this Google guide on how to build and submit a sitemap.
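
For reference, a basic XML sitemap is simply a list of URLs in the standard sitemaps.org format. A minimal sketch with hypothetical URLs and dates:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <url>
      <loc>https://www.example.com/</loc>
      <lastmod>2024-01-15</lastmod>
    </url>
    <url>
      <loc>https://www.example.com/blog/technical-seo-guide/</loc>
      <lastmod>2024-01-10</lastmod>
    </url>
  </urlset>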

Submitting a sitemap

Once you have created your sitemap, go to Google Search Console, select a property, and click on Sitemaps in the sidebar.

(If you use other search engines like Microsoft Bing, you may need to visit their documentation to learn how to add a sitemap.)
locating sitemap on the Google search console

Then, copy and paste your sitemap URL into the input field.

submitting sitemap on the Google search console

Ensure that you receive a success message, because only a successfully submitted sitemap can be crawled by search engines.

  It is recommended to have at least 10-15 pages on the sitemap you submitted for it to be crawled. 

 

Help Google crawl and index your site pages


In essence, Googlebot is an automated crawler capable of crawling every page on the web without any manual instruction. However, there may be instances where it cannot crawl your pages because you have directed it not to. Google only indexes text, images, and videos that Googlebot is allowed to crawl.

Factors that can prevent Googlebot from crawling your site pages are:
  • Crawling disallowed with robots.txt
  • A noindex rule on your content
  • Authenticated/password-protected pages

Robots.txt file

The robots.txt file helps Google understand which pages you want it to crawl and which ones you do not. It has two major directives: Allow and Disallow.

Presented below is a straightforward robots.txt file containing two rules:

User-agent: Googlebot
Disallow: /Googlebot/

User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml

The Disallow rule above tells the specified user agent, which here is Googlebot, not to crawl the listed pages,

Disallow googlebot not to crawl your website


while the Allow rule beneath it permits all other crawlers to access the whole site.

Allow googlebot to crawl your website

Notice the asterisk before the Allow rule. It is a wildcard that stands for every crawler, and together with Allow: / it tells search engines such as Google that they may crawl everything on your website.

Tell google to crawl everything on your website

If there are some pages you do not want Googlebot to crawl, you will need to specify them, for example, like this:

inform search engine not to crawl some files on your websites
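
In text form, a robots.txt like the following allows crawling of the whole site except a couple of private sections (the paths here are hypothetical):

  User-agent: *
  Disallow: /admin/
  Disallow: /cart/
  Allow: /

  Sitemap: https://www.example.com/sitemap.xml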

Locate, edit, and test your robots.txt file

A robots.txt file lives at the root of your site. So, for example, for the site http://www.example.com, the robots.txt file lives at http://www.example.com/robots.txt.

If you are utilizing a Content Management System (CMS) such as Blogger, WordPress, or Wix, you may not be able to directly edit the robots.txt file. Instead, your CMS might provide a search settings page or other means to inform search engines whether to crawl your page or not.

For example, Blogger users can find theirs in settings: log in to your account → go to Settings → Crawlers and indexing → Custom robots.txt.

To see if your robots.txt file is working as intended, use Google's robots.txt testing tool to check it out. If you need help creating a fresh robots.txt file, kindly refer to Google's documentation.

Noindex rule

If your website or any of its pages has a noindex tag, Google will not index that site or page. The noindex rule instructs Google not to index certain pages.
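
The rule is usually added as a meta tag in the page's <head> section (it can also be sent as an X-Robots-Tag HTTP header). A minimal sketch:

  <head>
    <!-- Tell search engines not to index this page -->
    <meta name="robots" content="noindex">
  </head>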

Password-protected pages

If you have certain pages that require authentication before they can be viewed, Googlebot will not be able to access them unless you move them out of the protected area.

Thin and duplicate content issues

Google and other search engines consider content with little or no value, such as thin or duplicated content, to be spam, as per their content policies.

Having pages with identical or nearly identical content can confuse search engines and lower the visibility of a website's pages in search results.

If a crawler detects that the same content appears at more than one URL, the copies will be marked as duplicate content, and those pages may not be indexed.

Finding and fixing duplicated content

Log in to your Search Console account and select a property. On the sidebar, click Pages under the Indexing section.

how to check indexing page on google

Scroll down a little in the Page indexing report and locate the duplicate content entries beneath "Why pages aren't indexed".

search console indexing page

Here you will notice that there is a whopping amount of duplicate content on this site.

If we open it and inspect the URL, this is what we will get

image showing URL is not on google


But if we do a live test with one of the links, our result will change:

image showing URL is available to google


As you can see, the URL is available to Google. What should you do next? It depends. If the page is already indexed, requesting indexing again is not necessary. If, however, you do not want the duplicate URL indexed, you will need to use the noindex rule to keep Google from indexing it.

Solving duplicated page issues

Duplicate pages can occur unintentionally, such as with a blog post that has a paginated comment section. Each page of the comment section may include the original post, resulting in unintentional duplication. If these pages are indexed by Google, it can lead to duplicate content issues.

Duplicate without user-selected canonical

In that case, you will have to add a noindex tag to every single one of those pages.
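
Because this report is specifically about a missing user-selected canonical, you can also tell Google which URL you consider the original by adding a canonical link to the duplicate pages. A minimal sketch, assuming hypothetical paginated comment URLs that should point back to the main post:

  <!-- Placed in the <head> of /blog/my-post/comment-page-2/ and similar duplicates -->
  <link rel="canonical" href="https://www.example.com/blog/my-post/">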

Write crawlable links

When creating links, it is important to follow the proper guidelines to ensure they are created correctly. Search engines follow links to understand the content on your website, so if a link is not properly written, it may be difficult for search engines like Google to crawl and index your site's pages.

Use the <a> tag and the href attribute

Google only crawls links that use the <a> tag with an href attribute. This means that if a link is created using a different HTML tag or does not include the href attribute, it may not be crawled by Google.

proper way to use link anchor tag and its attribute
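
To illustrate, here is a sketch of a link Google can follow versus one it may not be able to:

  <!-- Crawlable: an <a> tag with an href attribute -->
  <a href="https://www.example.com/services/">Our services</a>

  <!-- Not reliably crawlable: no <a> tag, no href, navigation handled only in JavaScript -->
  <span onclick="window.location='/services/'">Our services</span>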

Add anchor text 

Instead of leaving your anchor tag empty, it is recommended to add descriptive text between the opening and closing tags to tell both users and search engines about the page being linked to. This helps search engines better understand the context and relevance of the link, which can improve the page's ranking in search results.

link anchor text placement

Keep the text relevant and concise

Make sure that you don't add excessively long text to your anchor tags, as this could be viewed as link spamming by search engines. It's important to keep the text relevant and concise to ensure that it accurately reflects the content on the linked page. Link spamming can negatively impact a website's search engine rankings and should be avoided.

how to add link descriptive text
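
Putting the last three tips together, here is a quick sketch of weak versus better anchor text (the URLs and wording are made up):

  <!-- Weak: empty or generic anchor text -->
  <a href="/blog/technical-seo-guide/"></a>
  <a href="/blog/technical-seo-guide/">Click here</a>

  <!-- Better: short, descriptive text that matches the linked page -->
  <a href="/blog/technical-seo-guide/">technical SEO guide</a>

  <!-- Avoid: excessively long, keyword-stuffed anchor text -->
  <a href="/blog/technical-seo-guide/">best technical SEO guide SEO tips SEO checklist free SEO audit tool</a>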

Fix or remove broken links

Broken links refer to links on a website that are no longer functional or lead to dead pages, making them inaccessible to both users and search engines. Broken links can negatively impact a website's user experience and search engine rankings. To avoid trust issues with your users and search engines, it is important to regularly check for and fix broken links on your website, or remove them if they cannot be fixed. This helps ensure that your website's content remains accessible and relevant, and can also improve its visibility in search results.

Finding broken links

To find broken links, you will need to use a broken link checker tool. There are several such tools available online. For this test, we will be using the Ahrefs tool.

We are going to enter our site URL to see which links are broken

Ahrefs broken link checker tool

Here are the results. Let's explain.

Ahrefs broken link checker

The 'can't resolve host' error message means that the page you are trying to link to is no longer hosted or has been moved, while the 404 error code indicates the content on that page is no longer available.

You should try to rewrite the URL or remove it permanently.

HTTPS vs. HTTP request

Which should you use: HTTPS or HTTP?

Google values sites that use HTTPS more than those that use plain HTTP. To avoid trust issues with both your users and search engines, it is recommended that you always use HTTPS instead of HTTP.

Let me show you some screenshots of how both work.

When you serve a page over HTTP in a browser like Chrome, it will show a "Not Secure" label beside your website's URL.

image showing unsecure URL

Sites that are flagged as not secure are not trusted by search engines, which could downgrade your search ranking.

Just imagine a user who wants to visit your website encounters a warning message like the one in the screenshot below. Do you think the user will proceed to browse your site or search for another?


image showing unsecure URL on the user browser


A secure connection always shows a padlock icon next to your site's URL.

how to know a secure URL

If your website uses HTTP, it is recommended that you migrate to HTTPS as soon as possible. Doing so will help ensure that your website remains secure and protected. Check how to migrate from HTTP to HTTPS in this article: how to secure your website with the HTTPS protocol.
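
As part of a migration, you typically also redirect all HTTP traffic to HTTPS. A common sketch for sites running on an Apache server (this assumes Apache with mod_rewrite enabled; other servers and hosts have their own settings):

  # .htaccess - permanently redirect every HTTP request to HTTPS
  RewriteEngine On
  RewriteCond %{HTTPS} off
  RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]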

Wrapping up

We are glad that you have read this article. As you may already know, technical SEO plays a crucial role in determining search rankings. To achieve the top spot on Google search, it is recommended that you follow all the best practices outlined in this article.

If I may ask, which of the new things you have learned here do you plan to try first? 

If you require assistance, please do not hesitate to reach out to us via the Contact Us form. 

Thank you for reading and we look forward to seeing you in our next article.
