Technical SEO 101
Technical SEO is a fundamental component of any good ecommerce marketing strategy. While conducting keyword research and creating targeted blog posts are important tasks, ultimately, great content won’t get you far if Google and other search engines can’t find, crawl, and index your site. That’s where technical SEO comes in.
What is Technical SEO?
Technical SEO refers to optimizations to a website and its server that help search engines crawl and index pages more effectively. The result is improvements in organic rankings, clicks, traffic, and, for ecommerce brands, sales and revenue. These optimizations happen largely behind the scenes, but they fuel your site's organic growth.
Any good technical SEO strategy begins with identifying your site's existing weak spots and opportunities for improvement, and that starts with a technical SEO audit by an experienced SEO agency.
Technical SEO vs. On-Page SEO vs. Off-Page SEO
The world of SEO and the tasks related to optimization can be broken down into three main parts: technical SEO, on-page SEO, and off-page SEO. All three are important for a strong and effective SEO strategy.
Let's briefly cover what each means.
Technical SEO
Technical SEO is largely within your control as a site owner. That said, certain website providers may limit access to some technical SEO optimizations, making it more difficult to customize your site as needed for optimal SEO.
It's essential to understand the capabilities and limitations of your site's setup so you can make better decisions on which hosting providers and platforms work best for your site's goals. Even with access to many of the areas you would need to optimize for technical SEO, it's still pretty difficult to master without experience, which is why we always recommend partnering with qualified technical SEO experts.
On-Page SEO
On-page SEO refers to the content you publish on your site and the signals it sends to search engines about what each page is about. It includes your headings, body content, images, alt text, meta tags, URLs, and internal linking. You have the most control over this facet of SEO because it is literally everything you create on your site. When you publish a blog post or edit the content on a product page, that is on-page SEO.
Off-Page SEO
Off-page SEO, or off-site SEO, refers to how authoritative your content is compared to other pages. It is essentially how well-known your content is around the web, and it is generally measured with backlinks. Think of this as a vote of confidence from the rest of the internet about the quality and usefulness of your site and its pages.
Note that we said quality and not quantity! Google rewards sites with quality backlinks from reputable and relevant sources. A good off-site strategy involves leveraging existing partnerships and performing outreach within your niche's online community. Do this well and you'll gain authority. However, spamming links to your site may result in penalties and loss of rankings. Proceed with caution.
Why is Technical SEO Important?
Technical SEO directly impacts your site's crawlability, indexability, accessibility, rankability, and clickability. In other words, it is extremely important for brands looking to maximize their organic traffic, sales, and revenue.
If your website is lacking in its technical SEO foundations, it will be more prone to the negative effects of frequent Google algorithm updates, and may not communicate with search engines properly. Ultimately, the downside of ignoring technical SEO is not being found on the internet — and for brands that rely on internet sales, that can be detrimental to success.
In short, technical SEO helps search engines find and understand your web pages more easily. So, how exactly does it all work? Below, we'll break down the fundamentals of technical SEO, including page speed, sitemaps, canonical URLs, schema markup, duplicate content, and much more.
Understanding Technical SEO Fundamentals
In order of importance, the technical SEO foundation consists of:
Crawlability - The ability of search engine spiders to find and read your site's content.
Indexability - Whether search engines can add the pages they crawl to their index, and how easily they understand what those pages are about.
Accessibility - How reliably your site is delivered, including page speed, crawl depth, and server performance.
Rankability - How well-optimized your on-page content is.
Clickability - How compelling your page looks in search results and how well it matches the searcher's intent.
Crawlability
Secure Your Site with an SSL
SSL, or Secure Sockets Layer (since succeeded by TLS, though the old name stuck), is a technology that keeps internet connections secure and protects any sensitive information, such as contact or payment details. When you see a domain beginning with “https://” rather than “http://” you know that site has SSL set up.
Information shared on websites with SSL certificates is encrypted in transit, making it much harder to intercept. Not only do users prefer more secure sites, but Google and other search engines do too. In 2014, Google officially announced that HTTPS was a ranking signal for websites.
If your site doesn't use SSL, setting it up and migrating all pages from http to https will take a lot of work — but the payoff will be well worth it.
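If you manage your own server, the heart of that migration is a site-wide permanent redirect from HTTP to HTTPS. Here's a minimal sketch using nginx; the domain and certificate paths are placeholders, and managed ecommerce platforms typically handle this step for you:

```nginx
# Redirect all HTTP traffic to HTTPS with a permanent (301) redirect
server {
    listen 80;
    server_name yourwebsite.com www.yourwebsite.com;
    return 301 https://yourwebsite.com$request_uri;
}

server {
    listen 443 ssl;
    server_name yourwebsite.com;
    # Certificate paths are placeholders; point these at your own files
    ssl_certificate     /etc/ssl/certs/yourwebsite.crt;
    ssl_certificate_key /etc/ssl/private/yourwebsite.key;
    # ... rest of your site configuration
}
```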
Optimize Your Page Speed
Don't underestimate the impact of page speed on SEO. Users don't want to wait around more than a few seconds for a page to load; when it takes longer than that, bounce rates increase dramatically. Site speed is also a ranking factor for search engines, so a slow-loading site could hurt your organic traffic. Every second counts!
There is a lot that can be done behind the scenes to optimize page speed, such as compressing images, using a CDN (content delivery network), cleaning up code, auditing plugins, and loading scripts asynchronously. Again, optimizing page speed isn't something that happens in a single day, but the effort can significantly improve your technical SEO performance.
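As one quick example of asynchronous loading, standard HTML attributes can stop scripts from blocking the page render (the file names here are hypothetical):

```html
<!-- async: download in parallel, execute as soon as the script arrives -->
<script async src="/js/analytics.js"></script>

<!-- defer: download in parallel, execute only after the HTML is parsed -->
<script defer src="/js/main.js"></script>
```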
Create an XML Sitemap
An XML sitemap functions as a map of your website for search engine crawlers, making it easier to understand the content of your site. This file is especially important for large websites with many web pages. It's important to keep your XML sitemap up to date, to communicate any changes to your site to the search engines.
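A minimal XML sitemap looks like the sketch below (the URLs and dates are placeholders). Most platforms generate this file automatically; either way, it should live at a predictable path such as /sitemap.xml and be submitted through Google Search Console:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourwebsite.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://yourwebsite.com/products/example-product</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```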
Canonicalize Similar Content
Similar or duplicate content can hurt your crawlability and indexability by confusing bots about which URL is the original, "correct," or most important one. If search engines end up indexing the wrong page, it can hurt the other page's rankings and traffic.
Canonical tags are an important tool in search engine optimization because they let you tell search engines which version of a webpage you want to appear in search results.
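A canonical tag is a single line in the page's head. In this hypothetical example, a product page reachable at several parameterized URLs points search engines at one preferred version:

```html
<!-- Placed in the <head> of every duplicate or parameterized variant -->
<link rel="canonical" href="https://yourwebsite.com/products/example-product" />
```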
Optimize Site Architecture
The way your website is structured and organized can have a significant impact on organic rankings. Why? Sites with optimized site architecture are easier for search engines to find, crawl, and understand. When laid out well, your site structure also tells search engines which pages are most important.
As a general rule, important pages should be linked from the home page and throughout the site. Additionally, related pages should be grouped together. Updating your site structure can be a big undertaking if you have a lot of pages, but it will ultimately benefit your site's SEO health.
Follow a Consistent URL Structure
As the name implies, "URL structure" refers to how your website's URLs are structured. Well-optimized URLs should feature that web page's target keyword, should be specific and descriptive to that page, and should avoid unnecessary words such as prepositions.
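As a hypothetical illustration of well- and poorly-optimized URLs:

```
https://yourwebsite.com/blog/technical-seo-guide                <- descriptive, keyword-focused
https://yourwebsite.com/blog/p?id=8472                          <- avoid: says nothing about the page
https://yourwebsite.com/blog/a-guide-to-all-of-the-basics-of-technical-seo  <- avoid: filler words
```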
Subdirectories and subdomains are also an important part of technical SEO optimizations. Subdirectories function as a specific pathway on your site (for instance, yourwebsite.com/blog) whereas subdomains compartmentalize a part of your website for a specific purpose (for example, blog.yourwebsite.com).
Utilize Robots.txt
Also known as the robots exclusion protocol, robots.txt is a text file that tells search engine crawlers which pages and files they are allowed to crawl and which they are not. Note that robots.txt is a directive, not an enforcement mechanism: well-behaved bots follow it, but malicious bots can simply ignore it. Its real value is keeping good bots from using up your crawl budget on pages that aren't very important.
In other words, utilizing robots.txt can make it easier for search engines to find and crawl your site's most important pages by cutting through the clutter.
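Here's a minimal hypothetical robots.txt for an ecommerce site; the disallowed paths are examples, and the file must be served at the root of your domain (yourwebsite.com/robots.txt):

```
# Applies to all well-behaved crawlers
User-agent: *
# Keep crawl budget away from low-value pages (example paths)
Disallow: /cart/
Disallow: /checkout/
Disallow: /search

# Point crawlers at your sitemap
Sitemap: https://yourwebsite.com/sitemap.xml
```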
Add Breadcrumbs
Breadcrumbs function as a trail that helps users find their way back to the home page of your site. They demonstrate the relationship between the page a visitor is currently on (such as a product page, in the case of ecommerce sites) and the higher-level page that links to it (like a category page).
But breadcrumbs don't just help users navigate your site — they help bots, too, which is what makes them an important part of a technical optimization.
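Beyond the visible navigation trail, breadcrumbs can be exposed to bots with BreadcrumbList structured data. A sketch for a hypothetical product page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://yourwebsite.com/" },
    { "@type": "ListItem", "position": 2, "name": "Shoes",
      "item": "https://yourwebsite.com/shoes/" },
    { "@type": "ListItem", "position": 3, "name": "Example Running Shoe" }
  ]
}
</script>
```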
Paginate Your Content vs. Infinite Scroll
Pagination is a process that separates related content across multiple pages. Unlike infinite scroll, which can hide content from crawlers that don't scroll or execute JavaScript, pagination gives every chunk of content its own crawlable URL. In terms of search engine optimization, pagination is important for telling search crawlers when pages with distinct URLs are related to one another, which is signaled through markup on the back end of the website, as sketched below.
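Historically this meant rel="prev" and rel="next" link tags in each page's head; Google announced in 2019 that it no longer uses them as an indexing signal, but unique, crawlable, self-canonicalizing URLs for each page in the series remain essential, and the tags are harmless to keep. A sketch for page 2 of a hypothetical category:

```html
<!-- Page 2 of a paginated category (URLs are placeholders) -->
<link rel="canonical" href="https://yourwebsite.com/shoes/?page=2" />
<link rel="prev" href="https://yourwebsite.com/shoes/?page=1" />
<link rel="next" href="https://yourwebsite.com/shoes/?page=3" />
```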
Indexability
Unblock Search Bots
As mentioned above, robots.txt can be used to tell search engine bots which pages not to crawl. While you're handling this task, it's also well worth your time to ensure that search bots are not blocked from important pages you want to be crawled.
Remove Duplicate Web Page Content
We referenced this above, but it's worth stating again: duplicate content can hurt your rankings. If you have multiple versions of the same page on your site, search bots can become confused about which page they should be indexing (and also waste your crawl budget by crawling duplicate content). Canonical tags can be used to address this issue.
Optimize Your Redirects
Broken links and incorrect redirects can also confuse search bots and make it harder for them to properly crawl and index your site. Reviewing all redirects should be part of any technical SEO site audit.
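At the server level, a clean redirect is a single permanent (301) hop from the old URL to the new one. A minimal nginx sketch with placeholder paths:

```nginx
# One hop, straight to the final destination
location = /old-product-page {
    return 301 https://yourwebsite.com/products/new-product-page;
}
```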
Check Mobile-Responsive Web Pages
Mobile-first indexing has been around for a while now, meaning that your site's mobile experience is even more important for your rankings than the desktop version. When it comes to SEO, having a mobile-friendly site is non-negotiable.
It can be easy to miss issues on a mobile site during a regular site audit, but tools like Google Search Console’s Mobile Usability report can help you identify web pages that aren't optimized for mobile users.
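One small but essential piece of a mobile-friendly page is the viewport meta tag; without it, mobile browsers render the page at desktop width and shrink it down:

```html
<meta name="viewport" content="width=device-width, initial-scale=1" />
```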
Fix HTTP Errors
HTTP status codes are issued by servers as a means of sharing information about how certain requests were handled. You have likely come across the common 404 "page not found" error before; this is one HTTP status code among many.
HTTP errors can be harmful to the user experience (after all, it's frustrating when you are searching for something online and land on a broken page), but they can also hurt your technical SEO health by blocking search bots from your web pages.
Addressing these errors correctly is important, and because each status code calls for its own resolution, we always recommend hiring a technical SEO expert to handle these issues. Below are the main HTTP status codes you may encounter on your site (note that the 3xx codes are redirects rather than errors, but misconfigured redirects cause problems of their own):
301 Moved Permanently (permanent redirect)
302 Found (temporary redirect)
403 Forbidden
404 Not Found
405 Method Not Allowed
500 Internal Server Error
502 Bad Gateway
503 Service Unavailable
504 Gateway Timeout
Accessibility
Server Performance
HTTP error codes beginning with "5" refer to server errors. For instance, a 503 Service Unavailable error means the server is unable to handle the request (usually due to being overloaded or down for scheduled maintenance) while a 502 Bad Gateway error means the server received an invalid response from the upstream server.
All of these server errors should be resolved according to technical SEO best practices as soon as possible. If not handled correctly, search bots won't be able to access your site.
HTTP Status
In the "Indexability" section above, we discussed how HTTP errors can prohibit search engines from efficiently indexing your site. These errors also pose an accessibility issue for search bots, preventing them from correctly accessing and understanding the content of your web pages.
Load Time and Page Size
Load time is a critical factor when it comes to technical SEO, including the accessibility of your web pages for search engines. If your pages take too long to load, bots may only crawl part of the content. Worse yet, they may encounter a server error that blocks them from crawling the web page at all.
Page size plays a major role in web page load time; the bigger a page is (usually due to large images and other heavy files), the longer it will take to load. Using CDNs, implementing lazy loading, and auditing third-party scripts are just a few things you can do to optimize load time.
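Lazy loading, for instance, is now built into HTML. In this hypothetical example, an offscreen image isn't downloaded until the user scrolls near it, and explicit dimensions reserve its space to prevent layout shift:

```html
<img src="/images/product-photo.jpg" alt="Example product photo"
     loading="lazy" width="800" height="600" />
```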
JavaScript Rendering
Search engines often struggle to process JavaScript, which is used to add interactivity to websites. There are a number of options when it comes to rendering JavaScript for SEO, but in general, Google recommends server-side rendering or pre-rendering so that critical content is present in the initial HTML.
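To see why, consider this hypothetical page where the product description exists only after a script runs; a crawler that doesn't execute JavaScript (or defers rendering it) sees an empty element, whereas pre-rendering would put the text in the initial HTML:

```html
<div id="product-description"></div>
<script>
  // This content is invisible to any crawler that doesn't run JavaScript
  document.getElementById("product-description").textContent =
    "A lightweight running shoe built for daily training.";
</script>
```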
Orphan Pages
Pages without any internal links leading to them are called orphan pages — and they are not good for SEO. Why? When a web page has no internal links pointing to it, that means users can't access that page without knowing the URL and entering it directly. The same is true for search bots, meaning orphan pages are rarely accessed or indexed by search engines.
Page Depth
Page depth refers to how many clicks it takes to reach a given web page from the home page. The home page itself can be assigned a page depth value of 0, meaning a page linked from the home page will have a value of 1. Subsequent pages linked from there will have a page depth value of 2, and so on.
For SEO purposes, important pages should have a low page depth value, ideally 5 or fewer. If you have a large ecommerce site with many category and product pages, it's inevitable that some pages will be many layers deep; ultimately, having a well-organized site architecture that keeps important pages as "shallow" as possible is the best thing you can do.
Redirect Chains
Redirect chains happen when there are multiple redirects between an initial URL and the final URL. For instance, if URL X redirects to URL Y which redirects to URL Z, you have a redirect chain happening.
What this means for URL Z is that the page load time will be slower, link equity will have been lost along the way, and it will take longer for search bots to find and crawl that web page. For this reason, it's recommended to keep redirects to a minimum and avoid forming chains.
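Fixing a chain means pointing every legacy URL directly at the final destination. A hypothetical nginx sketch, before and after:

```nginx
# Before (a chain): /spring-sale -> /sale -> /sales/current
# After (flattened): every old URL goes straight to the final page
location = /spring-sale { return 301 https://yourwebsite.com/sales/current; }
location = /sale        { return 301 https://yourwebsite.com/sales/current; }
```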
Rankability
Internal and External Linking
Link building strategies are important for helping search bots crawl, access, and index your web pages — and ultimately, these strategies impact each page's rankability, too. Internal links are important for helping bots understand the content and greater context of each web page.
If you have important pages that you want to be indexed and to rank well, linking to them from the home page and other authoritative pages on your site is key. While you're at it, it's also important to regularly perform a site audit to locate and correct broken links.
Build Quality Backlinks
Unlike internal links (which link from one of your web pages to another), backlinks link from other sites back to your own. Think of backlinks as a recommendation from other websites, telling search engines that your site is valuable.
Of course, the quality of the sites linking to your own site does matter. The goal should never be to amass as many backlinks as possible, but rather to build high quality backlinks from authoritative, trustworthy sites. Ultimately, the best way to do this is by generating high quality content of your own, and getting noticed by others in your industry. It's a strategy that takes work, but it's worth the effort.
Content Clustering
Content clusters are a method of structuring your site's content where one page functions as the primary page for a given topic, with multiple pages linking back to it. This is also referred to as a hub and spoke content strategy.
Content clusters signal to search engine crawlers that the "hub" page is an authority on that topic, which may help it rank higher for relevant keywords.
Clickability
Implement Structured Data (Schema Markup)
Structured data is just what it sounds like: data added to your pages in a standardized format (commonly schema markup using the Schema.org vocabulary) to help search engines better understand and index your web pages, and hopefully rank them more favorably. It also helps search engines return more accurate and relevant results to users on the search results pages.
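For an ecommerce site, Product markup is the classic example. A minimal JSON-LD sketch with hypothetical values (Google's Rich Results Test can validate the real thing):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Running Shoe",
  "image": "https://yourwebsite.com/images/example-shoe.jpg",
  "description": "A lightweight running shoe built for daily training.",
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```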
Optimize for Featured Snippets
While searching for various keywords on Google, you have likely noticed answer boxes that appear at the top of the SERP, above the first organic result (but below the paid ads). These are known as featured snippets, and they are a coveted position for online marketers because they can help you earn significantly more clicks.
Featured snippets can take several different forms: lists, paragraphs, rich answers, tables, accordions (with additional information that can be expanded), videos, and tools (such as calculators). The best way to earn a featured snippet spot is by creating quality content that targets question-based queries, succinctly answers the question, and is structured well, as sketched below.
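Structurally, that often means a question-style heading followed immediately by a concise, self-contained answer, as in this hypothetical sketch:

```html
<h2>What is a canonical tag?</h2>
<p>A canonical tag is an HTML link element that tells search engines which
version of a page to index when several URLs show the same content.</p>
```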
Get Placed within Google Discover
Google Discover is a queryless "search" experience for mobile users that shows them content based on their interests. Discover suggests new content based on users' activity within the app and through their Google searches.
Google Search Console has a specific tab for monitoring your site's Discover performance and identifying opportunities for improvement. Again, there are no shortcuts here; the best way to earn placements within Google Discover is by creating high quality content, implementing structured data, optimizing your URL structure, and optimizing the mobile and desktop versions of your site.
More Technical SEO Guides
This collection of technical SEO 101 guides covers all the basics that should be part of your ecommerce SEO strategy: schema markup, URL canonicalization, redirects, sitemaps, noindex tags, and other optimizations that help search engines crawl and index your site more effectively.
For the best chances of improving your organic search rankings, we recommend partnering with a technical SEO agency like Whitecap to handle all of your ecommerce SEO needs.
Contact the Technical SEO experts at Whitecap SEO
Want to improve your search engine rankings, generate more traffic with targeted keywords, strengthen your site speed and other technical SEO elements, and ultimately grow your online sales?
Contact our team of experts today to request more information about our Technical SEO services and to learn what sets Whitecap apart from other SEO agencies.