The most important aspect of search engine optimization (SEO) is your site’s technical infrastructure, which controls search engines’ ability to crawl your site and serves the content that needs to be indexed. The health of your technical SEO determines whether your site is open for business or closed for the season.
For example, if your site is invisible, too visible, insecure, or too slow, search engines could pass your site by when they choose which sites to rank in the organic search results. To get to the bottom of those infrastructure issues, technical SEO exposes the causes of poor rankings and offers recommendations for how to resolve them. Since most of these issues aren’t visible, they tend to fly under the radar until someone with technical SEO expertise ferrets them out.
Technical SEO issues can be categorized into eight different buckets, as follows.
1. Crawlability and Site Structure
This is where organic search begins: When search engine bots visit your site, can they discover the text and crawl the links effectively?
If a bot encounters a block, such as a disallow instruction in robots.txt or a piece of code it can’t crawl, the content that page links to will not be accessible. For example, if an ecommerce site links to its products using link code that bots can’t crawl, such as JavaScript-generated links, the product pages won’t be crawled.
That ability to crawl gates all organic search performance. No crawl means no indexation, which in turn means no rankings and no organic search traffic.
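A quick way to see how a disallow instruction gates crawling is Python’s standard-library robots.txt parser. The sketch below uses a hypothetical site and rules; the `/checkout/` path is an assumption for illustration.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for an example site: the /checkout/
# section is disallowed for all crawlers.
robots_txt = """
User-agent: *
Disallow: /checkout/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Bots may fetch a product page...
print(parser.can_fetch("Googlebot", "https://example.com/products/widget"))  # True
# ...but anything under the disallowed path is off-limits, so it can
# never be crawled, indexed, or ranked.
print(parser.can_fetch("Googlebot", "https://example.com/checkout/cart"))    # False
```

Running the same check against your own robots.txt is a cheap first diagnostic before digging into deeper crawl issues.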
2. Indexation and Duplicate Content
When bots crawl your site, the textual content on each page needs to be indexable.
Some forms of code and media render content on a crawled page invisible to search engines. For example, visitors can read the words in a heading embedded in an image, but web search engines don’t reliably extract text from images. The relevance of those words – one of the most important ranking signals – is lost because they aren’t indexable.
In other cases, far too much content is indexed because the site’s platform generates multiple pages of the same content with multiple URLs. Search engines then have to sort through the mess to determine which pages are important and which are just duplicate content.
These two scenarios are common, but there are many different ways to accidentally block your site’s crawlability or indexability.
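For the duplicate-URL scenario, one common fix is a canonical link element that tells search engines which version of a page to index. A minimal sketch, with a hypothetical URL:

```html
<!-- Placed in the <head> of every duplicate variant (e.g., URLs with
     tracking or sort parameters), this points search engines at the
     single preferred URL to index and rank. -->
<head>
  <link rel="canonical" href="https://www.example.com/products/widget" />
</head>
```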
3. Server Header Status and Errors
Every time a user agent – such as your browser or a search bot – requests a web page, the server returns a header response code. The 404 “file not found” error code you encounter when you visit a dead URL is the most well-known of these response codes.
Search engines use these response codes to determine whether to index content and which pages to rank. Server response codes prompt specific behavior from the search engines. The 404 code, for example, prompts engines to remove the URL from their indexes.
4. Mobile-First Indexing and Mobile-Friendliness
More than half of the searches Google sees worldwide happen on a mobile device. As a result, Google incentivizes sites to provide a positive mobile experience to searchers by giving a ranking boost to mobile-friendly sites.
In addition, Google is in the process of transitioning its global index to catalog mobile versions of pages rather than its traditional desktop preference. That means many sites’ rankings are judged on their mobile versions, which often strip out navigation elements, pieces of content, or entire sections of the site, causing crawl and indexation issues.
5. Site Speed
Your site’s load speed, another aspect of user experience, also impacts your rankings. Pages that load quickly receive a Google rankings boost.
Since pages using different templates could have very different load times, the boost is applied page by page to individual keyword rankings. For instance, your product pages might load lightning-fast, while your store finder could be extremely slow.
6. HTTPS Site Security
HTTPS is another area where Google is pushing the web into modern times. Sites hosted securely over HTTPS receive a rankings boost. It’s important to note that all elements of the page need to be served securely, from the images to the third-party tracking code.
7. International SEO
Sites that operate in more than one country also have some technical SEO challenges to navigate. Google and Bing require competing methods of annotating each page’s targeting. Google uses hreflang tags, and Bing uses a simpler content-language meta tag.
8. Schema Implementation
Technical SEO also covers schema implementation, which enables Google’s rich snippets and other search results page features. Wrapping specific types of information in specific code makes it machine readable, which in turn allows Google to feel more confident about surfacing that information in its search results.
For instance, when you search for a recipe and the ingredients, picture, or cooking time display, that’s structured data at work. The sites with the rich snippets place structured data in the page template around the visible information on the page.
The good news is that all technical SEO issues can be resolved. Nearly all require the assistance of a developer, and some will consume a lot of resources. Time spent on technical SEO, however, tends to have a larger effect on organic search performance because it typically affects more than one page at a time.
For example, finding and removing a block in your site’s crawl path that’s cutting off organic search rankings for an entire section of the site will dramatically increase SEO performance. Understanding where the technical SEO challenges are and which truly impact your organic search performance will help you prioritize your resources for the largest return on investment.