What is technical SEO?
Technical SEO is the foundation of a website. We learn a lot about the importance of content for good rankings in the SERPs, but content alone is not enough without a solid foundation.
Technical SEO is the process of optimizing the infrastructure of a website from the start to enable the search engines to crawl and index it properly. Technical SEO doesn’t deal directly with the content of the site but with its setup and elements like XML sitemaps, structured data, page speed, URL structure, navigation, robots.txt, and so on. In the following, we’ll look at each of these essential technical SEO elements and why they are essential for any website.
Preferred domain URL: www or no www?
Using www at the beginning of your preferred domain URL is no longer the norm. In fact, according to many SEO experts, you don’t need it anymore, unless your users will not recognize your domain name as a domain name without the “www.” There are no SEO benefits to spelling your domain with or without www. Choosing www.example.com or example.com is a matter of personal preference; the only thing you must consider is that you will likely live with this choice for the lifetime of your site. Here are the best practices:
- Let Google know your preferred (or canonical) domain URL from the start, because Google will not differentiate between the two afterward. See Google support: “Once you tell us your preferred domain name, we use that information for all future crawls of your site and indexing refreshes. For instance, if you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we follow that link as http://www.example.com instead. Also, we’ll take your preference into account when displaying the URLs. If you don’t specify a preferred domain, we may treat the www and non-www versions of the domain as separate references to separate pages.”
- Verify ownership of both the www and non-www versions of the canonical domain
- Use 301 redirects to redirect traffic from your non-preferred domain.
- Be consistent in your link-building strategies: if you prefer https://example.com, make sure all your links follow the same format.
- Set canonical URLs for your entire site. It should look like this:
<link rel="canonical" href="https://example.com/" />
To check if your site has the canonical URL set up correctly, right-click anywhere on your website page to open “View Source” and search for the word “canonical.” If you don’t find it, this is a technical SEO issue, and you need an expert to fix it. The correct setup of the canonical URL helps to prevent duplicate content issues.
- When you migrate your site from http to https you want to make sure you respect all the best practices mentioned above and verify both https://example.com and https://www.example.com in the Google search console to let the search engine know your preferred domain URL again.
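To illustrate the 301 redirect best practice above, here is a minimal sketch of a server-level rule written for nginx (the server names are placeholders, and the certificate setup is assumed to exist elsewhere in your configuration; https://example.com is taken as the preferred domain):

```nginx
# Permanently redirect the non-preferred www host to the preferred bare domain
server {
    listen 80;
    listen 443 ssl;
    server_name www.example.com;
    # ssl_certificate and ssl_certificate_key directives go here
    return 301 https://example.com$request_uri;
}
```

The same result can be achieved with a `Redirect permanent` or `RewriteRule` in an Apache .htaccess file; what matters is that the redirect returns the 301 (permanent) status code rather than a temporary 302.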
URL structure is a critical technical SEO element
One of the most critical elements of technical SEO for your website is the URL structure. After you decide your preferred canonical domain, it is logical to plan your “site tree” – a step that defines the URL structure of your website.
Optimizing the URL structure of your site is equally important for your users/readers (usability) as it is for the search engines (findability).
For the users, the URL structure solves a user experience issue, allowing them to follow the logical path to what they are looking for.
Where SEO is concerned, a proper URL structure helps search engines understand the site. The site tree also helps you to avoid competing with your own pages and content. Here are some of the technical SEO best practices that apply to URL structure:
- Avoid competing with your own pages: tell Google which pages are more important for the SERPs.
- Use a straightforward URL format such as https://example.com/category/subcategory/keyword-phrase.html (the “.html” extension is not mandatory and depends on your CMS).
- Use your main keyword phrase to make your URL “semantically accurate.” The keyword phrase in your URL, although not a major SEO signal, indicates the topic of your page to the search engines. Users read it too, which is an added boost for UX. Plus, according to MOZ:
“Unoptimized, semantically inaccurate URLs can look unwieldy, and instead of garnering clicks they actually deter them.”
- Use words in URLs as much as you can but don’t fall for keyword stuffing. Although URLs can include ID numbers and referral codes, people understand words.
- Short is memorable: try to make your URLs as short as possible, removing stop words like “of,” “for,” “to,” “and,” etc.
- Use hyphens to separate keywords. Google prefers hyphens too and recommends that you keep your URLs simple:
“We recommend that you use hyphens (-) instead of underscores (_) in your URLs.”
- Don’t use ALL CAPS. That’s never a good idea. It can be perceived as a spammy technique by the search engines. Always use lowercase characters.
- Check your canonical URLs: just as your preferred domain has a canonical URL, so should your entire site. Every page of your website, including blog posts, must include the tag <link rel="canonical" href="https://example.com/yourpageurl" /> in the <head> part of the HTML. Canonical URLs tell Google and other search engines which version of a “duplicate” page is the most important one; in other words, they help you consolidate duplicate content that was not created to spam. As a rule of thumb, specify canonical URLs for all the pages of your website.
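If you prefer to check canonical tags programmatically rather than via “View Source,” a short script can extract them. This is a minimal sketch using Python’s standard library; the `find_canonical` function and the sample page are illustrative, not part of any official SEO tool:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the first <link rel="canonical"> tag found."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            attrs = dict(attrs)
            if attrs.get("rel") == "canonical":
                self.canonical = attrs.get("href")

def find_canonical(html_text):
    """Return the canonical URL declared in an HTML document, or None."""
    parser = CanonicalFinder()
    parser.feed(html_text)
    return parser.canonical

page = '<html><head><link rel="canonical" href="https://example.com/" /></head></html>'
print(find_canonical(page))  # https://example.com/
```

Run against each important page of your site, a check like this quickly surfaces pages with a missing or unexpected canonical URL.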
Website navigation is essential for technical SEO too
The best time to plan your website navigation is when you plan the entire structure of your website. Website navigation refers to the internal link structure of your website; it usually starts from the homepage and follows a logical structure that helps users navigate back and forth. Search engines also use website navigation to discover and index new pages.
Navigation depends on links – or the internal link architecture of your site. This is the “table of contents” of your website. Proper website architecture is the backbone of good XML sitemaps and breadcrumbs – both equally important for technical SEO.
Website navigation matters because:
- Users like to find what they need fast, preferably without using the “search box” on your website.
- Search engines can understand and index your website better when you implement proper navigation.
- When you optimize your navigation, don’t hide archive pages and don’t use just one category for everything. It adds clutter and confuses both users and search engines. As Google suggests, plan your navigation based on your homepage.
“The navigation of a website is important in helping visitors quickly find the content they want. It can also help search engines understand what content the webmaster thinks is important. Although Google’s search results are provided at a page level, Google also likes to have a sense of what role a page plays in the bigger picture of the site.”
- Breadcrumb navigation helps too: users navigate your website easier when they can follow a logical structure and search engines recommend the use of breadcrumbs too. Again, Google’s recommendations here should serve as the main guidelines:
“A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, left-most link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.”
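Following Google’s recommendation above, a breadcrumb trail can be annotated with schema.org BreadcrumbList markup. A minimal JSON-LD sketch (the page names and URLs are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "name": "Home",
      "item": "https://example.com/"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "name": "Category",
      "item": "https://example.com/category/"
    }
  ]
}
```

This block goes inside a <script type="application/ld+json"> tag in the page, mirroring the visible breadcrumb links.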
XML sitemaps for technical SEO
XML sitemaps count among the best tools to help search engines index your site. They are defined as “roadmaps” of your site.
Using XML sitemaps, you give search engines a structure of your site, ordered from the most important pages to the least, and let them know how often you refresh the content on specific pages. XML sitemaps usually feature URLs, dates, language, and other parameters that signal important data to search engines. For example, “when a date in the XML sitemap changes, Google knows that there is new content to crawl and index.” Best practices include:
- Respect the official Google guidelines.
- Submit your XML sitemap to Google in the Google Search Console.
- Include only canonical URLs.
- Do not include “noindex” URLs.
- Look for and address warnings and errors. For example, you may find in your Google Search Console a message like “When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to an HTTP status error. All accessible URLs will still be submitted,” or “Sitemap contains URLs which are blocked by robots.txt.” Sometimes you block URLs with robots.txt on purpose, but most often these messages point to technical SEO errors. Fix these issues quickly.
Please note that a sitemap should contain only the logical, well-structured links to every page of your website that you want listed in the SERPs. But keep in mind that just because you do not list a page in your XML sitemap, it doesn’t necessarily mean that search engines will not index it. A sitemap is just a (very important) tool that helps search engines make sense of your website’s structure, not a secret recipe for top-SERP results.
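To make the “roadmap” concrete, a minimal XML sitemap following the sitemaps.org protocol might look like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://example.com/category/keyword-phrase.html</loc>
    <lastmod>2018-05-15</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Only <loc> is required for each entry; <lastmod>, <changefreq>, and <priority> are the optional signals mentioned above.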
Robots.txt: the technical SEO tool you need right now
Robots.txt is a technical SEO tool. Also known as the robots exclusion protocol, it’s one of the standard tools that show search engines what to crawl: pages, categories, archives, images, and so on. Another important function of the robots.txt file is to show search engine bots what not to crawl. It plays a critical role in the crawl health of a website.
When using a robots.txt file, remember to adhere to Google’s best practice recommendations:
- Do not use robots.txt to hide your web pages from Google Search results.
- Use it to “control crawling traffic, typically because you don’t want your server to be overwhelmed by Google’s crawler or to waste crawl budget crawling unimportant or similar pages on your site.”
- Use it to prevent images from appearing in Google search results.
- Use it to prevent the bots from crawling resource files like scripts or style files. Password-protect all files you do not want to be accessed by search engines of any kind.
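As a sketch of these guidelines, a simple robots.txt might look like the example below (the disallowed paths are placeholders; the file itself always lives at the root of your domain, e.g. https://example.com/robots.txt):

```text
# Applies to all crawlers
User-agent: *
# Keep bots out of internal search results and admin pages (example paths)
Disallow: /search/
Disallow: /admin/

# Point crawlers to the XML sitemap
Sitemap: https://example.com/sitemap.xml
```

Note that Disallow only discourages crawling; as mentioned above, it is not a reliable way to keep a page out of Google Search results.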
The robots.txt file can make or break your website. Although there are enough resources that explain how to build this technical SEO file correctly, including some offered by Google, you are better off leaving this job to a professional. Or, for smaller sites, you can use a tool like the Robots.txt File Generator by Aaron Wall.
You can also use Robots Meta Directives to further refine the way search engine bots crawl and index your site. Also known as “meta tags,” these directives usually look like this: <meta name="robots" content="noindex, nofollow"> and are implemented in the <head> part of the HTML of a web page. Google offers detailed information about implementing Robots Meta Directives here. Alternatively, see the details provided by the World Wide Web Consortium (W3C) here.
Page speed matters
Page speed is another essential technical SEO element to address to rank above the competition. Google considers it a ranking signal and, starting in July 2018, will roll out an algorithm update designed to de-rank pages that load slowly for mobile users.
Studies found that “53% of mobile site visitors leave a page that takes longer than three seconds to load.” Because speed matters so much on mobile, Google recommends:
“Marketers must keep people engaged on mobile and focus on building mobile-first experiences.”
Here’s why page speed matters:
- Because it’s a ranking factor in the Google SERPs. Officially.
- According to Think with Google, “as page load time goes from one second to 10 seconds, the probability of a mobile site visitor bouncing increases 123%. Similarly, as the number of elements—text, titles, images—on a page goes from 400 to 6,000, the probability of conversion drops 95%.”
- It has been known since 2009, from studies by Akamai, that almost 50% of web users expect a web page to load in two seconds or less and that over 40% will leave if it takes more than three seconds to load.
- A one-second delay in page load can cause a 7% loss in customer conversions (eConsultancy).
In addition, consider Google’s Accelerated Mobile Pages for better flexibility and results.
Structured data (Schema.org) technical SEO
In 2011, four of the world’s major search engines (Google, Microsoft, Yahoo!, and Yandex) agreed to use structured data to make it easier for webmasters and developers to communicate with search engine bots. According to the official website: “On-page markup helps search engines understand the information on web pages and provide richer search results.” In short, you should implement structured data markup because search engines use it to build richer search results. There’s a lot to understand about this aspect of technical SEO, and hoteliers can find markup for hotels here. Again, the DIY (do it yourself) approach is not recommended for structured data.
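As an illustration of what structured data markup looks like, here is a minimal JSON-LD sketch of schema.org Hotel markup (all values are placeholders; a real implementation would include many more properties):

```json
{
  "@context": "https://schema.org",
  "@type": "Hotel",
  "name": "Example Hotel",
  "url": "https://example.com/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Example City",
    "addressCountry": "US"
  },
  "telephone": "+1-555-0100"
}
```

Like breadcrumb markup, this block is embedded in the page inside a <script type="application/ld+json"> tag, where search engine bots can read it.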
Mobile-first is a must
If responsive design was enough last year, 2018 marks the beginning of mobile-first as the paramount standard of technical SEO. As our research demonstrates, mobile-first implies excellent UX, page speed, and the use of HTTPS among other details.
Here are the things you should do to have a mobile-first site:
- Adhere to Google’s best practices for mobile-first indexing.
- Implement mobile SEO configurations from Google here.
- Do you pass the Google mobile-friendly test?
- Check your Search Console – Google will notify you when “Mobile-first” verification is available for your site. Verify your site.
- Your mobile-friendly site must be fast – see the page-speed arguments above.
- Use of AMP (Accelerated Mobile Pages) doesn’t replace the need for speed.
- Avoid using popups, interstitials, and other intrusive elements.
- Site security matters too: upgrade your site to HTTPS now. Beginning in July 2018 with the release of Chrome 68, Chrome will mark all HTTP sites as “not secure.”
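At a minimum, a mobile-friendly page declares a responsive viewport in its head; without it, mobile browsers render the page at desktop width and force users to zoom. A typical sketch:

```html
<head>
  <!-- Tell mobile browsers to match the device width instead of zooming out -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
</head>
```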
To conclude, technical SEO is not an easy task, but it is essential to ensure that your carefully crafted content ranks well in the SERPs. One cannot work without the other. It’s not enough to write good copy to outrank your competition in the search engine results. Keep your site free of errors, like 404 “not found” errors, which usually occur when users follow a broken link or an incorrect redirect. Many search engine experts warn that such errors will tank your rankings, so use proper 301 redirects and fix your broken links too. There’s a lot more to explore, but this primer should clarify why you can never stop the SEO scrutiny of your website. Whether you like it or not, technical SEO is the most crucial aspect of search engine optimization, followed by on-page SEO, which deals with all forms of content, and off-page SEO, which deals with link-building strategies.