Technical SEO: Shore Up Your Recruitment Website’s Foundations
29 Jan, 2025 · 5 mins
Technical SEO is a foundational part of optimising websites for search engines. Fundamentally, pages should be crawlable and indexable (without errors) for the best chance to rank.
Elements of a standard Tech SEO strategy include looking at crawlability, indexability, site structure and internal linking, structured data, and page experience.
Some activities have a greater impact than others.
Content, both on-page and off-page, has the greatest impact when published on a site with good technical SEO – the technical foundation has an amplifying effect on search engine visibility.
What is technical SEO?
Technical SEO is search engine optimisation that focuses on ensuring your website’s pages can be crawled, understood, and indexed well, with the ultimate aim of increasing visibility and rankings.
What does crawling a website mean?
Like a spider crawls a web, search engine bots crawl websites. Googlebot is the best known; it crawls pages so they can be indexed on Google.
Search engine bots like Googlebot read content from pages – specialism pages, meet-the-team consultant profiles, job pages and media hub pages – and use the links within them to find more pages. There are a number of ways you can control what gets crawled on your site.
One of the main ways to signpost to search engines where they can and can’t go on your website is a robots.txt file. It should also reference your sitemap location – something all our recruitment sites include as part of their pre-go-live tech checks at set-up.
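As an illustration, a minimal robots.txt might keep crawlers out of a private area and point them to the sitemap – the domain and paths here are placeholders, not our actual configuration:

    # Applies to all crawlers
    User-agent: *
    Disallow: /dashboard/

    Sitemap: https://www.example-recruiter.com/sitemap.xml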
However, it’s important to note that Google crawls a site at its own discretion, and may still index a page that is disallowed in your robots.txt file if other pages link to it.
A stricter way of controlling the behaviour of crawl bots is at server level (for example, a .htaccess file). This is an advanced process.
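For example, on an Apache server a .htaccess rule along these lines would refuse requests from a named crawler outright – the bot name is purely illustrative, and rules like this should be applied with care:

    # Return 403 Forbidden to a specific crawler, matched by user agent
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} ExampleBot [NC]
    RewriteRule .* - [F,L]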
Many crawlers support a crawl-delay directive in the robots.txt file, which lets you set how frequently they can crawl pages. Google doesn’t respect this directive, though you can file a special request in Search Console to reduce Googlebot’s crawl rate.
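For crawlers that do honour it, such as Bing’s, the directive sits under the relevant user agent – the 10-second value below is only an example:

    User-agent: bingbot
    Crawl-delay: 10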
Access restrictions also affect your site’s crawlability. Whether through a login system, HTTP authentication or IP whitelisting, you can allow a group of users to access certain pages while search engines can’t reach or index them. An example is the 401 status code, which is applied to dashboard pages on our recruitment sites.
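As a sketch, HTTP basic authentication configured in .htaccess returns a 401 to unauthenticated visitors, crawlers included – the realm name and file path here are assumptions:

    # Require a username and password before serving the directory
    AuthType Basic
    AuthName "Client dashboard"
    AuthUserFile /var/www/.htpasswd
    Require valid-user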
You can see Google’s crawl activity for your site in the ‘Crawl stats’ report in Search Console. This will show you more information on how and what Google is crawling.
A more advanced form of crawl analysis is to access your server logs and examine them with data analysis tools.
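As a quick illustration, assuming a standard combined-format access log, a one-liner like this shows which URLs Googlebot requests most often (the log file name is a placeholder, and requests claiming to be Googlebot should ideally be verified against Google’s published crawler IP ranges):

    grep "Googlebot" access.log | awk '{print $7}' | sort | uniq -c | sort -rn | head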
After pages are crawled, they’re rendered and then sent to the index. The index is a master list of pages, like a database, that can be returned as results from search queries.
How does indexing work?
There are various ways to signpost to Google and other search engines that you’d like a page to be indexed.
Robots directives give guidance to search engines on how to crawl or index a particular page. The robots meta tag is an HTML snippet added to the <head> section of a page.
By default, if a <meta name="robots" content="noindex" /> tag is omitted, a crawler will attempt to index the page. Adding the nofollow value tells search engines not to follow the links on the page either. A full list of rules can be found in Google’s Robots Meta Tags specifications documentation.
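For instance, a page you want kept out of the index, and whose links you don’t want followed, could carry this tag in its <head> (shown purely as an illustration):

    <head>
      <meta name="robots" content="noindex, nofollow" />
    </head>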
Canonicalisation signposts to search engines which specific URL should be shown in their search results. It helps most where multiple versions of the same type of page can exist, for example job search and blog post search results pages.
Giving each unique page a rel="canonical" tag helps signal to search engines which URL to index, and canonicals are especially useful where duplicate content or keyword cannibalisation might be an issue.
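As an illustration, a filtered job search results page could point search engines back to the main job search URL like this (the URL is hypothetical):

    <link rel="canonical" href="https://www.example-recruiter.com/jobs" />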
Following internal links and redirects, as well as reading the sitemap (which, as best practice, is referenced in the robots.txt file), also provides ways for URLs to be found and indexed.
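A minimal XML sitemap following the sitemaps.org protocol looks something like this – the URL and date are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example-recruiter.com/jobs/finance-manager-london</loc>
        <lastmod>2025-01-29</lastmod>
      </url>
    </urlset>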
Use the URL inspection tool in Google Search Console to see how Google has indexed a given page. Here you can also request a page be submitted for indexing.
Technical SEO best practices for recruitment websites
Some technical SEO best practices will influence your rankings and organic search traffic more than others. Let’s look at just a handful of quick wins we employ when carrying out our tech checks.
A relevant, keyword research-led H1 should be placed at the top of a page, and this should closely match the page title. Semantically similar terms should be used in H2s and H3s.
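In practice, that might look like the following for a specialism page – the keyword and headings are invented for illustration:

    <head>
      <title>Finance Recruitment Agency | Example Recruiter</title>
    </head>
    <body>
      <h1>Finance Recruitment Specialists</h1>
      <h2>Our latest finance jobs</h2>
    </body>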
Look out for noindex directives especially, and make sure one is not unintentionally set on a page you’d like to be crawled and indexed.
In the case of a website migration, maintaining high domain and page authority is important. When a page changes its URL, it’s key to pass the authority already built up by the existing URL on to the new one. This is where redirects come in, retaining link value.
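For example, a permanent (301) redirect in an Apache .htaccess file passes an old URL’s value on to its replacement – both paths below are hypothetical:

    Redirect 301 /old-sector-page /specialisms/finance-recruitment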
Internal linking is a powerful exercise in building relevancy for specific pages across your site. Internal links should not be thought of as just helping your pages be found. For example, if an industry-topical blog post references a recruitment specialism, internally linking to the relevant page using keyword-led anchor text will help build content relevancy to it.
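By way of illustration, the anchor text in a link like this (the URL is invented) tells search engines what the destination page is about:

    <a href="/specialisms/finance-recruitment">finance recruitment specialists</a>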
Internal anchor text optimisation can be made easier using third-party SEO tools that recommend internal links to be added based on a scan of your current content and URL structure. Custom searches across a website’s content can help find desirable anchors.
Schema mark-up (a type of structured data) offers the opportunity for your pages to appear in Google’s Rich Results. This additional code helps search engines to better understand your content, context and how entities are connected.
Whether it’s an article, podcast or job posting you’d like distinct visibility of in the SERPs (Search Engine Results Pages), read Google’s search gallery documentation on the schema needed for your site. Detailed schema mark-up for these content types and more comes as standard with Venn.
To get maximum visibility for job postings in Google for Jobs results, our recruitment sites are built with in-depth JobPosting Schema.org mark-up (a trimmed-down sketch follows the list below), including:
Job title
Description
Employment type
Direct apply
Salary
Location
Remote working conditions (if applicable)
Your recruitment organisation and logo structured data fields
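The JSON-LD below is a trimmed-down sketch of that kind of mark-up – every value is a placeholder rather than real output from our sites:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org/",
      "@type": "JobPosting",
      "title": "Finance Manager",
      "description": "An example job description.",
      "datePosted": "2025-01-29",
      "employmentType": "FULL_TIME",
      "directApply": true,
      "hiringOrganization": {
        "@type": "Organization",
        "name": "Example Recruiter",
        "logo": "https://www.example-recruiter.com/logo.png"
      },
      "jobLocation": {
        "@type": "Place",
        "address": {
          "@type": "PostalAddress",
          "addressLocality": "London",
          "addressCountry": "GB"
        }
      },
      "baseSalary": {
        "@type": "MonetaryAmount",
        "currency": "GBP",
        "value": {
          "@type": "QuantitativeValue",
          "value": 55000,
          "unitText": "YEAR"
        }
      }
    }
    </script>

A remote role would additionally carry the jobLocationType and applicantLocationRequirements properties.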
Further technical SEO projects
While these further technical SEO projects may not be as beneficial as some of the best practices already mentioned, they are still worth spending time on.
Page experience signals are a set of ranking factors that you should look at to ensure users of your site are getting an optimal user experience.
Google’s Page Experience factors include:
HTTPS
Mobile friendliness
No intrusive interstitials (pop-ups)
Core Web Vitals
Core Web Vitals are the most significant of these, measuring page speed and user experience. The metrics are Largest Contentful Paint (LCP), Cumulative Layout Shift (CLS) and Interaction to Next Paint (INP) – the latter being the most recent metric, released in March 2024.
How is technical SEO’s success measured?
We measure the success of our technical SEO efforts by clients’ health scores. A website health score is a measure of how SEO-friendly a site is, considering over 150 different factors, and reflects the proportion of URLs on the site without errors.
Health scores play an important part in any tech SEO strategy by being a guide on what to improve to make your site more search-engine friendly.
This month’s analysis shows that Venn recruiter sites average a health score of 75%, against an average of 57% before Venn. When our Tech SEO specialists work on the continual optimisation of these sites, the average score rises to 85%, highlighting the advantage of choosing a marketing package.
But how does an excellent health score translate into value for you as a recruitment marketer?
Sound technical SEO foundations needed
The health score comes from an industry-leading SEO site audit tool and reflects sound technical SEO foundations in a website, on which optimised content is built.
In our experience, technical SEO is between 20 and 40% of SEO strategy, depending on the website size. Technical SEO can have a greater impact on larger sites with more URLs. Content, both on-page and off-page, accounts for the remainder.
If a health score is 80%+ (the target for marketing package clients), then optimised content – and sometimes link building – is required to rank for non-brand terms. Technical SEO alone will only take you so far in achieving your marketing goals.
While there isn't strictly a 'health score went up by X, organic search traffic went up by Y' metric, maintaining – and improving – the health score shores up the foundation of a site.
It's the sum of technical SEO working with content SEO (and laser-targeted keyword research from strategy) that provides better organic search rankings, clicks and – most importantly – candidate and client conversions.
Searching for Tech SEO support for your website?
If you believe your site would benefit from stronger technical SEO foundations or would like to learn more about our content optimisation and strategies to help attract more targeted organic search visitors, get in touch with one of our SEO specialists today.