
Revealed: 6 most likely gaps in your enterprise SEO strategy


In-house SEOs face daily workloads that can be challenging and unpredictable. It’s easy to spend too much time looking at the big picture and miss the smaller details – or to get so entrenched in the minutiae that you lose track of the overall direction you’re heading in. In essence, it’s easy to find yourself operating with an enterprise SEO strategy that has gaps in it.

This is where we come in. We’ve compiled the 6 most overlooked areas we’ve encountered while working with clients. If you suspect you’re guilty of any of them, don’t worry – we’ll lay out the information you need to fix each one.

1. Incongruent international SEO 

When you’re working with a site that spans multiple international search markets, it’s easy to lose sight of which strategy is best for the brand. What’s more, once multiple internal stakeholders have worked on the site, you may end up operating several incongruent strategies without realising it.

What you’re left with is different sections of your large website, or different top-level domains, competing for the same traffic. If you suspect your international strategy is incongruent, now is the time to fix it by applying one consistent approach across all territories.

The main approaches to choose from are: 

  • Multiple Localised Country Code Top Level Domains (ccTLDs), with or without HREFLANG 
  • Multiple generic TLDs with geo-targeting in Search Console, with or without HREFLANG 
  • Localised folders on one generic TLD, with or without geo-targeting in Search Console, and with or without HREFLANG implemented either in the code of each page or through an XML sitemap 
  • Localised subdomains on one generic TLD, along with geo-targeting and/or HREFLANG implemented either in the code of each page or through an XML sitemap (a sitemap-based example follows this list) 
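If you opt for the sitemap route, the HREFLANG annotations live in your XML sitemap rather than in the code of each page. As a minimal sketch – the domain, locales and URLs below are invented purely for illustration – a script along these lines could generate the alternate entries for one localised page:

```python
# A minimal sketch of hreflang annotations delivered via an XML sitemap.
# The domain and locale-to-URL mapping below are hypothetical examples.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
XHTML_NS = "http://www.w3.org/1999/xhtml"

# Every localised variant of one page, keyed by hreflang value.
variants = {
    "en-gb": "https://www.example.com/uk/widgets/",
    "en-us": "https://www.example.com/us/widgets/",
    "de-de": "https://www.example.com/de/widgets/",
    "x-default": "https://www.example.com/widgets/",
}

ET.register_namespace("", SITEMAP_NS)
ET.register_namespace("xhtml", XHTML_NS)
urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")

for url in variants.values():
    url_el = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
    ET.SubElement(url_el, f"{{{SITEMAP_NS}}}loc").text = url
    # Each <url> entry must reference every variant, including itself.
    for hreflang, href in variants.items():
        ET.SubElement(
            url_el,
            f"{{{XHTML_NS}}}link",
            rel="alternate",
            hreflang=hreflang,
            href=href,
        )

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

Whichever delivery method you choose, the key point is that every localised URL must reference all of its alternates (including itself), and that every territory uses the same method.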

Mixing the approaches above can confuse search engines, and lead to all sorts of overlaps and gaps in visibility. Often this comes from different stakeholders overseeing different territories. 

Ultimately, not having a joined-up approach means people searching have to work harder to find the right version of the site for their location or language. You’ll also find that you’re competing with yourself, and that all your international sites rank worse as a result, so it really is a lose-lose situation all round.

2. 404 errors spiralling out of control 

When you have an enterprise site, it is easy for 404 errors to occur at scale. This issue often comes about when products or content resources are retired on a regular basis due to stock levels, seasonality or changes in focus.

Left unchecked, excess 404 errors send a clear signal to search engines that your website isn’t properly curated. You’ll also be wasting crawl budget and pouring link juice into a black hole.

Rather than dealing with these errors on an ad-hoc basis, the trick is to design strategies for retiring content or product pages and to ensure your team implements them as best practice.

The most common strategy is automating 301 (permanent) or 302 (temporary) redirects where appropriate. 
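As a rough sketch of how that automation might look – the product paths, category mapping and CSV output below are purely illustrative – you could generate a redirect map from your list of retired products, sending each one to its closest surviving category page rather than dumping everything on the homepage:

```python
# A minimal sketch: map retired product URLs to their closest category page
# and emit redirect rules. Paths, categories and the output format are illustrative.
import csv

# Hypothetical export of retired products: (product_path, category_slug)
retired_products = [
    ("/products/blue-widget-2023/", "widgets"),
    ("/products/summer-garden-set/", "garden-furniture"),
    ("/products/legacy-toaster-x1/", "kitchen-appliances"),
]

def redirect_target(category_slug: str) -> str:
    """Send retired products to their category page, not the homepage."""
    return f"/category/{category_slug}/"

with open("retired_product_redirects.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["source", "destination", "status"])
    for product_path, category_slug in retired_products:
        # 301 = permanent: the product is gone for good.
        # Use 302 instead if the product is only temporarily unavailable.
        writer.writerow([product_path, redirect_target(category_slug), 301])
```

The resulting map can then be loaded into whatever redirect mechanism your platform uses – CDN rules, server configuration or a redirect plugin.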

3. Broken links draining crawl budget & link juice 

Broken links in general – not just 404 errors – are easy to lose track of on enterprise sites. However, simply redirecting every broken URL across the board isn’t an efficient strategy, as it forces search engines to work harder and spend more crawl budget to reach fewer pages.

The ideal solution, if you’re dealing with broken links at scale, is to automate the task so that when a URL is redirected, the internal links pointing to it are updated to point directly at the final destination. At the very least, you should be running regular internal link audits to catch issues quickly.
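As a minimal sketch of that audit step – assuming you already have a list of internal links from a crawl export, and noting that the URLs below are invented – you could follow each link target and flag anything that should be rewritten to its final destination:

```python
# A minimal sketch of an internal link audit: find links that point at
# redirected or broken URLs so they can be updated at the source.
# The example URLs are invented; a real run would use a crawl export.
import requests

internal_links = [
    {"found_on": "https://www.example.com/blog/post-1/",
     "target": "https://www.example.com/old-category/"},
    {"found_on": "https://www.example.com/blog/post-2/",
     "target": "https://www.example.com/products/current-widget/"},
]

def audit_links(links):
    fixes = []
    for link in links:
        resp = requests.head(link["target"], allow_redirects=True, timeout=10)
        if resp.history:  # one or more redirects were followed
            fixes.append({
                "page": link["found_on"],
                "old_href": link["target"],
                "new_href": resp.url,        # final destination after redirects
                "hops": len(resp.history),   # more than 1 indicates a redirect chain
            })
        elif resp.status_code == 404:
            fixes.append({
                "page": link["found_on"],
                "old_href": link["target"],
                "new_href": None,            # broken: needs a new target or removal
                "hops": 0,
            })
    return fixes

for fix in audit_links(internal_links):
    print(fix)
```

A production version would need rate limiting, retries and handling for servers that reject HEAD requests, but the principle stands: update the link at the source rather than leaving the redirect in place.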

The benefits of proactively fixing broken links are a consistently good user experience, less strain on your server and the prevention of internal redirect chains. If you suspect your site has such chains, fix them quickly – not only do they waste crawl budget, they also dilute the link equity passing through those links.

4. Creating a large volume of poor-quality content 

Industry catchphrases like “content is king” are so prevalent that in-house SEOs are often under pressure to commission vast amounts of written content.

However, producing content at scale often means that the quality and diversity of content is lost. This issue has only worsened with the emergence of generative AI and an increasing tendency to rely on such tools to create content. 

Your site will lose rankings over time if it houses thousands of poor-quality pages. Search engines will recognise that you’re not providing value for the reader and avoid placing you at the top of the SERPs. Instead, it’s better to have a smaller volume of high-quality content that is unique and adds value.

5. Relying on just one form of content 

Humans have different learning styles, so only producing written content means you’re cutting out a significant portion of your target market. Search engines also recognise the value of varied content types, and sites that leverage mixed and inclusive media tend to perform better in rankings.

One best practice to consider going forwards is to think about how every piece of written content can be repurposed in another form to maximise reach and impact. Turning the written word into images, video, audio and interactive content is a great way to maximise retention, dwell time and conversions.

6. Neglecting Core Web Vitals – LCP, CLS & FID 

You might have spent months working with developers to shave milliseconds off your page load times, only to find that rankings just didn’t improve.  

Page load speed is still very much a ranking factor, but the way search engines look at it changed with Google’s introduction of Core Web Vitals as part of the 2021 Page Experience update.

There are three key areas that Google focuses on, and prioritising these for your enterprise site is where you’re most likely to see a positive impact.

Google looks at: 

  • Largest Contentful Paint (LCP) 
  • Cumulative Layout Shift (CLS) 
  • First Input Delay (FID) 

Each area is scored Good, Needs Improvement, or Poor by URL grouping in the Core Web Vitals report in Search Console. The thresholds are as follows:

Metric   Good        Needs Improvement   Poor
LCP      <=2500ms    <=4000ms            >4000ms
FID      <=100ms     <=300ms             >300ms
CLS      <=0.1       <=0.25              >0.25

Google cites impressive gains experienced by sites that worked to improve these areas. 
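To make those thresholds concrete, here is a small sketch that classifies measured values against the boundaries in the table above (the sample measurements are invented):

```python
# A small sketch that classifies Core Web Vitals values against the
# thresholds in the table above. The sample measurements are invented.
THRESHOLDS = {
    "LCP": (2500, 4000),   # milliseconds
    "FID": (100, 300),     # milliseconds
    "CLS": (0.1, 0.25),    # unitless layout-shift score
}

def classify(metric: str, value: float) -> str:
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= needs_improvement:
        return "Needs Improvement"
    return "Poor"

sample = {"LCP": 3100, "FID": 85, "CLS": 0.31}   # hypothetical field data
for metric, value in sample.items():
    print(f"{metric}: {value} -> {classify(metric, value)}")
```

Bear in mind that Google assesses each metric at the 75th percentile of real-user (field) data, so a single lab measurement is only a rough guide.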

Time To Interactive (TTI) is another key consideration we investigate when working with our enterprise clients. Unlike FID, which measures the delay before the browser can respond to a user’s first interaction, TTI measures how long it takes for a page to become fully interactive.

While TTI isn’t explicitly measured by Google as part of the Core Web Vitals programme, we’ve certainly noticed a positive impact after improving this metric for our clients.
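If you want to track TTI alongside the Core Web Vitals, lab testing with Lighthouse is one practical option. As a rough sketch – assuming a JSON report generated with something like lighthouse https://www.example.com --output=json --output-path=report.json, and audit IDs that match recent Lighthouse versions (do check them against your own report) – you could pull the lab metrics out like this:

```python
# A rough sketch: pull TTI and related lab metrics out of a Lighthouse
# JSON report. Audit IDs and field names follow recent Lighthouse versions,
# but verify them against your own report output.
import json

with open("report.json") as fh:
    report = json.load(fh)

audits = report["audits"]
for audit_id in ("interactive",                # Time To Interactive
                 "largest-contentful-paint",   # LCP (lab)
                 "cumulative-layout-shift"):   # CLS (lab)
    audit = audits[audit_id]
    print(f"{audit['title']}: {audit.get('displayValue', audit['numericValue'])}")
```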

Not sure where your gaps are? 

Each month, Skittle Digital offers a limited number of Free Acquisitions Workshops which analyse your site from all angles. For a detailed snapshot of the different aspects of enterprise SEO you might need to focus your attention on, book yours today.

AUTHOR

Imogen Groome

Content Lead

Imogen is the SEO Content Lead at Skittle Digital. Imogen has worked in SEO since 2016. She helped to define SEO strategy at Metro.co.uk before guiding the newsroom at The Sun Online as SEO Editor. She has more than 5 years’ experience in scaling content strategies that drive revenue for brands through organic search channels. In her spare time, Imogen writes books, watches poor-quality reality TV and hangs out with her cats.
