The most common technical SEO issues you should avoid

Written by Yevheniia Khromova
Reviewed by Valerii Khomenko
Nov 22, 2022
21 min read

Wise people learn from other people’s SEO mistakes. And believe it or not, this learning process doesn’t have to be complicated. 

To make it easier to figure out the top SEO mistakes made by many sites, we’ve scanned over 40,000 websites with our Website Audit tool. We divided the issues we detected into 10 categories and described them by their frequency of occurrence and severity. 

You’ll find an additional section at the end where we talk about less critical but still frequently occurring technical SEO issues that also deserve your attention. So we strongly encourage you to read to the end.

By the way, you can check any website’s SEO and see how well-optimized it is for both search engines and people using our SEO Analyzer.

Now, without further ado, let’s dive into the biggest technical SEO mistakes we discovered during our analysis.

The most common technical SEO issues

Problems with images

Our list of common SEO mistakes starts with images. They make a website more aesthetically pleasing and help users perceive the content better. Plain text without a single image to visualize the information is unlikely to grab the reader’s attention.

We definitely recommend using them (but carefully) because unoptimized images can lead to website speed issues, affecting your SEO and UX.

Our research has shown two main image issues: missing alt tags and oversized image files. Make sure to check your site to see whether you have them, too.

Missing alt text

The most common SEO issue related to images is a missing alt tag, with 83.87% of websites having this problem.

The alt text (alternative text) is an HTML attribute designed to explain the meaning of images to search robots and make them more accessible to users. It’s one of the main elements of image optimization that helps search engines better understand what your picture is about and rank it.

The alt attributes for images also come in handy for users when (for whatever reason) the picture doesn’t load up. They’re also useful for people with impaired vision who have trouble perceiving visual information.  

If you want to make the most out of your images, add helpful, informative, and contextual alt text to each of them. You can also use keywords but avoid keyword stuffing as it causes a negative experience for your users and signals spam to search bots.
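
Here’s a quick illustration of helpful alt text versus keyword stuffing (the file name and wording are made up):

    <!-- Descriptive, contextual alt text -->
    <img src="/images/red-running-shoes.jpg"
         alt="Pair of red lightweight running shoes on a wooden floor">

    <!-- Keyword-stuffed alt text to avoid -->
    <img src="/images/red-running-shoes.jpg"
         alt="running shoes buy running shoes cheap best running shoes sale">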

Images that are too big

Another problem with images is that they’re often too resource-heavy. Our inspection showed that this issue exists on 35.44% of sites. 

The image file size can affect how fast the page containing it loads. The chain of logic here is simple and obvious: the bigger the size, the longer it takes to load, the longer the user has to wait, the worse the user experience, and the lower the page’s position in search results.  

One way to reduce the image file size is to compress it while maintaining image quality. There’s no perfect level of compression because it depends on the image format, dimensions, and pixel count, but you should still try to keep images under 100 KB whenever possible.
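
Compression itself is usually handled by an image editor, a CDN, or a CMS plugin, but as a related illustration, serving a modern compressed format such as WebP with a fallback is another way to shrink file size (the file names below are made up):

    <picture>
      <!-- Smaller WebP version for browsers that support it -->
      <source srcset="/images/product-photo.webp" type="image/webp">
      <!-- JPEG fallback for older browsers -->
      <img src="/images/product-photo.jpg" alt="Product photo on a white background"
           width="800" height="600">
    </picture>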

Meta tag issues

Meta tags are critical for SEO because they tell search engines and users important information about your web page and help search engines understand how to present your pages in the SERPs. Meta tags are core elements of effective optimization, which is why you don’t want to have problems with them. But many websites do, so below is a list of the most common meta tag mistakes.

Missing description

This issue occurs on 71.11% of websites. But why is it so critical, given that Google says meta descriptions don’t directly affect rankings?

The answer is that Google can still use them for search result snippets. If you don’t specify a page description, search engines will use the available content on that page to generate a description themselves. Are you really willing to rely on the search engine in this matter? Probably not. 
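
Writing one yourself takes a single tag in the page’s <head>; the wording below is only an example:

    <!-- A hand-written meta description (example wording, under 155 characters) -->
    <meta name="description" content="Learn which technical SEO issues occur most often and how to fix them before they hurt your rankings.">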

Duplicate page titles and descriptions 

More than half of the analyzed websites have duplicate title tags (52.59%) and duplicate descriptions (50.17%). 

Multiple pages on your website with duplicate titles and descriptions confuse search engines because they can’t quickly determine which page is relevant to a particular search query. Such pages are less likely to rank well, so try to make titles and descriptions as unique as possible.

If you’re struggling with rephrasing them, you can use the AI Rewrite feature available in SE Ranking’s new Content Marketing Module. It’ll help you create unique copies in seconds and generally allows you to build SEO-friendly texts faster.

Title and description length issues

22.82% of websites have title tags that are too short, and 21.75% make their descriptions too long.

The issue is two-sided: 

  • A too-short title can’t fully describe your page.
  • A too-long description can be cut by the search engine in the snippet. 

The general recommendation is to keep your page titles between 40-60 characters and page descriptions under 155 characters.

Missing title tag

A rarer but still common enough mistake (9.04%) is leaving out the title tag altogether. This way, you lose an opportunity to tell search engines and users what your page is about. Instead, search engines will create the title using the available page content, and it may not always match your goals or your keywords.
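
For reference, the title tag sits in the page’s <head> as well; here’s a hypothetical example that stays within the recommended length:

    <head>
      <!-- A descriptive title within the recommended 40-60 character range -->
      <title>Technical SEO Checklist: 10 Issues to Fix First</title>
    </head>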

Missing internal inbound links

It’s worrisome to see that more than half of the analyzed websites (63.09%) have pages that no other page of the website links to.

This is a major issue because inbound internal links enable users and search bots to navigate your website. They structure the website content hierarchy and help distribute link juice. 

There are some cases when you can have pages without internal links, like a landing page for a particular promo. Such pages only exist for the duration of the sale and are accessed via a link posted on social media or sent in a newsletter. Other than that, having isolated pages is simply wrong from the SEO perspective. Search engines won’t find them while scanning your website, and users won’t reach them while browsing your site.

If you have isolated pages on your website, decide whether each of them is valuable. If so, revise your website structure and figure out the best way to link to the page from other sections and pages of your website. By ‘the best way,’ we mean the way users and search engines can get to the page without extra effort.
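
As a simple illustration, one contextual link from a related page is often enough to connect an orphaned page (the URL and anchor text below are made up):

    <!-- A contextual internal link pointing to a previously orphaned page -->
    <p>Before launching your campaign, check out our
      <a href="/blog/landing-page-checklist/">landing page checklist</a>.</p>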

Heading issues

HTML heading tags help people and search engines quickly understand page content. They outline the structure of the text and can even affect your page’s ranking. 

Our research has shown that websites often have technical SEO problems related to H1 and H2 tags, so let’s dive into the essence of these issues.

H1 tag issues

The H1 tag is the top-level page heading that briefly describes page content and helps the reader understand whether they’ll find what they’re looking for here. That’s why we start with it.

  • Missing H1 tag 

62.85% of websites have pages with no H1 tag. They lack the most important heading on the page that usually serves as the title for that piece of content (don’t mix it up with the title tag!). These pages then lose the chance to provide a better reading experience for users and a better text-scanning experience for search engines.  

  • Duplicate H1

56.6% of websites have pages with duplicated H1 tags. Like all duplicate content, non-unique H1 headings make it more challenging for search engines to figure out which site page to display in search results for a given query. They can also lead to several of your pages competing for the same keyword, making duplicated H1 tags one of the most insidious errors in SEO. 

  • Multiple H1 tags

50.25% of websites use multiple H1 tags on their pages. Given that Google’s John Mueller said that you could use H1 tags on a page as often as you want, you might be wondering what the problem is here.

Let’s look at it this way. If your page has multiple H1 headings, which one should the search engine give more weight to? You might think the first one, but you can’t be 100% sure that the search engine will follow your logic. They have theirs. What’s more, having too many H1 tags on a single page can look spammy if you use them to place keywords. 

H2 tag issues

One of the most common problems related to H2 tags is that 68.28% of websites omit them from their content. There’s no evidence that skipping H2 tags has a direct negative effect on your rankings, but it definitely worsens the experience of users and search bots on the page. H2 tags help structure the content and make the page easier to scan within seconds.

If your pages are missing H2 tags, add them and make sure they briefly and clearly describe the block they refer to. You can have several H2 tags as long as your content is logically structured and they aren’t placed above the H1 tag.
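
Here’s a minimal sketch of a logically structured page with one H1 and several H2 headings (the topic is made up):

    <h1>A Beginner's Guide to Image Optimization</h1>

    <h2>Why image file size matters</h2>
    <p>...</p>

    <h2>How to compress images without losing quality</h2>
    <p>...</p>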

The main takeaway from the h-tags section is the following: Everything that makes your pages better for users makes a difference to Google.

Problems with CSS files

CSS files make websites visually appealing and user-friendly. They also help Google understand how the page works on both desktop and mobile devices. But when they aren’t optimized correctly, your page speed can suffer. 

Here are the most critical issues related to CSS file optimization revealed by our research:

  • Not compressed CSS files 

Using uncompressed CSS files is one of the most widespread technical SEO problems, found on 60.53% of websites. Big CSS files slow down page loading, worsening user experience and negatively affecting search engine rankings. By compressing such files (i.e., replacing duplicate fragments in the code with pointers to the first occurrence), you reduce the amount of data that has to be transferred, and the browser can render CSS faster.

  • CSS is too big

SE Ranking flags this error if the CSS file size exceeds 150 KB, and it occurs in 56.7% of cases. As mentioned above, an oversized CSS file directly affects page loading, so use compression or minification (more on this below) to eliminate the problem.

  • Not minified CSS files 

Minifying means deleting unnecessary line breaks, semicolons, white space, and comments from the source code, which reduces the size of CSS files. Still, 55.59% of websites serve unminified code. Minification alone won’t give you a dramatic CSS file size reduction, but combined with compression, it can produce significant results.

  • Not cached CSS files 

According to our inspection, 22.06% of sites don’t enable CSS file caching. Although caching doesn’t reduce the CSS file size, it reduces the load on the site’s server. If browsers are allowed to cache your CSS files, the next time a user visits the page, the browser will serve the saved copy instead of sending additional requests to the server.

Problems with JavaScript files

CSS and JavaScript are both responsible for the visual side of the site, but in slightly different ways: the latter adds dynamic and interactive elements. The problems related to JavaScript files are almost the same, though. These dynamic elements are “heavy” and affect page loading.

The distribution of SEO errors in this category is the following:

  • Not minified JavaScript—59.42%
  • Not cached JavaScript—33.33%
  • Not compressed JavaScript—31.25%
  • Too many JavaScript files—19.87%

Compression and minification of JavaScript files have the same impact as with CSS files, but here you should aim for no more than 2 MB of JavaScript per page.

Another error that we didn’t observe in the CSS category is using too many JavaScript files. We recommend using fewer than 30 of them on a single page. Since the browser sends a separate request for each of these files, loading the page can take longer. And you already know the consequences of this kind of problem: a page with poor UX, a higher bounce rate, and possibly a lower ranking.

4XX HTTP Errors

HTTP status codes are three-digit codes that the server returns in response to a client’s request. The 200 code is what every page should return: it means that the request was successful. The 4XX status codes, though, are what you should avoid for the sake of SEO.

Below are the most common 4XX HTTP errors that our analysis revealed. 

4XX HTTP status codes

Over a third of the analyzed websites (41.34%) have pages with a 4XX response. 

In most cases, the 4XX response in question is a 404 (Not Found) error, and having pages that return 404 isn’t that terrible in itself. However, you should take care of every internal link to such pages because linking to dead pages provides a poor user experience. In addition, it drains your crawl budget.

As a solution, you can delete links to pages with the 404 error or set up a 301 redirect. Or, at the very least, make your 404 page look cute to reduce user frustration.

4XX images (Not Found)

Pages aren’t the only elements that can return 4XX HTTP errors; the same applies to images and files. We noticed that 21.2% of websites have broken images that can’t be loaded. Besides hurting the user experience, such images can’t be indexed by search engines. All you can do is remove or replace them.

Website speed and performance issues

39.02% of websites have pages that load too slowly. These websites are likely to see negative behavioral signals. For example, users might go to other sites instead of waiting for the page to load fully.

Google has recently confirmed that it no longer uses any previous page speed estimation algorithms. Page load speed is now evaluated based only on Core Web Vitals. Below are the most common CWV-related issues we detected when analyzing website audits. 

Largest Contentful Paint

The Largest Contentful Paint (LCP) measures how fast the main content loads: how quickly the largest image, text block, video, etc., becomes visible. LCP is measured in real-world conditions based on data collected from Chrome users and in a lab environment based on data from the Lighthouse report.

You should strive for the largest elements to be loaded in under 2.5 seconds. However, our research shows that for some websites, it’s still a goal rather than an achieved result.

  • 31.01% of websites have LCP higher than 2.5 seconds when measured in a lab environment.
  • 12.56% of websites have LCP higher than 2.5 seconds when measured in real-world conditions.

To improve performance, use preloading on pages with static content, optimize the top-of-page code, reduce image file sizes, and get rid of render-blocking JavaScript and CSS.
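
For instance, preloading the hero image that often ends up being the LCP element might look like this (the file name is hypothetical):

    <head>
      <!-- Fetch the likely LCP element (the hero image) as early as possible -->
      <link rel="preload" as="image" href="/images/hero-banner.webp">
    </head>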

Cumulative Layout Shift

The Cumulative Layout Shift (CLS) measures how visually stable your web page is while it loads. In other words, it looks at how much the page’s elements shift around when some new content, like an image, loads later than other page elements. It’s also measured in real-world conditions and in a lab environment. The goal here is to maintain a CLS score of 0.1 or less.

Our research shows that:

  • 30.59% of websites exceed this value in a lab environment.
  • 10.91% of websites exceed this value in real-world conditions. 

One possible solution is to use size attributes for images and videos. They help the browser reserve space for these elements in the final layout.
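
In practice, that means declaring the rendered dimensions up front so the browser can reserve the space, for example:

    <!-- Explicit width and height let the browser reserve space before the image loads -->
    <img src="/images/team-photo.jpg" alt="Our team at the annual company meetup"
         width="1200" height="800">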

XML sitemap issues

An XML sitemap is a file that lists a website’s URLs together with additional information about each of them. It helps search engines spot every page on your website, even isolated ones or those without links from other sites.

It plays a vital role in crawling, but still, some XML-related issues occur pretty often.

  • 35.87% of websites don’t include a link to their XML sitemap in the robots.txt file. But they should, because it helps search engines figure out where the sitemap is located.
  • 14.44% of websites have no XML sitemap at all. This makes it difficult for search engines to crawl the site properly. Without a sitemap, search bots may never find and access isolated pages of the site.
  • 11.77% of websites have pages with the noindex meta tag in their XML sitemaps. This is confusing to search engines because your XML sitemap should only include URLs you want search engines to crawl and index. 

  • 3.36% of websites have HTTP URLs in their XML sitemaps. This is a two-sided problem. You can find HTTP URLs in your sitemap if you’re still using the HTTP protocol. In this case, it’s better to switch to HTTPS and replace the old URLs with new ones. Sometimes you will find HTTP URLs in your sitemap even after switching to HTTPS. That’s most likely because you haven’t updated the URLs in your sitemap.
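
For reference, a minimal sitemap entry that avoids these pitfalls (the domain and date are placeholders) might look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- Only indexable HTTPS URLs belong in the sitemap -->
        <loc>https://www.example.com/blog/technical-seo-checklist/</loc>
        <lastmod>2022-11-22</lastmod>
      </url>
    </urlset>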

Missing redirect between www and non-www

Some websites include www in their addresses, and others don’t. If your main website version is the non-www one, set up a redirect from the www version to it; if your main version uses the www prefix, redirect the other way around.

Our research shows that 13.31% of websites use both versions and don’t apply redirects. This means these websites have at least two versions of the same page, which can lead to duplicate content problems that, as you know, can negatively affect your SEO efforts and result in lower page rankings.

Other equally important technical SEO mistakes to avoid

Once you solve all the SEO errors from the previous 10 categories, your site will definitely say ‘thank you,’ and it will eventually reach the top SERP positions.

Solving the next (albeit less critical) SEO technical errors will help you really get there.

  • Blocked by noindex

The noindex directive added to a page’s <head> section prevents it from being indexed by search engines. The search bot will drop this page and won’t display it in the SERPs even if other sites link to it. While it’s okay to block some pages from indexing (like pages with filtered results on ecommerce sites, pages with personal user data, or checkout pages), it’s not okay if search engines ignore your important pages. You’ll find the tag itself in the snippet at the end of this list.

  • Blocked by robots.txt

Robots.txt provides crawling recommendations. If it blocks your page, search bots won’t be able to access and crawl it. It’s similar to how the noindex directive works, but there’s one caveat: if other pages or resources link to the page blocked by robots.txt, it can still be indexed. Make sure robots.txt blocks only unnecessary parts of your site, not important pages.

  • Canonical chain

A canonical tag tells search engines that a specific URL is the main version of a page. A canonical chain occurs when one page specifies a canonical URL, but that URL in turn points to a different page as canonical. The search engine then has a fair question: which page is the canonical one after all? Although a canonical chain isn’t the most severe technical SEO error, it’s always better to point every page directly to the final canonical URL so search engines don’t have to follow the chain (see the snippet at the end of this list).

  • Redirect chain

Hardly any site can operate without redirects, but if they aren’t optimized correctly, they can be detrimental to SEO. Redirect chains (i.e., redirects from one page to the second, third, fourth, and so on) can make your website difficult to index and can even slow it down. Remember that the longer the chain, the slower the target page will load.

  • 302, 303, 307 temporary redirects

Temporary redirects send your users and search bots to different pages because the ones they are looking for are unavailable now, but they will be soon. Such redirects won’t transfer the link juice of the old page to the new one. Given their temporary nature, it’s best to avoid using such redirects for extended periods of time.   

  • External links to 3XX

Your site may have links to other resources that redirect users to new pages (not the ones you initially linked to). If this new page is thematically related to the old one, or the transition is just as logical, there’s no problem. But if the link sends visitors to a page that doesn’t contain the necessary information, it’s worth updating or removing the link and reviewing your linking logic.

  • External links to 4XX

In this case, your site has broken external links to other resources. When following a link, the user expects to see the information they need but instead lands on a non-existent page and gets a bad user experience.

  • 3XX HTTP status code

A 3XX status code means the page redirects somewhere else, and it’s okay to have such pages. But it’s not okay if there are too many of them. At SE Ranking, we recommend keeping their share of all site pages under 10%, and ideally under 5%. If you exceed that, consider removing some of the redirects.

  • Internal links to 3XX redirect pages

This problem is similar to external links to 3XX redirect pages, but here it all happens within your site. The negative effect is the same in both cases: if the destination page isn’t relevant, you give your users a worse UX, which can be a bad signal for search engines. Your pages may also not work as intended if a redirect is misconfigured, and the site may run slower if there are too many of them.
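
For reference, here’s what the noindex directive and a direct (chain-free) canonical tag mentioned above look like; they belong on different pages, and the URL is a placeholder:

    <!-- On a page you want to keep out of the search index -->
    <meta name="robots" content="noindex">

    <!-- On a duplicate or parameterized page, pointing straight at the final canonical URL with no intermediate hops -->
    <link rel="canonical" href="https://www.example.com/products/running-shoes/">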

Summing up

Now you know all about the top technical SEO issues preventing other sites from performing at full capacity. You also have a clearer picture of what issues to look out for on your site. 

Remember that the biggest errors in SEO are always the ones you don’t know about. With this in mind, our final advice would be to audit your website regularly. This way, you’ll catch even minor health problems and prevent them from escalating into something more.

Meanwhile, tell us about the technical SEO errors you often encounter in the comments below. And if you need any fix tips for them, we’re happy to share our experiences and help in any way we can.
