Technical SEO: 16 Common Issues That Will Tank Your Rankings

Technical SEO is the process of optimizing your website for search. It helps search bots discover, understand, and index your site.

It also helps you rank for popular keywords.

Unfortunately, it’s easy to neglect certain aspects of technical SEO. When you do that, your rank will suffer.

In this article, I’ll cover 16 technical SEO pitfalls that you should avoid.

  1. Upper-Case URLs

Yes, it makes a difference if your URL is upper-case or lower-case.

Why is it a problem if you’re using upper-case URLs? Because some web servers won’t redirect visitors to the correct page when they request an upper-case URL. In other words, the server won’t convert the upper-case characters to lower-case.

So visitors get a 404 error instead.

That’s not what you’re looking for. Make sure all of your URLs are in lower-case. That’s the industry standard these days and it will probably stay that way for quite some time.
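If you run your own server, you can normalize case at the edge. Here’s a minimal sketch of a lowercase-redirect middleware, assuming a Node/Express app (the setup is illustrative, not a drop-in fix for every stack):

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// 301-redirect any path containing upper-case characters to its
// lower-case equivalent, preserving the query string.
app.use((req: Request, res: Response, next: NextFunction) => {
  if (/[A-Z]/.test(req.path)) {
    const query = req.url.slice(req.path.length); // "?foo=bar" or ""
    res.redirect(301, req.path.toLowerCase() + query);
  } else {
    next();
  }
});

app.listen(3000);
```

The 301 (permanent) redirect matters here: it tells search engines to transfer ranking signals to the lower-case URL instead of treating the two versions as separate pages.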

  2. Multiple Versions of the Homepage

You should have one (1) homepage and that’s it.

If you do have multiple homepages, redirect them to a “master” homepage. That will fix the problem in a jiffy.
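Here’s a sketch of that consolidation, again assuming an Express app (the alias list below is illustrative; redirect whatever duplicate homepage URLs your server actually answers to):

```typescript
import express from "express";

const app = express();

// Common homepage variants, all 301-redirected to the one master URL.
const homepageAliases = ["/index.html", "/index.php", "/home", "/default.aspx"];

app.get(homepageAliases, (req, res) => {
  res.redirect(301, "/");
});

app.listen(3000);
```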

Also, if you have one homepage for mobile visitors and another for desktop visitors, your website is about six years behind the tech curve.

Instead of going that route, opt for a responsive homepage. That means the layout of the page adapts to the screen size no matter how big or small.

You’ll need to get in touch with a development team to make that happen, though.

  3. Using Query Parameters

There are times when query parameters are helpful. That’s usually the case when you’re running an ecommerce site that offers filtering capabilities.

However, if you’re just delivering content to visitors, query parameters could spell trouble.

Why? Because they could use up your crawl budget.

Oftentimes, a base URL with different query parameters will send folks to the same web page. That means search bots are crawling the same content under several different URLs.

If you use up your crawl budget before your entire site gets crawled, then some of your pages might not get indexed.
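One common mitigation, if your platform supports it, is a canonical tag on the parameterized pages that points search engines back at the base URL (the example.com URL below is a placeholder):

```html
<!-- Served on /shoes?color=red&sort=price and similar variants -->
<link rel="canonical" href="https://example.com/shoes" />
```

That way, however many parameter combinations the bots stumble into, the ranking signals consolidate on one page.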

  4. Soft 404 Errors

Beware of “soft” 404 errors.

A soft 404 error occurs when a page that doesn’t exist returns a 200 (OK) status code instead of a 404. Often that’s because the visitor gets redirected to an error page that the server happily serves with a 200 code, as if everything were fine.

The problem with that scenario is that the error page can get indexed. That’s not what you want, as the page probably has a big “Whoopsie!” message or verbiage to that effect.

Bottom line: make sure your web app is returning the 404 status code when the user hits a bad URL.
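In an Express app, for instance, that might be a catch-all handler placed after all your real routes (a minimal sketch, not production-grade error handling):

```typescript
import express from "express";

const app = express();

// ...your real routes go here...

// Catch-all: show a friendly error page, but send the 404 status
// with it so crawlers don't index the error page as real content.
app.use((req, res) => {
  res.status(404).send("<h1>Whoopsie! Page not found.</h1>");
});

app.listen(3000);
```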

  5. Outdated Sitemap

If you’re using a quality content management system (CMS) with a sitemap plugin, you probably don’t need to worry about this issue.

However, if you’ve still got an “old school” website that you maintain manually, then you’ll need to make sure that you keep your sitemap up to date.

Fail to do that and the search bots might not discover all your content.

So if it’s been a while since you last validated your sitemap, make it a point to do so today.
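For reference, a sitemap entry is just a few lines of XML per the sitemaps.org protocol (the URL and date here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/some-page</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```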

  6. Robots.txt File Error

What do you know about robots.txt?

If you’re new to SEO, it’s the file that gives search bots instructions about crawling your site. You can use it to tell search engines which parts of your site they may crawl and which they should stay out of. (Strictly speaking, it controls crawling rather than indexing; a page blocked in robots.txt can still end up in the index if other sites link to it.)

But you really need to make sure the file is configured correctly. Otherwise, search bots might skip content you want crawled, or a single stray rule could lock them out of your entire site.

Run the contents of your file through a robots.txt testing tool just to be sure everything is working correctly.
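For context, a small, sane robots.txt might look like this (the paths and sitemap URL are placeholders):

```
# Let all bots crawl everything except the admin area.
User-agent: *
Disallow: /admin/

# Point bots at the sitemap while you're at it.
Sitemap: https://example.com/sitemap.xml
```

Be especially careful with “Disallow: /” on its own; that one line tells bots to stay away from your entire site.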

  7. Failure to Use HTTPS

In this day and age especially, folks are concerned about security. That’s just one reason why your website should use the secure HTTPS protocol instead of HTTP.

Believe it or not, though, using HTTPS also gives your site a boost in search rank. According to Google, it’s a modest boost, but that’s still a boost.

Fortunately, it’s inexpensive to switch to HTTPS. An SSL certificate will only cost you about $5-$10 per year, and you can spend even less if you buy a long-term certificate or use a free certificate authority like Let’s Encrypt.
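Once the certificate is installed, make sure plain-HTTP traffic actually gets redirected. Here’s a rough sketch for an Express app sitting behind a proxy or load balancer (your hosting setup may handle this for you):

```typescript
import express from "express";

const app = express();

// Behind a proxy, trust X-Forwarded-Proto so req.secure is accurate.
app.set("trust proxy", true);

// Permanently redirect any plain-HTTP request to its HTTPS twin.
app.use((req, res, next) => {
  if (req.secure) {
    next();
  } else {
    res.redirect(301, `https://${req.headers.host}${req.url}`);
  }
});

app.listen(3000);
```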

  8. Slow Load Time

Google is all about giving your visitors a happy experience. One way you do that is by serving web pages quickly.

If your pages take too long to load, you’ll not only frustrate visitors, you’ll also likely take a hit in rank. Contact a development team to speed up your load times.
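While a full performance audit is developer territory, some wins are cheap. In an Express app, for example, response compression and long-lived caching for static assets are each one line (the "compression" package is a separate npm install):

```typescript
import express from "express";
import compression from "compression"; // npm install compression

const app = express();

// Gzip-compress responses and cache static files for 30 days:
// two quick wins that often shave noticeable time off page loads.
app.use(compression());
app.use(express.static("public", { maxAge: "30d" }));

app.listen(3000);
```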

  9. Poor Mobile Experience

In case you missed the news, Googlebot now crawls your website with a smartphone user agent by default. Google calls this mobile-first indexing.

What does that mean? It means that Google will “see” your site as though it were browsing it with a smartphone or tablet.

So your site had better look great on those platforms.

If it doesn’t, expect your rank to suffer.

The solution: adopt a responsive design. You’ll have to outsource that task to a qualified development team.
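If you want to sanity-check the basics yourself first, responsive design starts with a viewport meta tag and media queries (the class names below are placeholders):

```html
<!-- In <head>: make the layout track the device width. -->
<meta name="viewport" content="width=device-width, initial-scale=1" />

<style>
  /* Stack the sidebar under the main content on narrow screens. */
  @media (max-width: 600px) {
    .sidebar { width: 100%; float: none; }
  }
</style>
```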

  10. Black Hat SEO

Black hat SEO is cheating. It’s a way to manipulate the search engines to get a good rank in a hurry.

If Google finds out that you’re doing it, your site might get penalized. When that happens, you lose visibility.

Since losing visibility is the exact opposite of what you want, it’s best to play by the rules. Optimize your website by providing content that people find valuable. Then, watch your rank grow organically.

  11. Using the Noindex Directive

Did you know you can block search indexing? In some cases, people do it by mistake.

The “noindex” directive in HTML will prevent Google from indexing the page. That means it won’t show up in search results.

Sometimes, a “noindex” directive is added on purpose: people don’t want a page indexed while it’s still under development.

The problem is that web developers then copy and paste the code from that page to a new page and forget to eliminate the “noindex” directive. As a result, the new page never gets indexed.
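The directive itself is a single meta tag, which is exactly why it’s so easy to copy along unnoticed:

```html
<!-- Keeps this page out of search results. Remove it before launch,
     or copy-pasted templates will carry it onto new pages. -->
<meta name="robots" content="noindex" />
```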

  12. Continued Use of “Rel Next” and “Rel Prev”

You might have some content on your website that’s so long it requires multiple pages. If that’s the case, you might be tempted to use the “rel next” and “rel prev” directives.

You don’t need to do that, because Google announced in 2019 that it no longer supports those directives.

Go through your website and check for the existence of old “rel next” and “rel prev” directives. Then, remove them.
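For reference, this is what you’re hunting for in your page templates (the URLs are placeholders):

```html
<!-- Deprecated pagination hints; Google no longer uses these. -->
<link rel="prev" href="https://example.com/article?page=1" />
<link rel="next" href="https://example.com/article?page=3" />
```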

  13. No HTML Base

Are you using a state-of-the-art web technology like Angular or React to deliver your site? If so, then you’re facing a challenge.

That’s because those frameworks render content with JavaScript in the browser, so the initial HTML is little more than an empty shell. Visitors who turn off JavaScript won’t see your website, and search bots can struggle with it too.

And that’s definitely not going to help you reach those customers.

If you’re in the early stages of choosing a web technology, pick something with an HTML base.

If you’re already using a JavaScript framework to deliver your site, use dynamic rendering to make it easy for search bots to find and parse your content.
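Here’s a rough sketch of the dynamic rendering idea in Express. The bot list is abbreviated, and renderWithHeadlessBrowser is a hypothetical stand-in for a real pre-renderer such as Puppeteer or Rendertron:

```typescript
import express from "express";

const app = express();

// Abbreviated crawler check; real setups use a fuller UA list.
const BOT_UA = /googlebot|bingbot|baiduspider|duckduckbot/i;

app.get("*", async (req, res) => {
  const ua = req.headers["user-agent"] ?? "";
  if (BOT_UA.test(ua)) {
    // Bots get finished HTML instead of an empty JavaScript shell.
    const html = await renderWithHeadlessBrowser(req.url);
    res.send(html);
  } else {
    // Humans get the normal client-side app.
    res.sendFile("index.html", { root: "dist" });
  }
});

// Hypothetical helper: wire this to Puppeteer, Rendertron, or a
// prerendering service in a real deployment.
async function renderWithHeadlessBrowser(url: string): Promise<string> {
  return `<html><!-- pre-rendered markup for ${url} --></html>`;
}

app.listen(3000);
```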

  14. Development Website in the Index

This is the opposite of the “noindex” problem I described earlier. Sometimes web developers unintentionally leave a page under development online for indexing.

That means people will see your “in progress” work and it won’t always be pretty.

Make sure you use the “noindex” directive for pages you don’t want people to find in organic search results.
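One sturdy approach for staging sites is an X-Robots-Tag header on every response, gated by an environment check (the variable name here is a placeholder):

```typescript
import express from "express";

const app = express();

// On staging, tell bots not to index anything on the entire site.
// Production skips this block, so live pages are unaffected.
if (process.env.APP_ENV === "staging") {
  app.use((req, res, next) => {
    res.set("X-Robots-Tag", "noindex, nofollow");
    next();
  });
}

app.listen(3000);
```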

  15. Multiple Live Versions of a Page

Some webmasters leave multiple versions of the same page online.

In some cases, the web server will deliver one version of the page when the URL has a trailing forward slash and another when there’s no slash.

Similarly, sometimes a URL that ends with .php will deliver something different than a URL that ends with .html even though everything else in the URL is identical.

That’s going to confuse your visitors as well as the search bots. Pick one canonical version of each URL and 301-redirect the alternatives to it.
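Here’s a sketch of trailing-slash normalization in Express (this standardizes on the no-slash form; the slashed form works just as well, as long as you pick one):

```typescript
import express from "express";

const app = express();

// Redirect "/about/" to "/about" so each page lives at exactly one URL.
app.use((req, res, next) => {
  if (req.path.length > 1 && req.path.endsWith("/")) {
    const query = req.url.slice(req.path.length);
    res.redirect(301, req.path.slice(0, -1) + query);
  } else {
    next();
  }
});

app.listen(3000);
```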

  16. Duplicate Content

You certainly won’t win a lot of friends and influence people if you’re branded a plagiarist. You also won’t rank well in the SERPs.

That’s because Google frowns on duplicate content.

Run your content through Copyscape. Ensure that it’s at least 70% unique. If it isn’t, get to work on some rewrites.

Also, resist the temptation to copy the product descriptions on your ecommerce site from other sources. While that’s not technically plagiarism, it’s still duplicate content.

Invest the time (or money) in rewriting those product descriptions. Use that opportunity to produce more marketable product details so you reel in customers that weren’t impressed with descriptions on competitor sites.

Wrapping It Up

It’s easy to overlook many of the technical aspects of SEO. When that happens, you limit your reach.

Go over the pitfalls I’ve described above. If any of them apply to your website, put a plan in place to take corrective action.

Author: John Lincoln