10 MOST COMMON SEO ISSUES AND HOW TO SOLVE THEM

A website grows as you publish content and do touch-ups here and there. At its launch, you might have done everything perfectly to have a website with the potential to rank very high. Along the way, some issues will begin encroaching.

They might not be noticeable, or you might ignore them, thinking they have no major impact on your website. The truth is they do, and if they are not fixed soon, your rankings can slide downhill very fast.

Getting a page to rank highly takes a lot of work and a lot of time, but losing those rankings is quicker than a Formula 1 race. You don't want to find yourself stagnating or plummeting after all that work. Besides, other SEOs are constantly working on their websites, so why shouldn't you?

PRO TIP: CHECK YOUR WEBSITE FOR SEO ISSUES PERIODICALLY.

So, what are these SEO issues that we’re talking about? They are both technical and non-technical issues.

Here is a list of the 10 most common issues affecting SEO:

  • Page speed
  • Website architecture
  • Duplicate pages
  • Broken links (internal and external)
  • Mobile-friendliness
  • Redirects
  • Website optimization
  • Robots.txt
  • Sitemaps
  • Crawling

1. PAGE SPEED

You have been on Google searching for an article or page and then, bam! From the snippet, you think you have found the perfect one. You click on it and, to your surprise, the page just keeps loading and loading. How does that make you feel? You probably close it, or open a new tab and leave it to load. You might go back to the page, or forget about it completely.

Slow loading speed also eats into your crawl budget: fewer pages will be crawled than before. This means Google might not be able to display some of your pages because it has no idea they exist, or it won't notice changes to the pages it never crawled. A good page should load in about 3 seconds or less, and on e-commerce sites most conversions happen on pages that load in 0-2 seconds. That's right, no one has time to wait for your page to load. If it's too slow, it's on to the next result.

There are several online tools (free and paid) that can test your website and report its page speed. Some of these tools are SolarWinds Pingdom, Google's PageSpeed Insights, Google's Test My Site (for mobile sites), WebPageTest, Moz, and Google's Lighthouse, among many others.

Some of these tools will go a step further and state the specific issues that are affecting your website’s speed.
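If you prefer to script a quick check, PageSpeed Insights also exposes a public API. Below is a rough Python sketch of how you might query it; the endpoint and response fields follow the v5 API as best we know it, so treat them as assumptions to verify against Google's documentation, and the example.com URL is just a placeholder.

# Rough sketch: query the PageSpeed Insights v5 API for a performance score.
# Assumes the `requests` package is installed and that the endpoint and
# response fields below are still current -- verify against the API docs.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_score(url: str, strategy: str = "mobile") -> float:
    """Return the Lighthouse performance score (0-100) for a URL."""
    response = requests.get(
        PSI_ENDPOINT,
        params={"url": url, "strategy": strategy},
        timeout=60,
    )
    response.raise_for_status()
    data = response.json()
    # The score is reported on a 0-1 scale; multiply to get a percentage.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

if __name__ == "__main__":
    print(pagespeed_score("https://example.com/"))  # placeholder URL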

The most common issues that affect website load time are redirects, large files (images, videos, and attachments), CMS plugins, bandwidth, choice of web host, extra page elements (fonts, sliders, etc.), and the sheer number of pages. The list is not exhaustive, but these are the most common culprits.

SOLUTION

The longer you use your website, the more it grows. At first it may have been very fast, but what happens as you keep uploading files? The website becomes heavier. This SEO issue builds up very quickly when left unchecked, so compress files before you upload them to your pages.
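As an illustration, here is a minimal Python sketch that compresses an image before upload using the Pillow library; the file names are placeholders, and the width and quality settings are just starting points to experiment with.

# Minimal sketch: shrink an image before uploading it, using Pillow.
# `photo.jpg` and the target width/quality are placeholder values.
from PIL import Image

MAX_WIDTH = 1200  # resize anything wider than this

def compress_image(src: str, dest: str, quality: int = 80) -> None:
    img = Image.open(src)
    if img.width > MAX_WIDTH:
        ratio = MAX_WIDTH / img.width
        img = img.resize((MAX_WIDTH, int(img.height * ratio)))
    # `optimize=True` lets Pillow spend extra time finding a smaller encoding.
    img.save(dest, quality=quality, optimize=True)

compress_image("photo.jpg", "photo-compressed.jpg")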

A flashy website is very appealing to the eye, but every custom element you add makes your website heavier. Keep the elements minimal while still having a beautiful design.

When it comes to plugins, delete all the unnecessary ones. Each one may only save you a fraction of a second, which does not seem like much but adds up in the overall speed.

When choosing a website host, you have to know how dependable they are. We always say that cheaper is expensive. Look for a host within your budget and do a background check to know if they are reliable.

Bandwidth is very important. It dictates how many people can visit your site without running into issues. As a website grows, increase your bandwidth whenever necessary. You might start with 100 visitors a month and, down the line, have 10,000 visitors a month. Monitor your traffic and decide when the right time is to bump up your bandwidth.

Redirects mean your users have to wait while one page forwards them to another. Minimize the number of redirects on your website for a better user experience.

2. WEBSITE ARCHITECTURE

Website architecture is your website's layout and how your pages are linked. It should make it easy for users to find information as they move from one page to another, and it also helps with crawling. Breadcrumbs can help, so that your users know where they are at a glance.

URLs are a big SEO issue here. Have you seen URLs that look like this: https://example.com/shoes/product/019474d89s9d? This is a very poor way to present your URLs. Users might think the link leads to something unrelated to what they are looking for, and that can cost you conversions, especially on e-commerce websites.

Poor website architecture also means poorly linked pages will not be crawled optimally. Google aims to give its audience a good user experience, and when it detects a poorly designed website, it will not rank you highly. You might have the most amazing content, but if your audience has to struggle through your site to find it, it's worthless.

SOLUTION

Ensure you have a good hierarchical structure. It should be well linked so that crawlers can find all your pages and users can navigate easily.

Have a good menu displayed on your pages for easier navigation. Does a user want to go back home? It is right there on the menu, from whichever page of your website they are on.

Have a good consistent design. Do not keep changing the design of your website page after page. It looks shoddy and the inconsistency might make the users leave. Design from page to page should be familiar.

URLs should be simple and user-friendly, and carry a bit of information. For example, instead of having a link like https://example.com/shoes/product/019474d89s9d, name the link in words users can understand, such as https://example.com/shoes/product/nike-slides. This way a user can tell that the link will take them to a page with slides from Nike.
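If you generate URLs programmatically, a small helper can turn product names into clean slugs. Here is a short Python sketch; the lowercase-and-hyphens convention follows common practice, and the inputs are just examples.

# Sketch: turn a human-readable name into a clean, descriptive URL slug.
import re

def slugify(name: str) -> str:
    slug = name.strip().lower()
    # Replace anything that is not a letter or digit with a hyphen,
    # then collapse repeated hyphens and trim them from the ends.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

print(slugify("Nike Slides"))           # -> nike-slides
print(slugify("Summer Sale 2024!!"))    # -> summer-sale-2024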

Website architecture is a technical SEO issue; have professionals handle whatever is too technical for you.

3. DUPLICATE PAGES

Duplicate content means pages that are identical at their core or share a very large percentage of their content. Duplicate pages confuse Google about which page should be displayed. This is a very old technical SEO issue that continues to bother SEOs.

Duplicate content may come from serving the same page on different protocols (HTTP and HTTPS). Dynamic websites might create several pages or URLs that contain the same content, and international websites with versions in different languages can end up with duplicate content too.

SOLUTION

Use the rel="canonical" link attribute to tell Google which version is the preferred page, for example <link rel="canonical" href="https://example.com/shoes/product/nike-slides" /> placed in the page's <head> section.

You can also use the robots.txt file to tell Google which links to crawl and which to skip. These directives can help you stop crawlers from accessing duplicate pages.
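To spot pages that are missing a canonical tag, you could run a quick script over a list of URLs. The sketch below uses the requests library and a simple pattern match, which is only a rough check (a real HTML parser is more reliable), and the URLs are placeholders.

# Rough sketch: flag pages that do not declare a rel="canonical" link.
# Pattern matching on HTML is only an approximation; a parser such as
# BeautifulSoup would be more robust. The URLs below are placeholders.
import re
import requests

PAGES = [
    "https://example.com/",
    "https://example.com/shoes/product/nike-slides",
]

CANONICAL_RE = re.compile(r'<link[^>]+rel=["\']canonical["\']', re.IGNORECASE)

for url in PAGES:
    html = requests.get(url, timeout=10).text
    if CANONICAL_RE.search(html):
        print(f"OK      {url}")
    else:
        print(f"MISSING {url}")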

4. BROKEN LINKS

The more pages you have, the higher the probability of having broken links. This is because you will link to both internal and external pages. You have no control whatsoever over the external pages you link to; their owners may take down or redirect a page you linked, and you might do the same for pages on your own website.

The biggest SEO issue here is that you will not easily notice broken links unless you audit your website periodically. Websites with many pages are even harder to keep track of and to find out what is going on with their links.

Poorly redirected pages also cause major SEO problems, since the crawler will report a dead end when it finds them. You should know which redirect to use in which situation, so that crawlers and users going through your pages can understand what is going on. Users who find numerous broken links on your website may lose trust, and the experience will be poor. Your crawl budget will also be heavily affected by these broken links.

SOLUTION

Luckily, fixing this SEO issue is not complicated. Use website auditing tools to list the broken links for you. Large websites cannot be audited manually, and why do it manually for smaller websites when there are free tools that can audit your site?
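For a small site, even a short script can surface broken links. Below is a minimal Python sketch that checks a list of URLs and reports anything returning a 4xx or 5xx status; the URLs are placeholders, and a full audit tool will still catch far more.

# Minimal sketch: report links that respond with an error status (4xx/5xx).
# The list of links is a placeholder -- in practice you would collect the
# URLs from your own pages or a crawl export.
import requests

LINKS = [
    "https://example.com/about",
    "https://example.com/old-page",
    "https://www.wikipedia.org/",
]

for link in LINKS:
    try:
        resp = requests.head(link, allow_redirects=True, timeout=10)
        # Some servers reject HEAD requests, so fall back to GET.
        if resp.status_code >= 400:
            resp = requests.get(link, timeout=10)
        status = resp.status_code
    except requests.RequestException as exc:
        print(f"ERROR  {link} ({exc})")
        continue
    label = "BROKEN" if status >= 400 else "OK"
    print(f"{label:6} {link} ({status})")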

Remove all the broken links that are highlighted. If they point to your own pages, use the right method to redirect them.

When redirecting pages, you have to know the aim. If it is a temporary issue, use a temporary (302) redirect. If you are completely done with the page, use a permanent (301) redirect to relevant material or your home page. Do not use one in place of the other, lest you hurt your rankings; it will not happen immediately, but it will happen over time.

5. MOBILE-FRIENDLINESS

Mobile-friendliness is a crucial ranking criterion. Google moved to mobile-first indexing a few years ago, which shows how important mobile compatibility is for your website. Roughly 54% of global website traffic comes from mobile devices. Most of us are on our phones, and when we want to find something online, we go straight to a search engine. A big part of improving your SEO is having a conveniently designed website.

You will lose a lot of traffic if users discover your page but have to struggle to see, click, and scroll through your website, and it goes without saying that the spiders will notice too. Some site owners prefer to have separate web pages for their mobile users, but that is a compromise: you lose out on link juice you could have consolidated, and users might feel a bit insecure being automatically redirected to another page.

SOLUTION

Technical SEO issues like this may require you to engage your developer. The best way to handle it is with a responsive design: pages that automatically adjust to the screen size (typically by using the viewport meta tag and CSS media queries). This way, you do not need separate pages for desktop, tablet, and mobile users.

Ensure the elements on your screen can be tapped comfortably. They should not be too small or too big, but just the right size for a user to tap without hitting other elements. Pictures and videos should fit the screen size too.

6. REDIRECTS

As noted above, a big website with many pages is hard to monitor. These SEO issues will not have a direct impact on the feel of your website until you land on one of those broken pages. Redirects let us tell users (and crawlers) what is going on, whether the move is temporary or permanent, and take them to the relevant, functional pages.

The problem is that the more redirects your pages go through, the slower they are to load. This wastes your crawl budget and hurts the overall user experience.

SOLUTION

Don't get me wrong: use redirects whenever necessary. They are lifesavers when you cannot permanently delete a page or when a page is under maintenance. However, keep redirects to a minimum and avoid chaining several redirects in a row. This lets crawlers crawl more pages.
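One quick way to see how many hops a URL goes through is to inspect the redirect history of a request. Here is a short Python sketch using the requests library; the URL is a placeholder.

# Sketch: count how many redirect hops a URL goes through before it settles.
# `requests` records each intermediate response in `response.history`.
import requests

def redirect_chain(url: str) -> None:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    for hop in resp.history:
        print(f"{hop.status_code}  {hop.url}")
    print(f"{resp.status_code}  {resp.url}  <- final destination")
    print(f"Total redirect hops: {len(resp.history)}")

redirect_chain("http://example.com/")  # placeholder URL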

7. WEBSITE OPTIMIZATION

In this article, website optimization means the files you upload to your website: blog posts, images, videos, and so on.

The naming of files you intend to upload is very important. For example, if you want to attach a file that contains documents about shoes, name it something like "shoes" or "shoes-for-sale". Descriptive names increase your chance of ranking for search terms similar to what you want to be found for.

When uploading images and videos, do the same: name them accordingly. This applies to links as well.

8. ROBOTS.TXT

This is one of the most important files you can have on your website. The robots.txt file gives crawlers directives on which pages they can access and which they cannot. You might find a page listed in the file that you fully intend to show to the public but have accidentally blocked from crawlers. At the same time, there are pages we do not want displayed, such as sensitive documents or pages that should only appear after a user carries out a certain action.

If you have a problem generating one, seek professional help to avoid giving unwanted directives. The robots.txt file should be placed in the root directory of your website.

SOLUTION

Always review your robots.txt file. You can find it by typing your home page URL and adding '/robots.txt' at the end (for example, https://example.com/robots.txt); Google's own file at https://www.google.com/robots.txt is a good real-world example to study.

Pages that add no value to searchers, such as internal search results or cart pages, should be disallowed, and so should sensitive pages (for example, a line such as Disallow: /admin/ blocks crawlers from your admin area).
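You can also confirm, in code, that your important pages are crawlable and your private ones are blocked. The sketch below uses Python's standard-library robots.txt parser; the domain and paths are placeholders.

# Sketch: check which URLs a site's robots.txt allows crawlers to fetch.
# Uses only the Python standard library; the domain and paths are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # downloads and parses the live robots.txt file

for path in ["/", "/shoes/product/nike-slides", "/admin/", "/cart/"]:
    url = "https://example.com" + path
    allowed = rp.can_fetch("*", url)  # "*" means any crawler
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}  {url}")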

9. SITEMAPS

Sitemaps are like treasure maps to the pages you want found. Just like the robots.txt file, they should be placed in the root directory. Big websites might have several sitemaps grouped inside a sitemap index. The wrong format or the wrong placement can cause SEO issues for your website. If you are not conversant with sitemaps, use professional services to generate one; there are many free sitemap generators, but you usually have to do some editing before uploading the result.
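As a rough illustration of the expected format, here is a Python sketch that writes a minimal XML sitemap using only the standard library; the URLs and dates are placeholders, and larger sites would normally rely on their CMS or a dedicated generator.

# Sketch: write a minimal XML sitemap (sitemap.xml) with the standard library.
# The page URLs and last-modified dates below are placeholders.
import xml.etree.ElementTree as ET

PAGES = [
    ("https://example.com/", "2024-01-15"),
    ("https://example.com/shoes/product/nike-slides", "2024-01-10"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print("Wrote sitemap.xml")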

10. CRAWLING

In our previous article, we covered crawling in depth, including the issues that crawlers face and how to sort them out. Kindly take a look here: What is Website Crawling?

Some of these SEO issues are very technical and need a knowledgeable hand to fix. Look for professional help to fix your website; you can reach us at info@savannahdatasolutionslimited.com.