
Don’t Make These SEO Mistakes to Rank Better

November 12, 2017

What were the most common SEO issues in 2017? What were the most serious mistakes? SEMrush collected anonymous data on 100,000 sites and 450 million web pages with the SEMrush Audit tool to determine the most frequent SEO errors. Below you will find the most common SEO mistakes, grouped into three categories: crawlability and site architecture, on-page SEO, and technical SEO. Let's dive in!

1. CRAWLABILITY AND SITE ARCHITECTURE 

In simple words, crawlability is the ability of search engines to crawl and index your website; there is no point optimizing anything on your site if the search engines cannot see it. For a site to appear on a search engine like Google, it must first be crawled and indexed. That is why the crawlability and indexability of a site are the key elements that can wipe out all your other SEO efforts.

To improve navigation and understanding for both users and crawl bots, you need to build a well-organized site architecture. A site that is well structured for SEO is also a site that is pleasant for users. To achieve this, streamline your site's structure and make sure the most important content is reachable within four clicks of the home page.

ROBOTS.TXT

Robots.txt can prevent Google from crawling and indexing the entire site or specific pages. Although having a robots.txt file is not essential for the well-being of a site, it can help increase crawling and indexing speed. But beware of these SEO mistakes! Otherwise, Google may ignore the strategic pages of your site or index insignificant ones.

The most common problem:

  • Format errors in robots.txt – 3.69% of analyzed sites have this problem. Use robots.txt to exclude from crawling temporary pages, private pages that are visible only to certain users or administrators, and pages without valuable content.

If you are interested in learning more about robots files, take a look at the Google manual on robots.txt. And if you want to validate an existing file, you can use the robots.txt test tool.
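As a rough illustration of the format, here is a minimal robots.txt sketch that blocks crawlers from a hypothetical private area and from temporary pages while leaving the rest of the site crawlable (the paths and sitemap URL are placeholders, not taken from the article):

    # Minimal robots.txt sketch - the paths below are illustrative placeholders
    User-agent: *
    Disallow: /admin/    # private area visible only to administrators
    Disallow: /tmp/      # temporary pages with no value for search
    Allow: /

    # Pointing bots to the sitemap is optional but helps discovery
    Sitemap: https://www.example.com/sitemap.xml

A format error as small as a misspelled directive or a stray character can change what actually gets blocked, which is why validating the file with the test tool is worth the minute it takes.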

LINKS AND REDIRECTION

Too many links on a page arouse distrust in both users and crawlers, who will not follow all of the links anyway. Keep in mind that misused nofollow attributes can be harmful, especially when used on internal links.

The most common problems:

  • 4XX errors (80.17%) and 5XX errors (10.01%). Links are necessary to guide users through your site and to pass ranking value from one page to another, but broken links and 4xx/5xx status codes hurt the user experience and your SEO efforts;
  • Broken internal links – 33.29%. Scrupulously check your links, replace or delete those that are inoperative, and in the case of server errors, contact your host;
  • Broken external links – 28.89%. If you have broken external links, contact the owners of the linked sites;
  • WWW domain misconfigured – 16.98%. Remember that search bots can treat a site with WWW and without WWW as two separate domains. You should therefore set up 301 redirects to the preferred version and declare it in Google Search Console (a redirect sketch follows this list);
  • Redirect chains and loops – 5.49%. Redirect chains and loops confuse crawlers and frustrate users by increasing load times.
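As a sketch of the WWW redirect mentioned above, here is one way to send the non-WWW hostname to the WWW version with a single 301 redirect, assuming an Apache server with mod_rewrite enabled (the domain is a placeholder):

    # .htaccess sketch (assumes Apache with mod_rewrite)
    # Redirect example.com to www.example.com with a single 301
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

The same rule can be written the other way around if the non-WWW hostname is your preferred version; the point is to pick one version and redirect the other to it in a single hop, avoiding chains.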

SITEMAP 

Submitting a sitemap to Google Search Console is a great way to help bots navigate your site faster and pick up new or changed content. Almost all sites contain utility pages that do not belong in the search index; a sitemap allows you to highlight the landing pages that you want to rank in the SERP. A sitemap does not guarantee that the listed pages will be indexed, or that unlisted pages will be ignored, but it makes the indexing process easier.

The most common problems:

  • Format errors in sitemap.xml – 13.19%. You can create an XML sitemap manually, or build it with your CMS or a sitemap generator tool (see the minimal sitemap sketch after this list). Search engines only accept sitemaps that are under 50 MB and contain fewer than 50,000 URLs, so if you have a large website, you will need to create additional sitemaps. You can learn more about how to handle multiple sitemaps in this guide;
  • Bad pages in sitemap.xml – 10.92%. Obviously, there should be no broken pages, redirects, or typos in the URLs listed in sitemap.xml. You should also avoid listing orphan pages that have no internal links pointing to them. If you have identical pages, keep only the canonical version in the sitemap.
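For reference, a well-formed sitemap.xml follows a simple structure; the sketch below lists two hypothetical URLs and is only meant to show the expected format:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one <url> entry per canonical, indexable page -->
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2017-11-12</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/blog/seo-mistakes</loc>
        <lastmod>2017-11-12</lastmod>
      </url>
    </urlset>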

2. ON-PAGE SEO

On-page SEO is about improving the ranking of specific pages by optimizing their content and HTML code. Even if it is tedious, you have to shape every ingredient of a page to attract more relevant traffic. Combining textual and visual content with solid work behind the scenes will generate user satisfaction and search engine recognition.

CONTENT

We know that good SEO implies good content. Rehashed content, or worse, copied content, is rarely of interest to users and can have a significant negative influence on your rankings. You should therefore check your site for identical or nearly identical pages, then delete them or replace them with unique pages. We recommend that pages have at least 85% unique content.

The most common problem:

  • Duplicate content – 65.88%. To avoid cannibalization, mark the secondary pages with a rel="canonical" tag pointing to the main page (a minimal example follows).
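As a minimal sketch of that canonical markup, assuming a duplicate page and a placeholder URL, the tag goes in the head of the secondary page and points to the page you want to rank:

    <!-- placed in the <head> of the duplicate/secondary page -->
    <link rel="canonical" href="https://www.example.com/main-page/">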

TITLE TAG

The importance of the title tag is obvious: it has a huge impact on the first impression users get of your page in the search results.

The most important problems: 

  • Missing or empty title tags – 10.53%. For each page, create attractive, and above all unique, titles to guide users and crawlers;
  • Duplicate title tags – 53.23%. Duplicate titles can confuse users who will not know which page to follow.

META DESCRIPTIONS

If the title tag is the cover by which your page is judged in the search results, the meta description is the back cover that sells the click. Writing a hard-hitting, clear page summary is quite an art, but keep in mind that copy-and-paste meta descriptions are worse than none at all.

The most recurring problem: 

  • Duplicate meta descriptions – 53.99%. They can prevent crawlers from determining the relevance and priority of a page. You can use SEOmofo to preview your titles, descriptions, and URLs as a snippet on Google's SERP.
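To make both tags concrete, here is a minimal, hypothetical head excerpt showing a unique title and a unique meta description for a single page (the wording is purely illustrative):

    <head>
      <!-- unique, descriptive title: the headline shown in the SERP -->
      <title>Don't Make These SEO Mistakes to Rank Better | Example Blog</title>
      <!-- unique meta description: the short summary that sells the click -->
      <meta name="description" content="The most common crawlability, on-page and technical SEO mistakes found by SEMrush, and how to fix them.">
    </head>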

IMAGERY

Image search is not new, and while a good position in image SERPs can attract a large share of your target audience, site owners continue to neglect it.

The most important problem: 

  • Broken internal (8.66%) and external (5.25%) images. Google may decide that your page is badly coded and maintained if it contains broken images. You should periodically inspect your site and repair or remove problematic items.

3. TECHNICAL SEO

Managing technical SEO issues such as slow loading or improper mobile optimization is essential for the positioning of a website. Poor page performance can undo all the good SEO work you have done elsewhere, which is why it counts among the SEO mistakes to avoid.

PAGE SPEED

Page speed is an important ranking factor; it depends on both the server and the performance of the page itself, and it has an obvious influence on the bounce rate. So you need to optimize your HTML, reduce scripts and styles, and keep the weight of the page to a minimum.

The most important problem:

  • Overloaded HTML – 0.98%. One way to optimize HTML is to use compression such as gzip or deflate (a configuration sketch follows). Compressing HTML, CSS and JavaScript can have an excellent effect on loading speed, but there are some drawbacks: the setup is complicated and there are occasional problems with some browsers.
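As a hedged example of that compression, assuming an Apache server with mod_deflate available, a configuration along these lines compresses HTML, CSS and JavaScript responses before they are sent to the browser:

    # Apache sketch (assumes mod_deflate): compress text-based assets
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

Other servers expose the same idea through their own directives; whichever you use, test the result, since a misconfigured compression filter is exactly the kind of browser problem the statistic above hints at.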

MOBILE

We all know that the share of mobile traffic keeps growing. The number of smartphone and tablet users surpassed the number of desktop users several years ago. This means that having a mobile-friendly site is absolutely necessary.

  • AMP pages have no canonical tag – 0.08%. If you have an AMP version of a page, make sure it carries a canonical tag pointing to the non-AMP version, and that the non-AMP version references the AMP page in return. This avoids duplicate content problems. If you only have an AMP page, add a self-referencing canonical tag (see the sketch after this list);
  • Missing viewport tag – 0.66%. If a page does not have a meta viewport tag, mobile browsers cannot render an optimized version of the page; they will display the desktop version, with an inconsistent font size and images in an inappropriate format.
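To make both points concrete, here is a minimal sketch of the tags involved, using placeholder URLs: the viewport tag for responsive rendering, the canonical tag an AMP page should carry, and the amphtml reference on the regular page:

    <!-- on every responsive page: let mobile browsers scale the layout -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <!-- on the AMP version of a page: point back to the regular page -->
    <link rel="canonical" href="https://www.example.com/article/">

    <!-- on the regular page: advertise the AMP version -->
    <link rel="amphtml" href="https://www.example.com/article/amp/">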