5 Technical Errors That Affect Google Rankings

This article covers five technical aspects that affect a website's ranking in Google. When these five problem areas are brought into good shape, the website stands a better chance of ranking higher in the organic SERP (search engine results page) and thereby outperforming its competitors.

1.     Website load speed

Website load speed is one of the major factors in Google rankings today.

No one likes clicking a search result only to find that the page loads slowly; the natural urge is to close it without waiting for it to finish loading.

Load speed can be grouped roughly as follows:

— good speed — faster than 1 second;

— average speed — about 2 seconds;

— slow speed — more than 3 seconds.
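The thresholds above can be sketched as a small script. The fetch below is a rough proxy only: it times a single HTML download and ignores images, scripts, and browser rendering, which the services mentioned next do measure.

```python
import time
import urllib.request

def measure_load_time(url: str) -> float:
    """Time a single HTML fetch. Only a rough proxy for page load speed:
    it ignores images, scripts and browser rendering."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

def classify_load_speed(seconds: float) -> str:
    """Bucket a measured load time using the thresholds above."""
    if seconds < 1.0:
        return "good"
    if seconds <= 3.0:
        return "average"
    return "slow"
```

Usage would look like `classify_load_speed(measure_load_time("http://example.com/"))` (the URL is a placeholder).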

The load speed may be checked with the service http://tools.pingdom.com/

There is also a dedicated service from Google:

https://developers.google.com/speed/pagespeed/insights/

After you enter the website address into the check field and press the button, you are shown information such as which scripts affect loading, which images should be optimized, and other ways to increase load speed.

One more point that slightly affects ranking is server location. If you want to promote a website in the US region, its server should be located in that country. The server should also have a minimal response time and be consistently available.

Conclusion: slow page loading negatively affects a website's search promotion.

2.     Mobile friendliness

On April 21, 2015, Google started to favor websites that display properly on mobile devices.

It is important for two reasons:

- better conversion;

- better ranking.

A visitor who comes to your website from mobile search results should be able to find the information they need and accomplish their goal.

For example:

- to read an article;

- to order a service/product;

- to subscribe to a newsletter;

- just to find out the phone number or to call right away.

That’s why being mobile friendly became a very important factor in website promotion in 2015.

How do you check whether a website displays properly on all types of devices?

Services for checking:

http://responsive.is/

https://www.google.com/webmasters/tools/mobile-friendly/

If the website displays poorly on mobile devices, it should be fixed so that it renders correctly at all major screen sizes.
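A first, crude automated check is whether a page declares a viewport meta tag at all, since responsive layouts depend on it. A minimal sketch using only the Python standard library (the real services above run far deeper checks):

```python
from html.parser import HTMLParser

class _ViewportParser(HTMLParser):
    """Flags a <meta name="viewport" ...> tag, the minimal prerequisite
    for a responsive, mobile-friendly page."""
    def __init__(self):
        super().__init__()
        self.found = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta" and (dict(attrs).get("name") or "").lower() == "viewport":
            self.found = True

def has_viewport_meta(html: str) -> bool:
    parser = _ViewportParser()
    parser.feed(html)
    return parser.found
```

A page that passes this check is not necessarily mobile friendly, but a page that fails it almost certainly is not.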

Conclusion: as the share of visits from mobile devices keeps growing (according to various services, it has already reached 60% in the United States), a website must display properly on mobile devices to stay ahead of its competitors.

3.     Duplicate and empty Meta tags

You can find out whether the website contains these errors by using Netpeak Spider (http://netpeak.us/software/netpeak-spider/) or the appropriate section of Google Search Console.

The “Title” tag is one of the most important ranking factors, which is why each page should have a unique title. If several pages share the same title, search engines cannot decide which page to rank in the search results, and as a consequence none of those pages ranks properly.

If the “Title” tag is empty, a search engine will not be able to rank the page at all.

The “Meta description” tag also plays a significant role. Each page's description should be unique: the meta description is shown in the search results and tells searchers what the page is about. The more attractive it is, the more visitors from search the website will get.

If the “Meta description” tag is empty, the search engine will pull fragments of text from the page on its own to form the description, and the auto-generated text may not read well.
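Both checks (duplicates and empty values) are easy to automate once titles have been collected. A minimal sketch, assuming a crawler has already produced a URL-to-title mapping:

```python
def audit_titles(pages):
    """pages: mapping of URL -> <title> text collected by a crawler.
    Returns (urls_with_empty_titles, groups_of_urls_sharing_a_title)."""
    by_title = {}
    empty = []
    for url, title in pages.items():
        text = (title or "").strip()
        if not text:
            empty.append(url)
        else:
            by_title.setdefault(text, []).append(url)
    duplicates = [urls for urls in by_title.values() if len(urls) > 1]
    return empty, duplicates
```

The same function works for meta descriptions: pass a URL-to-description mapping instead.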

4.     404, 503 and 500 errors; 301 redirects

You can find out whether the website returns error codes such as 404, 500 and 503 using Netpeak Spider and Search Console. All these errors should be fixed, as they strongly affect the website's indexing and its ranking in search results.

A “404 error” means the page does not exist or has been deleted. Such a page will be dropped from the index by the search engine robot. A large number of these errors scatters the website's link weight, which hurts its overall search rankings.

“404 errors” should be fixed with a “301 redirect”: from the deleted page (to which links from other pages or even other websites still point), set up a permanent redirect to a new page. The link weight is then passed to the new page, and the search robot no longer stumbles over a “404 error”.
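On an Apache server, such a permanent redirect is typically declared in the “.htaccess” file. A sketch (both paths are hypothetical placeholders):

```apache
# Permanently (301) redirect the deleted page to its replacement,
# so link weight is passed on and crawlers stop hitting the 404.
Redirect 301 /old-page.html /new-page.html
```

Other servers have equivalents; on nginx, for example, the same mapping would be a `return 301` rule in the site configuration.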

A “500 error” (Internal Server Error) often occurs when there are mistakes in the “.htaccess” file.

A “503 error” occurs when there are issues with the server hosting the website: for example, the page was temporarily unavailable at the very moment the search robot crawled it (according to Search Console). Such errors also happen under heavy server load, when the website cannot be opened at all. If the error persists for a long time, the page may drop out of the search engine index.

To solve these issues, contact the hosting company's support. Better still, choose a good hosting provider with minimal ping and high uptime from the start.

5.     Robots.txt, sitemap.xml

The “robots.txt” file provides important information to the search robots that crawl the web. Before going through the pages of your website, search engine robots check this file.

That’s why it is important to write the website's indexing rules in this file. Keep in mind, however, that these rules are only recommendations for robots, and there is no guarantee that disallowed pages will not be indexed.

Still, this file should be configured, as it is one of the foundations of on-site optimization. It is important to configure it properly, so that the pages and sections that should be indexed are open to search engines, while unwanted ones are closed.

To check the correctness of the “robots.txt” file, you can use the checking tool in Google Search Console.
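Such rules can also be tested offline with Python's standard library, which ships a robots.txt parser. A minimal sketch with a hypothetical rule set (the section names and URLs are placeholders):

```python
import urllib.robotparser

# A hypothetical robots.txt: close the /admin/ section, keep the rest open.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Allow: /",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "http://example.com/blog/post"))    # open section
print(rp.can_fetch("*", "http://example.com/admin/login"))  # disallowed
```

Running the rules through a parser like this before deployment catches typos that would otherwise silently open or close the wrong sections.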

The presence of a “sitemap.xml” file has a positive impact on the website's indexing (provided the sitemap is updated regularly as new pages are added) and accelerates indexing when the website has a huge number of pages. The “sitemap.xml” file should be added to Search Console, which in turn will check the map for errors and show the number of pages indexed.
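For reference, a minimal “sitemap.xml” following the sitemaps.org protocol looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2015-06-01</lastmod>
  </url>
</urlset>
```

Each page gets its own `<url>` entry; `<lastmod>` is optional but helps robots prioritize recently changed pages.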

Conclusions: all these technical aspects should be brought into good shape for the website to prevail over its competitors in the SERP, as competition grows year after year in every sphere.

Keeping the technical side in good shape makes it much easier to reach the top. It is also much more comfortable for users when the website loads fast, is easy to read on a phone, and no errors appear when following links within the website.

Yurii, senior SEO expert at Visual Craft