5 Misconceptions That Many SEOs Still Believe

Very often on forums and blogs a novice SEO is misled, then follows those principles for the rest of his career, carrying that knowledge along and passing it on from generation to generation. That is how SEO myths are born.

I have collected the 5 most popular facts, fallacies, and recommendations (or whatever you want to call them) that most SEOs use or interpret incorrectly.

1. 404 or 410 server response

Many people use a 404 server response to remove pages from the index. The correct choice is a 410 response: it tells the crawler that the resource has been removed permanently, so unnecessary pages drop out of the index faster.
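
For illustration, here is a minimal sketch of the idea using Python's standard library; the list of removed paths and the port are placeholders I made up, not anything from the original post:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical list of pages that were deliberately deleted.
    REMOVED_PATHS = {"/old-promo.html", "/discontinued-product.html"}

    class GoneHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path in REMOVED_PATHS:
                # 410 Gone: the page is removed permanently, so search
                # engines can drop it from the index sooner than with 404.
                self.send_response(410)
            else:
                # Everything else is simply "not found" in this toy example.
                self.send_response(404)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8000), GoneHandler).serve_forever()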

2. The exact occurrence of the keyword in the title and text

This is one of the main principles held by almost all optimizers: to rank in the top, a page supposedly must contain the exact keyword in its title or text.

Let us turn to the research undertaken by NetPeak.

They took about 13,000 high-frequency and mid-frequency keywords, pulled the top 10 results for each, and checked how many pages contain an exact occurrence of the keyword in the title or the text.

The numbers turned out as follows:

60% of the pages have no exact occurrence at all

28% have an exact occurrence in the title

12% have an exact occurrence only in the text

Hence the conclusion: an exact match is not required.
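
For the curious, here is a rough sketch of how such an exact-occurrence check could look for a single page, using only Python's standard library; the URL and keyword in the usage line are made-up examples, and the parsing is deliberately simplified:

    import re
    import urllib.request
    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        """Collects the <title> and the visible text of a page."""
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""
            self.chunks = []

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data
            else:
                self.chunks.append(data)

    def exact_occurrence(url, keyword):
        # Download the page and report whether the keyword appears
        # verbatim in the title and/or in the rest of the text.
        html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
        parser = TextExtractor()
        parser.feed(html)
        pattern = re.compile(re.escape(keyword), re.IGNORECASE)
        in_title = bool(pattern.search(parser.title))
        in_text = bool(pattern.search(" ".join(parser.chunks)))
        return in_title, in_text

    # Hypothetical usage:
    # print(exact_occurrence("http://site.ru/", "buy plastic windows"))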

3. Robots.txt and blocking indexing

I ran an experiment. On a completely new domain I created a site whose robots.txt blocked indexing of the whole site («Disallow: /»). Within a month I bought a lot of traffic to it, and as a result the site got indexed in Google anyway.

Therefore, the robots.txt file is only a recommendation. If you want to keep a site or a page out of the index, it is better to use the robots meta tag (<meta name="robots" content="noindex">).

By the way.

Few people check this, but it is a common problem, especially on self-written engines: the page http://site.ru/robots.txt does not return a 200 server response.

It might seem pointless to check this if robots.txt does not really work anyway. It does work, but when behavioral or link factors exert a strong influence, search engines do not always follow it.
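
A quick way to run that check, as a sketch with Python's standard library (the domain is just a placeholder):

    import urllib.error
    import urllib.request

    def robots_status(domain):
        # Request robots.txt and report the HTTP status code; it should be 200.
        url = "http://" + domain + "/robots.txt"
        try:
            with urllib.request.urlopen(url) as response:
                return response.status
        except urllib.error.HTTPError as error:
            return error.code  # e.g. 404 on some self-written engines

    print(robots_status("site.ru"))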

4. rel="nofollow" and internal links

A long time ago you could manage page weight by closing internal links with rel="nofollow". Many people abused this, so Google changed the behavior: the weight still leaves the page, but it is not passed anywhere.

By putting rel="nofollow" on an internal link you take PageRank from the page where the link stands and pass it nowhere.

Therefore, it is better not to use rel="nofollow" on internal links.
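
As an illustration, here is a small sketch that lists the internal links on a page that carry rel="nofollow", so they can be reviewed; the URL is a placeholder and the parsing is intentionally simple:

    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse

    class NofollowFinder(HTMLParser):
        """Collects internal links that carry rel="nofollow"."""
        def __init__(self, base_url):
            super().__init__()
            self.base = urlparse(base_url)
            self.found = []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            href = attrs.get("href")
            rel = (attrs.get("rel") or "").lower()
            if not href or "nofollow" not in rel:
                return
            # Resolve relative links and keep only same-host (internal) ones.
            target = urlparse(urljoin(self.base.geturl(), href))
            if target.netloc == self.base.netloc:
                self.found.append(target.geturl())

    url = "http://site.ru/"
    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    finder = NofollowFinder(url)
    finder.feed(html)
    print(finder.found)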

5. Sitemap.xml

From early on we are taught that a sitemap speeds up the indexing of a site and that every site should have one, with all the links shoved into it. If you have a small site with a normal structure, it will index well even without sitemap.xml. And if you have a huge portal with 100,000 pages and shove them all into sitemap.xml, it will not speed up indexing either.

Therefore, the best solution is to add to sitemap.xml only the pages that have not yet been indexed.
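
A minimal sketch of building such a sitemap with Python's standard library; the list of not-yet-indexed URLs is a made-up example:

    import xml.etree.ElementTree as ET

    # Hypothetical pages that have not been indexed yet.
    not_indexed = [
        "http://site.ru/new-category/",
        "http://site.ru/new-article/",
    ]

    NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=NS)
    for page in not_indexed:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page

    # Write a standards-compliant sitemap.xml next to the script.
    ET.ElementTree(urlset).write("sitemap.xml",
                                 encoding="utf-8",
                                 xml_declaration=True)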
