Stop Believing SEO Myths and Learn the Truth

In the SEO business nothing can be taken for granted. Every year new techniques and methods are introduced, new tools become available and several algorithmic updates take place. This uncertain, fast-changing environment gives birth to several myths and misconceptions. From time to time a few of those myths get confirmed (such as the effect of Social Media on SEO), while most of them get debunked. In this article we will discuss some of the most popular SEO myths and we will explain why they are nothing more than misconceptions.

1. Keep a High Keyword Density

Chasing a high keyword density will not improve your rankings. What you should do instead is use different combinations of the main keywords in the text. This increases the odds of ranking for similar terms and combinations without hurting the quality of the text. Note that this technique will increase the keyword density of the important terms in a natural way; nevertheless its primary goal is not to raise the density but to incorporate in the text the most common keyword combinations that users are likely to search for.
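
To make this concrete, here is a minimal sketch of how the density of a phrase can be measured; the sample text, the phrases and the `keyword_density` helper are purely illustrative and not part of the original article. The takeaway is that covering natural keyword combinations matters more than hitting any particular percentage.

```python
import re

def keyword_density(text, phrase):
    """Rough keyword density: share of words belonging to occurrences of the phrase."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Slide a window over the word list and count exact phrase matches.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

page_text = (
    "Our coffee grinder guide compares burr coffee grinders and blade "
    "coffee grinders, so you can pick the best coffee grinder for espresso."
)
for phrase in ("coffee grinder", "coffee grinders", "burr coffee grinders"):
    print(f"{phrase!r}: {keyword_density(page_text, phrase):.1f}%")
```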

2. PageRank Means Everything

For years several SEO professionals considered PageRank the most important factor affecting the search results. In many cases they confused the real PageRank values with the ones shown on the toolbar, and they focused primarily on increasing it in order to improve their rankings. Nevertheless, as we mentioned in a previous article, PageRank is not the only signal that Google uses. It is just one of the metrics, and in some types of search it carries very little weight (news search, local search, real-time search etc.).

3. PageRank Has Become Irrelevant

Over the last couple of years, more and more SEOs have started to question whether PageRank affects SEO at all, mainly because it does not appear to be highly correlated with high rankings. Of course, as we discussed in the article “Is Google PageRank still important in Search Engine Optimization?”, PageRank is a signal: it is a metric that measures the quality/authority of a page and it affects indexing. PageRank should be neither worshipped nor ignored.

4. You Must Submit Every Page to Google & Bing

Submitting every page of your website to Google and Bing by using their submission forms will neither speed up indexing nor improve your rankings. If you want to reduce the indexing time, add links from high-traffic/authority pages, use XML and HTML sitemaps and improve your internal link structure. Submitting all your pages one by one will neither help nor hurt your rankings.
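
As an illustration of the sitemap route, the following sketch generates a minimal XML sitemap with Python's standard library; the example.com URLs and dates are placeholders, not part of the original article.

```python
import xml.etree.ElementTree as ET

# Hypothetical pages; replace with your own site's URLs and last-modified dates.
pages = [
    ("https://www.example.com/", "2024-01-15"),
    ("https://www.example.com/services/", "2024-01-10"),
    ("https://www.example.com/blog/seo-myths/", "2024-01-12"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

# Writes sitemap.xml, which can then be referenced from robots.txt
# or submitted through the search engines' webmaster tools.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```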

5. Meta Keywords Help Your Search Engine Rankings

The keywords meta tag was important for the first META search engines, which did not have the computing power to analyze and store the entire page. Since then, search engines have evolved and they are able to extract the important keywords of a page without using the meta keywords tag. Another reason why search engines stopped using this tag is that many people were stuffing it with irrelevant terms. Google has made it clear many times in the past that it does not use meta keywords at all, so this tag will not help you improve your rankings.

6. Duplicate Content Will Get Your Website Banned by Google

Several people suggest that having a lot of duplicate content on a website can lead to bans. Fortunately this is not true. Duplicate content can cause serious problems: it can affect the number of pages that get indexed, the PageRank distribution within the website and, consequently, the rankings; nevertheless Google will not ban your website for it. You can find more in our previous article “Duplicate Content: the effects on Search Engine Rankings”.

7. Nofollow Links Improve Your PageRank Distribution

In the past, by using the rel=nofollow attribute, we could manipulate the PageRank distribution of our website and perform PageRank sculpting. Nevertheless an algorithmic update by Google changed the way rel=nofollow operates, and now the PageRank that would have flowed through nofollowed links simply evaporates instead of being redistributed to the followed ones. Thus, as we discussed in the article “The PageRank sculpting techniques and the nofollow issue”, the rel=nofollow attribute leads to the evaporation of link juice. If you want to retain control over your PageRank and avoid this evaporation, you can use the PageRank sculpting technique that we have proposed in the past.
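
The following back-of-the-envelope sketch, using purely illustrative numbers, shows why the change matters: the share assigned to nofollowed links is no longer handed back to the followed ones.

```python
# Illustrative numbers only: a page with 10 outgoing links, 4 of them nofollowed,
# and 1.0 "unit" of PageRank available to pass on.
pagerank_to_pass = 1.0
total_links = 10
nofollowed = 4
followed = total_links - nofollowed

# Old behaviour: nofollowed links were ignored when dividing PageRank,
# so each followed link received a larger share.
old_per_followed_link = pagerank_to_pass / followed          # 1.0 / 6  ≈ 0.167

# Current behaviour: PageRank is divided across ALL links, and the share
# assigned to nofollowed links evaporates instead of being redistributed.
new_per_followed_link = pagerank_to_pass / total_links       # 1.0 / 10 = 0.100
evaporated = nofollowed * (pagerank_to_pass / total_links)   # 0.4 is lost

print(old_per_followed_link, new_per_followed_link, evaporated)
```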

8. All Website Links Have the Same Weighting Effect

In the original PageRank formula published by Page and Brin, all the links on a webpage carried the same amount of weight. Nevertheless this has changed over the years, and all the major search engines now take into account not only the position of a link on the page but also its relevancy and other characteristics that affect CTR (font size, color etc.). As a result, footer links do not carry as much weight as links that appear at the top of the page or inside the main text.
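
For reference, the formula from the original paper splits a page's PageRank equally among its outgoing links, which is precisely the assumption modern link weighting has moved away from. Here d is the damping factor (commonly set to 0.85), T_1…T_n are the pages linking to A, and C(T_i) is the number of outgoing links on T_i:

```latex
PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)
```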

9. HTML Validation Helps With SEO

Lots of webmasters used to think that validating their HTML code improves their SEO campaigns. Fortunately or unfortunately this is not true. HTML validation does not affect the search engine rankings and it is not used as a signal. Nevertheless, if your HTML code is so broken that parts of the page do not appear in the browser, then search engines might have problems extracting your text. Thus keep in mind that producing valid HTML code is good practice, but in general minor mistakes will not hurt your SEO.

10. Using Nofollow Links Doesn’t Help Anything

Google typically says that all links marked with nofollow are dropped from its link graph and thus carry no weight. Nevertheless, not all of those links are irrelevant for SEO. For example, Twitter and Facebook links are nofollowed; nevertheless, as we discussed in the article “Twitter & Facebook links affect SEO on Google and Bing”, Google and Bing use that data as a signal. So it makes sense to say that not all nofollowed links are irrelevant for SEO and that the major search engines might in some cases take them into account during their analysis.

11. You Should Always Link Every Page to Every Other Page

Some people suggest that by linking every page to every other page you can improve indexing or rankings, and in order to achieve this they use too many secondary menus or footer links. Nevertheless, by doing so you dramatically increase the number of outgoing links per page and you do not pass enough PageRank to the important pages of your site. Typically websites should use a tree-like structure that enables them to focus on the most important pages. More information on this topic can be found in the article “Link Structure: Analyzing the most important methods”.
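
A rough sketch with made-up numbers shows the dilution: the more links every page carries, the smaller the share of PageRank each one can pass to the pages you actually care about.

```python
# Simplified model (illustrative numbers only): the PageRank a page can pass on
# is split across all of its outgoing links.
pagerank_to_pass = 1.0

focused_structure = 25   # lean menu plus a handful of in-content links
link_everything = 250    # "every page links to every other page" via huge menus/footers

print(pagerank_to_pass / focused_structure)  # 0.04 passed per link
print(pagerank_to_pass / link_everything)    # 0.004 passed per link, ten times less
```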

12. The Robots.txt File Can Help Solve Duplicate Content Issues

The robots.txt file can be used to prevent search engines from crawling particular pages or sections of a website. As a result, some SEOs have tried to use it as a way to reduce the amount of duplicate content on their websites. Nevertheless, by blocking these pages you only prevent Google from crawling them; you do not improve the link structure that causes the problem in the first place. Since the problem remains unsolved, the negative effects on the rankings continue to exist.
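
For illustration, Python's standard library can show what a robots.txt rule actually does: the check below only tells you whether a URL is blocked from crawling, and it does nothing about the internal links that create the duplicates. The robots.txt URL and paths are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt; a real site would use its own URL here.
rp = RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Even if this returns False (the duplicate is blocked from crawling),
# the link structure that produced the duplicate page is still in place.
print(rp.can_fetch("Googlebot", "https://www.example.com/print/some-article"))
```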

13. Having Low Quality Sites Linking Back to Your Website Can Hurt Your Rankings

Several SEOs have stated in the past that low quality links coming from link farms can actually hurt the SEO campaign of a website. If this were true, people would be able to harm their competitors' websites simply by pointing low quality links at them. Fortunately, Google will not ban a website for receiving low quality links. Nevertheless, in extremely rare cases Google has taken measures against websites that tried systematically to manipulate the search engine results by artificially increasing the number of their backlinks.

14. Links of Any Quality Can Help Improve Your Rankings

Major search engines use several methods to detect paid or low quality links and they exclude them from their link graphs. The recent Panda update (also known as the Farmer update) made it even clearer that acquiring links from low quality websites or link farms that contain a lot of duplicate or scraped content will not help you achieve high rankings.

15. The Page Title and Description Appear in Snippets

Several webmasters believe that the titles and META descriptions they use on their pages are always the ones that will appear in the snippet on the search engine results. This is not always true, since search engines can replace the snippet title and description with something more relevant to the user's query. In other cases, search engines can even use text that does not exist on the landing page; usually this text has been retrieved from external sources such as the DMOZ directory or the anchor text of incoming links.

16. Pages Blocked Within Your Robots.txt File Will Not Appear In Search Engine Rankings

Another common mistake that many SEOs make is using robots.txt to ensure that a particular page will not appear in the SERPs. Nevertheless, this page can still appear in the search results if it is linked to by other pages. As we discussed in the article “The robots.txt, META-robots & rel=nofollow and their impact on SEO”, the proper way to ensure that a page will not appear in the search results is to use the “noindex” meta-robots directive.
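
As a quick illustration of the directive in question (the sample HTML and the `RobotsMetaParser` helper are hypothetical, written for this example), the page you want kept out of the results should carry a robots meta tag containing noindex and must remain crawlable so that the directive can be read:

```python
from html.parser import HTMLParser

# The page we do NOT want in the search results must be crawlable (not blocked
# in robots.txt) and must carry a robots meta tag with "noindex".
page_html = """
<html>
  <head>
    <title>Internal report</title>
    <meta name="robots" content="noindex, follow">
  </head>
  <body>...</body>
</html>
"""

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.noindex = "noindex" in attrs.get("content", "").lower()

parser = RobotsMetaParser()
parser.feed(page_html)
print("noindex directive found:", parser.noindex)
```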

17. All Your SEO Efforts Should Be Geared For Google Only

Google might still be the market leader in search, nevertheless we should not forget that Bing and Yahoo hold more than 30% of the total market. Search Engine Optimization techniques do not focus on optimizing a website only for Google; they aim at increasing organic traffic from all the search engines and at developing websites that are attractive both to users and to search engines. Note that some methods might work better for Google, nevertheless a solid SEO campaign should be effective for all the major search engines.

18. SEO Requires a Really Long Wait Before You See Big Results

SEO is neither a process that delivers results overnight nor a one-time activity. Achieving good results requires effort and time. Nevertheless, positive results can become visible relatively fast. Of course a new website will not immediately achieve good rankings on highly competitive terms; nevertheless it should be able to rank for more targeted, long-tail keywords.

19. SEO is Unethical and Should Be Considered SPAM by Search Engines Like Google and Bing

SEO is an online marketing technique/process that can help websites increase their organic traffic, their exposure and their sales. To achieve this, SEO professionals focus not only on the technical characteristics of a website but also on its content, its design and external factors. SEO is a marketing tool just like advertising; if you consider SEO unethical, you should feel the same about advertising in general.

20. SEO is DEAD and You Should Stop Caring About It

Every year a major update takes place in the search engine business and several bloggers or journalists suggest that SEO is dead. Nevertheless, as you can see, SEO is alive and kicking and it is constantly evolving along with the search engines. Certainly the techniques have changed a lot: new tools and methods become available while others are no longer used. SEO is a relatively new form of marketing and it will exist for as long as search engines exist.
