Don’t waste your time on silly mistakes – some of them may cost you big money. LinksManagement conducted an analytical study of the most common mistakes made by our clients. Our analysts calculated that this SEO Mistakes report can save you $11,700.
LinksManagement is providing it to you for free. Get prepared and run your SEO campaign without a hitch.
What You Get:
- The most complete list of the most common and costly SEO mistakes
- You can save $11,700 of your budget by avoiding these mistakes in your SEO campaign
- You will sharpen your SEO skills and be able to run your own SEO campaign by applying other webmasters’ experience
- Your website will reach high rankings on Google much faster, as you won’t waste your precious time and money on making mistakes
What Aspects This Report Covers:
Below you can find just a part of the SEO mistakes listed in the report.
SEO mistakes as the main obstacle to effective site optimization
SEO is an important promotional tool for any online business. If your site is not present in Google, you will have trouble finding new customers interested in your services. About half of the surveyed marketers admitted that more than 25% of their customers find their sites through search engines. But even if all of the basic SEO principles are followed and the most common SEO mistakes are excluded, your site still may not be indexed well (especially if you promote it for high-frequency queries). The main cause is minor bugs, which become major in a highly competitive environment.
Common SEO mistakes when creating content
- Duplication / content theft. Content is duplicated when two or more pages intentionally or accidentally contain the same information. For search “spiders”, each unique URL they can find is a separate site page, even if different addresses refer to the same document. Search engine robots usually find new addresses by following the links on pages they already know; links can be both internal (within the site) and external, i.e. from another resource. Webmasters often create different URLs that lead to the same document. Usually this is not intentional, but the problem of content duplication occurs in any case. Duplicated content is especially widespread on large, dynamic websites, but small sites often face it as well.
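As an illustration (example.com is a placeholder domain), all of the following addresses may serve the same document, and a search spider counts each one as a separate page:

```text
http://example.com/article
http://example.com/article/
http://example.com/article?sessionid=123
http://www.example.com/article
http://example.com/index.php?page=article
```

Trailing slashes, session parameters, the www/non-www split, and legacy script paths are the usual sources of such accidental duplicates.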
The most popular website platform is WordPress. Unfortunately, if you do not fine-tune WordPress and just use the default settings, you will definitely end up with some amount of duplicated content. You will run into problems with headings, tags, archives, and some other pages.
A site owner knows how many articles and offer pages are on the site. The owner of an online store knows how many products are in the assortment, how many categories were created, and how many informative articles were posted. The number of indexed pages should be approximately equal to the sum of these values.
- Worse page indexing. To load site information into its database, a crawler spends resources – computing power and electricity. Search engine (SE) owners try to save resources, not spend them uselessly. Therefore, if the SE determines that the same information is located on many pages, it can stop scanning and indexing the site. At best, SE spiders will stop re-scanning the site’s pages as often as the webmaster needs; at worst, no new pages will be indexed, even if the information on them is completely unique.
- Increased likelihood of penalties from the SE. Sanctions (filters) from an SE can lower a site’s position by 30–100 places and cut off the traffic coming from Google. Duplicate content increases the likelihood of sanctions. Many webmasters mistakenly believe that the more pages a site has, the better. They try to get thousands of duplicated pages indexed, or other pages that are useless for users – for example, pages with the results of internal site search for thousands of different queries. This practice is especially dangerous and invites harsh sanctions.
Full and partial duplicates
Full duplicates contain 100% identical content and differ only by URL. Partial duplicates occur when the content of the pages differs in some way – for example, a different small piece of text, different pictures, or swapped text sections. Even if the content is used legally (with the owner’s permission), a ranking “cooling” can happen. Here are some ways to solve the problem:
- Removal of unnecessary pages. This is the easiest and bluntest way. After removal, the server will return a 404 error in response to requests for these pages. Dealing with duplicate content this way is advisable only if the page has no traffic and (more importantly) no good links from external sites.
- 301 redirect. Solving the duplicated-content problem with a 301 means redirecting users from the duplicated page to the original. It is assumed that a 301 redirect passes 90–99% of the link weight to the new page. On an Apache server, a 301 redirect is set up by editing the “.htaccess” file. Before you start, make a copy of the file and save it on your local machine or in remote “cloud” storage: when editing “.htaccess”, there is a small risk of breaking the site. Do not use a 302 redirect, because it passes 0% of the link weight.
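For an Apache server, a minimal “.htaccess” sketch might look like this (the paths and domain are hypothetical, and mod_alias/mod_rewrite must be enabled):

```apacheconf
# Redirect a single duplicated page to its original with a permanent (301) redirect
Redirect 301 /old-duplicate.html http://www.example.com/original.html

# Redirect the non-www host to the www version so only one host gets indexed
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```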
- “Robots” meta tag. It allows you to control the behavior of SE robots at the level of a single page. With it, you can disable the indexing of a page and/or the crawling of its links, through which the “spider” could proceed to other pages.
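A sketch of the tag, placed inside the page’s head section:

```html
<!-- Keep this page out of the index, but let the robot follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Block both indexing and link crawling -->
<meta name="robots" content="noindex, nofollow">
```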
- “rel=canonical” directive. Another way to regulate robots’ behavior on your site. “rel=canonical” is inserted in the page header; with it, the webmaster defines the canonical (“original”) version of the page. Experiments show that this directive passes link juice largely or completely.
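A sketch of the directive (the URL is a placeholder); the same tag goes into the head of every duplicate of the page:

```html
<link rel="canonical" href="http://www.example.com/article">
```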
- Blocking URL parameters in Google Webmaster Tools. This blocks the indexing of pages whose URLs contain the specified parameters. The feature is located under Crawl -> URL Parameters.
- Internal linking. The best way to solve the problem is not to create it. Review site architecture and internal linking.
Up-to-date content should be effective not only in terms of classical SEO promotion but also in terms of behavioral factors. If a webmaster publishes a unique text with an optimal keyword distribution, it will definitely attract traffic from SEs. However, if such texts do not meet readers’ needs, the resource will soon generate adverse behavioral signals that will drop its position.
- Excessive optimization. This often stems from choosing the wrong website performance metrics. An over-optimized site is not really relevant; it only simulates relevance. Any excess reduces conversion, as headers become unreadable. Over-optimization is often equated with spamming (oversaturating content with keywords). Search engines track these signals and try to lower the positions of such sites. Google Panda primarily filters over-optimized texts, and only then copy-paste and low-quality content.
- Lack of metadata. Many people understand the importance of metadata but do not know how to work with it. Piling up keywords and doing the bare minimum to make the resource searchable is not exactly what you need. Your content strategy (as well as your tags and other elements) must focus on the target audience and on searcher behavior.
- Lack of adaptation to mobile traffic. Mobile SEO means optimizing the site for mobile search. More than 50% of all internet users now use smartphones, tablets, and other devices for web surfing, and Google has already begun to mark mobile-friendly sites. Such sites use one of three technologies:
- Adaptive design. It allows the site to use the same HTML code no matter what device it is displayed on; you just use the “viewport” meta tag. Such a design displays optimally on any device, regardless of screen size, and allows the same page address to be used on different devices.
- Dynamic content. Another method of handling mobile traffic, where the server sends different HTML and CSS to different gadgets. Use the HTTP header “Vary: User-Agent”.
- Different addresses (for the standard and mobile versions of the site). This option implies that visitors who enter your site from a smartphone or tablet are automatically redirected to the mobile version.
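Sketches of the markup behind the first and third options (example.com and m.example.com are placeholders); dynamic content is instead configured on the server by sending the “Vary: User-Agent” response header:

```html
<!-- Adaptive design: one URL, one HTML document, scaled via the viewport meta tag -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- Separate mobile URLs: on the desktop page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/page">
<!-- ...and on the corresponding mobile page -->
<link rel="canonical" href="http://www.example.com/page">
```

The alternate/canonical pair tells search engines that the two URLs are versions of the same page rather than duplicates.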
- Low loading speed. In 2012, Google confirmed that website loading speed affects ranking. Optimization mostly involves speeding up image loading, caching, and compressing graphics and pages on the server.
- Focusing only on high-frequency keywords.
- Lack of a user-friendly error page.
- Excessive attention to technical SEO. It is better to create and publish a new post than to endlessly optimize the code, install unneeded plug-ins, and try to manipulate the search output.
- Building the site on Flash without an HTML version.
- Lack of optimization for local search. This primarily concerns the sites of organizations that work offline.
- Use of too “heavy” images.
- Lack of updates on the site. Work on the site does not end with clicking the “Post” button. Creating quality content, revising the design, and SEO are just some of the essential steps in building a professional online project. Search engines like sites with active, constantly updated content, which you can maintain through a thematic blog, social network integration, or simply by updating the main page.
Linkbuilding: top 10 SEO mistakes
- Link exchanges.
- Automatic or manual site registration in directories. Search engines consider the sudden emergence of dozens of backlinks a sign of manipulation.
- Manual/automatic posting on forums. This linkbuilding method is hopelessly outdated.
- Commenting on dofollow blogs. If you leave a few comments on different platforms, nothing will happen – but the credibility of your site will not grow either. And if you leave hundreds of spam comments, your website can get sanctions.
- Links in the site’s footer. If you are a web developer that provides internet marketing services, your partners will often link to you, and search engines regard such links as normal. But if you sell spare parts for cars, or write about finance, a large number of footer links looks suspicious. If footer links are not closed with the “nofollow” attribute and have keyword-rich anchors, you can get sanctions.
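A sketch of a footer credit link closed from passing link weight (the URL and studio name are made up):

```html
<footer>
  <!-- rel="nofollow" tells search engines not to pass weight through this link -->
  <a href="http://example-webstudio.com" rel="nofollow">Site by Example Web Studio</a>
</footer>
```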
- Guest posting.
- Lack of relevant outbound links. Add links, open for indexing, to authoritative sources. Natural linkbuilding is a two-way street.
- Abuse of links with optimized anchors.
- The use of unsuitable URLs.
- Broken links. Periodically check the site for broken links using Free Link Checker or a similar tool.
Top SEO mistakes regarding indexation
- Improper filling of “robots.txt”. If you do not want to close any pages from indexing, simply do not fill this file.
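A minimal “robots.txt” sketch (the paths are hypothetical) that closes service sections while leaving the rest of the site open:

```text
User-agent: *
Disallow: /admin/
Disallow: /search/

Sitemap: http://www.example.com/sitemap.xml
```

A common mistake is an accidental `Disallow: /`, which blocks the entire site.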
- Blocking site indexing in the CMS administrative console. Sometimes resource owners temporarily prohibit indexing and then forget to lift the ban.
- Absence of a site map. Without it, SE spiders can index your site incorrectly.
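A minimal XML sitemap sketch following the sitemaps.org protocol (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/article</loc>
  </url>
</urlset>
```

The file is usually placed in the site root and referenced from “robots.txt” or submitted via Google Webmaster Tools.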
- Improper use of “noindex” tag.
- Lack of references to a new site or page. If you’ve just created a website, make sure that several other resources link to it.
- Absence or incorrect use of redirects.
- Improper processing of error messages. If the user enters a non-existent address, the site should return a 404 error page that helps the visitor find their way.
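On Apache, a custom error page can be wired up in “.htaccess” (the file name is hypothetical); the server still returns the 404 status code while serving a friendly page:

```apacheconf
# Show /404.html for missing addresses without masking the 404 status
ErrorDocument 404 /404.html
```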
- The use of multiple H1 headers on the page.
- Ignoring SEO when changing the design. A site update may sometimes lead to the loss of important data.
Worst SEO mistakes: ignoring user’s interests
- Excessive amounts of advertising blocks, banners, or links. Do not publish more than three ad blocks per page.
- Masking advertising as site content. You can suffer twice: first, the text advertising system will withhold your pay, and second, search engines will drop the site’s position.
- Publication of invisible content.
- Involvement of non-targeted traffic.
- Lack of functionality and usability.
- The pursuit for TIC and PageRank.
- Incorrect code. The analytics service code is installed between the “<head>” and “</head>” tags. Make sure you add the current version of the code.
- Incorrect evaluation of bounce rate. Google Analytics counts a bounce when a user views only one page. Bounce rate is an important web metric: it reflects how well a particular page matches the needs of the audience, expressed as the percentage of users who left the site after viewing a single page. It would seem that if bounces are close to 100%, the page does not meet visitors’ expectations and needs. However, this is not always true: a user can read a manual for 15–20 minutes, get the necessary information, and close the browser. To smooth this contradiction, search engines and web analytics tools introduced the adjusted bounce rate, which appeared in Google Analytics in mid-2012. After adding an event to the analytics tracking code, you can set an arbitrary bounce threshold, for example 15 or 30 seconds; the service then does not count a visit as a bounce if the user stays on the page for 15 or 30 seconds, respectively. Sometimes, after the adjustment is implemented, the site’s bounce rate drops from 80% or more to 20% or less, which allows the owner to evaluate the resource’s effectiveness more accurately.
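A sketch of the adjusted-bounce-rate trick for the classic ga.js tracking code (the 30-second threshold and the event names are arbitrary examples). Placed below the standard tracking snippet, it fires an event after a delay, so a visitor who stays past the threshold is no longer counted as a bounce:

```javascript
// Added below the standard ga.js snippet; relies on the _gaq queue it defines
setTimeout(function () {
  _gaq.push(['_trackEvent', 'NoBounce', 'Stayed 30 seconds or more']);
}, 30000);
```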
- Focusing on metrics rather than on users’ interests. GA reports should be analyzed in the context of the site’s marketing effectiveness. The measurements allow you to report visits, conversions, etc.
- Wrong selection of promotion efficiency indicators. If you’ve just created a site, it’s enough to follow the basic indicators of marketing campaign effectiveness:
- Number of unique visitors (users who visited your site within a certain time period; repeat visits are not counted).
- Page views. This indicator shows the average number of pages viewed by a visitor in a single session. In most niches, the number of views should exceed the number of unique visitors (this can be considered an indicator of high-quality content: the audience is interested in your materials and views more pages per session).
- Search engine traffic.
- Bounce rate.
- Conversion rate. It shows the percentage of visitors who performed the actions the website encourages (for example, subscribing to a newsletter). Conversion rate depends on the industry in which your business operates and on the goals you are tracking. On average, the figure is 2–3%, but it may reach 10–20% or even more.
- Incoming natural links. The number and quality of backlinks affect the site’s level of trust, and search engines have now learned to distinguish natural links from unnatural ones more accurately. We recommend monitoring the number of natural links: growing incoming links show that your marketing campaign is quite effective.
Wrong expectations when working with social networks
Use social networks (Facebook, Twitter, Google Plus). After all, your site’s reputation is affected not only by SEO but also by your social activity. If you have an active page on each of these social networks, they can become an additional source of traffic. Social networks mark such links as “nofollow”, but they still appear in search.
Common SEO mistakes at website design
- Introductory pages. A bright introductory page can decorate the site or become a barrier between the SE and the site.
- Too much Flash. Flash is poorly indexed, though there is no doubt it looks interesting. This does not mean you need to remove all animation from the site – just do not replace important information and navigation elements with Flash objects.
- Frames. Ironically, some websites are still built on frames. This technology is completely obsolete, and frames interfere with SEO: as a result, SEs do not always find important information on the site. Furthermore, frame-based sites use three HTML files instead of one, which leads to conflicts in indexing.
- Images instead of important site elements. The simplest solution is to place text over an image by means of CSS.
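A minimal sketch of overlaying real, indexable text on an image with CSS (the class names and file name are made up):

```html
<div class="banner">
  <img src="sale-banner.jpg" alt="Spring sale">
  <h2 class="banner-text">Spring Sale: 20% Off Everything</h2>
</div>

<style>
  .banner { position: relative; }
  .banner-text {
    position: absolute;      /* placed over the image, not baked into it */
    top: 40%; left: 0; right: 0;
    text-align: center;
    color: #fff;
  }
</style>
```

The heading stays readable to search robots, while visitors see it on top of the banner.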
- Lack of backup navigation. A smart navigation matrix significantly improves site ranking. Text links are read perfectly by robots and reflect the hierarchical structure of the site.
- Popup windows. Using pop-ups is a sign of bad taste. They annoy users and turn them against the site, and SEs do not index them.
- Ignoring navigation standards. Proper navigation is essential for both visitors and search engines. Creative elements without a befitting technical implementation are perceived as a sign of poor internal linking, which negatively impacts the site’s ranking and reputation.
Some useful tools to reveal SEO mistakes
- Google Webmaster Tools. GWT offers the following tools for site promotion:
- “Site configuration”. Includes the subtopics: Sitemap files, crawler access, change of address, and settings (you can set the base URL – with or without “www” – adjust the crawl frequency, and tune some other parameters such as SID, ID, etc.).
- “Your site on the Internet”. You can analyze the site structure, rankings, link weight, etc.
- “Diagnostics”. Displays information about viruses, indexing mistakes, etc.
- “Labs”. You can see how the Googlebot sees the site and compare your site’s speed with others.
- Google Analytics. GA is a Google service that collects statistics about website visitors and their behavior on the resource. GA is currently free, but has a limit of 10 million page views per month per site. GA allows you to collect a lot of data:
- The number of visitors, broken down by time, region, device, channel, and traffic source.
- The duration and depth of browsing.
- Referral sources.
- User behavior when performing the target action on the site.
SEO as a set of activities for effective site promotion
Effective optimization allows you to reach the top and thus attract more customers. A significant effect is achieved only by combining on-page and external optimization: SEO is a set of activities, and on-page optimization alone, or backlink building alone, is inefficient. External optimization is the final step, aimed at increasing traffic and the website’s ranking in the search results. The most effective method of external optimization includes gradual link building, advertising campaigns, and site promotion for the specified queries in popular search engines. This form of SE optimization is time-consuming, but the final result is great. Backlink building is a laborious and thankless job with a lot of pitfalls. LinksManagement knows what SEO mistakes to avoid, providing expert help in getting backlinks and bringing this SEO segment to perfection.