What is Google Webmaster Tools and how can it help your SEO
There are many free SEO tools, but Google Webmaster Tools (GWT) is probably the most useful one. It doesn’t have all the functionality of commercial SEO packages, and it’s not a universal tool for every SEO need, but it was created by Google and offers a variety of vital SEO insights. GWT is a set of tools that lets you contact Google support and configure many aspects of how Google treats your site: e.g., review the list of links, set indexation parameters, see which keywords users entered to reach your site, check clicks for each keyword, view statistics, and much more.
But one specific concern with GWT is confidentiality. Since you give Google full access to your statistics, you may wonder how they use this information. Google is unlikely to abuse the data it collects, but to protect your privacy, think twice before you allow Google (or any other statistics service) to discover the secrets of your site (check the LinksManagement customer reviews about their road to success with and without GWT).
Interconnecting Google Analytics and Webmaster Tools improves the results. You can access your GWT data directly from GA, so if you’re using GA, we recommend integrating it with GWT.
To log in and use a GWT account, you need to join the system and go through verification.
How to use Google Webmaster Tools for SEO?
There are 5 main methods to confirm your rights to a site:
- HTML file. Google provides a special file (containing a verification code) that you simply upload to the root directory of your site. After that, press the verify button and get access.
- HTML tag. With this option, you place the specified meta tag inside <head></head>. If you later delete it, you will soon see prompts demanding reconfirmation of your rights.
- Domain name provider.
- GA. If you use the same account for GA and GWT, you can place the asynchronous tracking code inside <head></head> and gain access to GWT.
- Google Tag Manager. Allows you to use your GTM container to verify the site.
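The HTML-tag method above boils down to a single line of markup. A sketch of what it looks like in the page source (the content value here is a made-up placeholder; Google generates a unique token for your account):

```html
<head>
  <!-- Hypothetical token; Google issues a unique value per site and account -->
  <meta name="google-site-verification" content="EXAMPLE_TOKEN_1234567890" />
</head>
```

The tag must stay in place as long as you want to keep access, since Google rechecks it periodically.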
Google does not provide all the information immediately (the test of time is a basic Google principle applied throughout the whole system, including backlink weight transfer) – to get all the required data after you sign up for Google Webmaster Tools, you’ll have to wait approximately 7 days. After confirming your rights, you’ll sign in and see a panel where you can review info about searched keywords, indexation, and sitemap status:
When you open a site in GWT, you’ll see its dashboard. It gathers the most important information about your site along with messages from Google.
- Messages. This section provides: 1) a list of all messages concerning site problems; 2) answers from the support service.
Messages are a very important section for SEO. When checking the GWT panel for your resource, you may see an incoming “bad” message that can easily spoil your mood for the whole day. But not all GWT letters are harbingers of doomsday.
- Informational letters. Info letters do not signal any danger. On the contrary, their main purpose is to warn the owner about changes relating to the website. E.g., the developer changed the primary mirror or set the wrong geotargeting. In such cases you will immediately receive a message about the changes, which helps to identify the wrong settings and quickly resolve the issue. The main types of info letters: an update is available for Joomla, changes in website links, a new geographic target, a new verified site owner, the preferred domain has changed, a YouTube channel was associated, Google Analytics was connected, disavowed links were updated, crawl rate change, big traffic change for a top URL.
- Technical letters. The main ones are: possible faults in the operation of the site, Googlebot cannot access your site, Googlebot found an extremely large number of URLs on your site, a growing number of false 404 errors, a growing number of “Not Found” errors, a growing number of server errors, notification about the presence of malware on the site, phishing notification regarding the site, etc.
- Spam. A spam report is something that can really spoil a webmaster’s mood. The main types of spam letters: a request for re-examination of the site, artificial inbound links, notification about the removal from Google Search of materials that violate the US DMCA, notification about cloaking on the site, unnatural links on the domain, etc.
Through these messages Google announces the imposition of manual or automatic sanctions on a site. The usual reasons are bad links, cloaking, and doorway pages. It is better not to use techniques that are declared “outside the law”, and to optimize your resource using only “white-hat” methods. But if you’ve already come under sanctions, we strongly recommend building backlinks with professionals to get quality help and quality backlinks, restoring the site’s rankings.
- Search appearance.
- Structured data. Special markup on your pages is used to generate attractive snippets in the SERPs. Here you’ll find the entire list and the status of your structured data.
- Data Highlighter. This tool is used to highlight page elements and assign them a special marker indicating that they belong to a particular data type. A properly marked-up page looks more presentable in search results.
- HTML improvements. Shows errors in page meta descriptions. If your site is properly optimized, you will see a message that no problems with the website optimization have been identified.
- Sitelinks (here you can demote unwanted sitelinks).
- Search traffic.
- Search analytics. Contains a list of queries for which the site ranks in search, with clicks, CTR, and average position for the selected period.
- Links to your site. All the links to your project that Google has found on the web and that have not been discounted via the Google disavow tool due to low quality.
- Internal links. A useful tool for verifying how link weight is distributed across the site.
- Manual actions. If you have observed a traffic loss, it’s possible that you’ve received manual sanctions. Here you will find a list of actions that Google has taken against your site, as well as a rough description of the situation, perhaps even with possible solutions (use our free SEO analysis to find out whether there is something to be fixed).
- International targeting. Use this tool in the GWT panel if you want to customize how your site is displayed in different languages depending on the visitor’s country.
- Mobile usability. This section displays a list of errors that can occur when navigating the site from mobile devices. Google drops non-mobile-friendly sites in the rankings (you can check the latest Google on-page requirements here).
- Google index.
- Index status. This graph clearly shows the pages that have been in the Google index over the past year, as well as pages closed from indexing.
- Content keywords. Presents the most popular words on your site. During optimization, it is desirable to place the words of your semantic core near the top of the page.
- Blocked resources.
- Remove URLs. To remove any page from Google quickly, enter its URL in the specified field. This is useful when you receive complaints about your content (if you just delete the URL, it will still be stored in search engines’ caches for some time, whereas Remove URLs is a direct request for removal). A more universal option covering all search engines is closing the page in robots.txt, but that takes time to be re-indexed.
- Crawl errors. A list of errors detected by the robot when crawling your site. It is recommended to review this section every time you edit your robots.txt and .htaccess files, to prevent potential problems.
- Crawl stats. These statistics show how often the crawler visits your site, how much data it has downloaded, and how long it took. All this is conveniently presented in 3 charts with overall statistics at the end.
- Fetch as Google. A very handy tool to check how your site is displayed in search. Some engines block stylesheets and JavaScript in robots.txt, which is why you may see text and links here without any clear structure. Therefore, check that the site displays correctly both on PC and on mobile devices.
- robots.txt tester. Any error in this file can lead to disastrous SEO consequences, so after you make any edits, verify its correctness.
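Besides the built-in tester, you can sanity-check edited rules locally before uploading the file. A minimal sketch using Python’s standard urllib.robotparser (the domain, paths, and rules are invented for illustration):

```python
from urllib import robotparser

# Raw text of a hypothetical robots.txt to be checked
rules = """User-agent: *
Disallow: /private/
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The disallowed directory should be blocked; everything else allowed
print(rp.can_fetch("*", "https://example.com/private/page"))
print(rp.can_fetch("*", "https://example.com/blog/"))
```

Note that robotparser follows the original robots.txt standard and does not understand Google-specific wildcard extensions, so treat it as a first-pass check only.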
- Sitemaps. Shows the status of the pages you submit for indexing (for pages added to the sitemap in Google Webmaster Tools), along with errors and warnings.
- URL parameters. Used to configure the indexing of URLs with specific parameters. For example, you can disable indexing of all pages that contain a “start” parameter in their address. Do this if you have a lot of uninformative pages generated by some module, or otherwise use the robots.txt file.
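For the robots.txt route, one common approach is a wildcard Disallow rule, which Googlebot supports. A sketch using the “start” parameter from the example above (swap in whatever parameter your module actually generates):

```
User-agent: Googlebot
Disallow: /*?start=
Disallow: /*&start=
```

Keep in mind that Disallow blocks crawling rather than indexing, so URLs that are already indexed may linger in search results for a while before being dropped.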
- Security issues. Online hacking is quite common, especially among owners of unlicensed software. If you have such problems, you’ll see messages describing the security issues, possible solutions, and a request to recheck whether the problems persist.
- Other resources. Google also gives you the opportunity to use its other services for developing your project.
How to fix SEO errors in GWT?
The reason it’s important to get acquainted with these mistakes and monitor them meticulously is their impact on both users and search robots. LinksManagement technologies help you avoid basic SEO mistakes and get relevant results.
The presence of 404 errors, particularly for URLs that have been indexed or linked from other pages, may negatively affect users’ attitude toward the site, and as a result you’ll have problems when selling links from the site or placing ads. If a person runs into such errors several times, it significantly reduces their confidence, which can lead to the collapse of the project.
You should not lose sight of external links. If you can correct the crawl error and put a redirect to a working page, the link will have a positive impact on the site’s position in the SERPs (track site rankings here). Moreover, it is necessary to understand that Google allocates a “crawl budget” to each resource, so if the robot spends most of its time crawling pages with errors, it simply won’t have time to dig deeper and find the site’s valuable and important working pages.
Scanning errors in GWT
- HTTP. This section typically contains pages with a 403 error, which is not considered a serious problem. To review the list of HTTP status codes in detail, use Google’s help.
- Sitemap. Errors in this block are often caused by an old sitemap or by links returning a 404 error. Make sure all the links on your list work properly, because the Google search robot refers to it often. One unpleasant thing is that Google continues to crawl old sitemaps that were removed from GWT, in order to check whether the sitemap and its links are really “dead”. For old sitemaps that you’ve already removed from the webmaster tools and don’t want scanned, make sure they return a 404 error instead of redirecting to new content. To stop the scanning of URLs that have already been discovered, assign a 404 error to those pages: when the robot fails to find a page a few times, it stops scanning it. Once a sitemap is no longer checked, remove it from the Sitemaps section in GWT. Sitemap correctness plays a great role in site indexing, so be sure to get it right if your goal is blog monetization.
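Checking a sitemap for dead entries can be partly automated. A minimal sketch in Python’s standard library that extracts the <loc> URLs from a sitemap so you can then request each one and confirm it doesn’t return 404 (the sample two-page sitemap is invented):

```python
import xml.etree.ElementTree as ET

# Namespace used by the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_sitemap_urls(xml_text):
    """Return every <loc> URL listed in a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

# Invented sitemap for demonstration
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>"""

print(extract_sitemap_urls(sample))
```

From here, fetching each URL and flagging non-200 responses gives you a broken-link report before you resubmit the sitemap.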
- Broken links. Errors of this type are often caused by improperly installed redirects. Since the Google crawler gives up on long chains of redirects, try to minimize redirect chains, set a minimal redirect timer, and do not use the “refresh” meta tag in page headers. When installing redirects:
- Make sure the 301 status code is returned correctly; it notifies the Google bot that the page has moved permanently.
- Make sure there are no so-called redirect loops, i.e., chains that eventually redirect back to their starting point.
- Make sure each link points to a working page, and get rid of 404 as well as 503 and 403 errors.
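The redirect-loop check above is easy to automate once you have your redirect rules as a source-to-target mapping. A sketch (the URLs and the 10-hop limit are illustrative choices, not a Google-defined number):

```python
def follow_redirects(redirects, start, max_hops=10):
    """Walk a {source: target} redirect map; return the chain and whether it loops."""
    chain = [start]
    url = start
    while url in redirects:
        url = redirects[url]
        if url in chain:
            return chain + [url], True   # loop detected
        chain.append(url)
        if len(chain) > max_hops:        # crawlers give up on very long chains
            break
    return chain, False

# Invented redirect rules: /a -> /b -> /a forms a loop
rules = {"/old": "/new", "/a": "/b", "/b": "/a"}

print(follow_redirects(rules, "/old"))  # (['/old', '/new'], False)
print(follow_redirects(rules, "/a"))    # (['/a', '/b', '/a'], True)
```

Running this over every rule in your .htaccess or server config surfaces loops and over-long chains before Googlebot ever hits them.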
- Page not found. In most cases, a 404 error occurs when:
- You removed a page from the site and did not set up a 301 redirect.
- You forgot to update a 301 redirect when changing the site’s title page.
- You made a typo when specifying an internal link.
- Someone linked to your site but mistyped its address.
- Your site moved to a new domain, and the subdirectory structure broke down.
Use the 404 error for old pages that you would like to remove – it will bring you only benefits. Google recommends using a 404 error to let the crawler know about your intention to get rid of unnecessary pages.
If you have empty pages or pages with very little content, be aware that they can be classified as false (“soft”) 404 errors. This classification is not ideal if you want to remove the page completely and serve a real 404 error. If this type of 404 is assigned to the homepage content, you should fix it.
- Timeout. If a page takes too long to load, Googlebot suspends its scan and doesn’t index the page. Check your server logs and load speed to avoid such situations. Types of timeouts:
- DNS timeout – occurs when Googlebot can’t resolve the server’s domain. In this case, check your DNS settings. Sometimes the problem arises on the search engine’s side – then you’ll just have to wait.
- URL timeout – the error relates to a single page, not to the entire domain.
- robots.txt timeout – if your robots.txt file has limited access, Google postpones the site crawl so as not to provoke the indexing of banned files.
Google periodically updates its webmaster panel, offering an increasingly broad range of tools for improving site quality. Therefore, if you notice any differences from this article, or you just have questions about this material, contact us, feel free to write about it in the comments, and share your Google Webmaster Tools impressions.