Google Webmaster Tools: Newbie Guide

Of all the free tools Google offers, the one that is most useful for managing your company’s SEO strategy is Google Webmaster Tools! Learn the basics and start putting it to work for your company.

As its name suggests, this is a set of tools created by Google to help the admins (or webmasters) of websites make their pages easier for search engines to find.

Google constantly develops and improves its Webmaster Tools, giving admins a clear picture of their sites’ SEO status through metrics. These metrics help them find and fix technical issues that may be preventing their site from being found easily by search engines, and show what steps they can take to improve. Start from the basics and get to know the tool in 3 simple steps!

Website Verification

Before gaining access to the tool, you must complete the website verification procedure, so Google can confirm that you are the rightful owner of the website and give you access to information and tools that can help improve its performance.

You can verify your website by choosing one of the following methods:

  1. Sign in through your domain name provider, which may already work with Google
  2. Upload the HTML file provided by Webmaster Tools to the root folder of your website
  3. Add the meta tag, which you can find in Webmaster Tools, to your site’s home page (see the example after this list)
  4. If you have Google Analytics installed on your website, connect with it and finish the process from there
  5. Verify that you are the owner of the site with Google Tag Manager
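For example, the meta tag method (option 3) only requires pasting one line into the head section of your home page. The snippet below is a sketch; the content value is a placeholder, since Webmaster Tools generates a unique verification string for each site:

    <head>
        <title>Your Site</title>
        <!-- Verification tag from Webmaster Tools; this content value is a placeholder -->
        <meta name="google-site-verification" content="your-unique-verification-string" />
    </head>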

Once your page has been verified, it will appear on the Dashboard and in all the reports.

Register the Sitemap – Scan for errors

Under the ‘Crawl > Sitemaps’ tab, you can upload and register your website’s sitemap file. This helps Google’s ‘robots’ (bots, crawlers, spiders) find your site’s content more easily and quickly, so it can be served in Google’s organic search results. It also gives them a better understanding of your site’s structure and of how all the pages on your site are interconnected.
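A sitemap itself is just an XML file that lists the pages of your site in the standard sitemaps.org format. Here is a minimal sketch, with placeholder URLs and dates standing in for your own pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <!-- One <url> entry per page you want the crawlers to find -->
        <url>
            <loc>https://www.example.com/</loc>
            <lastmod>2016-01-15</lastmod>
        </url>
        <url>
            <loc>https://www.example.com/about</loc>
        </url>
    </urlset>

Once the file is uploaded to your site (commonly at the root, e.g. /sitemap.xml), you can submit its address from the same tab.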

Use of robots.txt to manage crawling

Robots.txt is a plain text file that tells search engines how to navigate, or "crawl", your website. With this file you can point search engines to your sitemap and lead them to the content you want, list the pages or content you don’t want them to find, and even ask any search engine you choose not to access your site at all (well-behaved crawlers respect these rules). Robots.txt can be a great help if your website contains folders or pages that you do not want made publicly available, for example because you are testing something new that the public shouldn’t see yet, or because some information on your site is confidential.
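To illustrate, a simple robots.txt placed at the root of your domain could look like the sketch below. The folder names and the blocked crawler name are hypothetical placeholders; adjust them to your own site:

    # Rules for all crawlers
    User-agent: *
    # Hypothetical folders you do not want crawled
    Disallow: /test/
    Disallow: /confidential/

    # Block one specific crawler entirely (BadBot is a placeholder name)
    User-agent: BadBot
    Disallow: /

    # Point crawlers to your sitemap
    Sitemap: https://www.example.com/sitemap.xml

Keep in mind that robots.txt only keeps compliant crawlers away; truly confidential content should also be protected by other means, such as password protection.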
