How to Use Google Webmaster Tools for SEO



Note: This article was last updated on February 9, 2015. 

Google Webmaster Tools is a sweet suite of Google SEO tools that provides data and configuration control for your site in Google. If you’re doing any SEO and you don’t find value in GWT, you either use a paid tool that re-uses GWT data or you have an untapped gold-mine.

There’s a ton you can do with GWT, but it can take a while to learn how to get great return on the time you spend with it. To that end, I’ve tried my best to assemble a meaty, practical collection of actionable tips on the reports I’ve found most useful.

General Tips

Getting Started
Verify Every Site Version
Site Messages
Use the Help Files

Search Queries (old report, pre 5/6/15)

Cool Insights

Other Features

Change of Address
Structured Data
Data Highlighter
HTML Improvements
Links to Your Site
Mobile Usability
Index Status
Crawl Errors
Crawl Stats
Fetch as Google
URL Parameters

Getting Started

If you haven’t set up Google Webmaster Tools yet, do so yesterday. It’s really easy and worthwhile. Just go to Google Webmaster Tools, sign in with your Google account, and click “Add a site.” Then you’ll be provided with several options to verify that you manage the site. Use the option that’s easiest and make it happen.

Add Every Site Version

The biggest mistake that I see people make with GWT is failing to add every version of every site they manage. It’s unfortunate, because it’s very easy to do. Failure to add every version of every site will result in data for only some of your site(s) ― at best, this stifles insights; at worst, this can cause you to make costly errors or neglect critical issues.

Obviously, if you have a www version and a non-www version of your domain (say, example.com and www.example.com), you should add both.

You should also add all subdomains. If you have, say, a blog.example.com subdomain and a shop.example.com subdomain, add them both. If you only add one, that’s the only one GWT will track, and that’s the only data you’ll get.

If you have an http:// version and an https:// version, add them both. (Then fix your duplicate content issue.)

If you have any other pair of versions that both resolve to live pages, add them both.

Basically, if you can change what’s to the left of your root domain and still get a live page when you enter the URL in the browser bar, then add that subdomain. Also add any subdirectories that target specific countries. Google explains the versions you can add here.
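To make this concrete, here’s a quick Python sketch that enumerates the protocol/subdomain combinations to add as separate GWT sites. The domain and subdomain list are hypothetical; substitute your own.

```python
from itertools import product

# Hypothetical domain and subdomains; substitute your own.
root = "example.com"
prefixes = ["", "www.", "blog."]      # "" is the bare root domain
protocols = ["http://", "https://"]

# Every protocol/subdomain combination is a distinct "site" to GWT.
versions = [proto + prefix + root
            for proto, prefix in product(protocols, prefixes)]
for v in versions:
    print(v)
```

Each printed version gets added (and verified) individually in GWT.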

Site Messages

Occasionally, Google Webmaster Tools will notify you if your site seems to have a very important issue. Make sure you set up message forwarding so you can get these notifications emailed to you. The emails may inform you about problems accessing the site, increases in crawl errors, unnatural link warnings, malware alerts, and more.

One caveat is that there can be a delay between the time a problem arises and the time you get notified. Another caveat is that there are plenty of serious problems GWT won’t notify you about. You definitely want to pay mind to GWT emails, but they are merely an additional line of defense ― not a replacement for any other risk-mitigation measures.

Use the Help Files

Google has a lot of resources on SEO, and good practitioners gobble as much of it as they can. The Google Webmaster Tools Help Center is a treasure trove.

Make sure you use GWT with an inquisitive mindset. Most of the reports have limits, caveats, and nuances. Before you go rushing off to make a major decision, make sure the data is what you think it is and means what you think it means. Often, GWT data leads to important but unconfirmed hypotheses that you need to investigate.

Also, if you click the help button, you may get a quality, relevant suggested help article. So do that frequently. The articles are pretty good at explaining what the various data-types in reports actually are (insomuch as Google is willing to share).

have heaping helpings of helpful Help

However, the articles are frequently dry and lacking in pragmatic insights on what to focus on. Plus, there are very few pictures. (We’re expected to just read words on the interwebs? Come on.)

The GWT reports’ help files can also be a bit inconsistent. I looked through the related help files for each GWT report I cover below, and I’ve listed the most helpful ones.

Be sure to click on those little question marks a bunch too.

Search Queries Reports

Author’s note: On 5/6/15, Google officially rolled out a vastly improved version of this report called the Search Analytics Report (in beta). A detailed help article on the new report is here. Please note that this section of this guide pertains to the old report and is out of date at the moment.

GWT help article here (for old “Search Queries” report)

Search Queries

This is the gem of the “Search Traffic” section ― heck, it’s the gem of all of Google Webmaster Tools, and the data in this report can be found repackaged in many paid SEO tool suites. This report has some data (nothing is truly comprehensive) on the following:

  • impressions
  • clicks
  • click-thru-rate
  • rankings

That data can be displayed against the following dimensions: 1) keyword, 2) landing page, and 3) keyword/landing page.

You can then filter data by location (but only certain countries) and Google search vertical (regular web, image, mobile, video, or news). And you can download that data and have a field day.

Sweet, right? Well, naturally there are some…

Limitations and Caveats of the Search Queries Report 

This data has gotten extremely popular as folks must compensate for missing data on keyword performance due to keywords (not provided) in Google Analytics. The search query data seems to have gotten much more reliable too. And more specific.

Unfortunately, while Search Queries are one of the most important ways to fill in the (not provided) void, Search Queries data is far from a complete replacement for the (not provided) keywords. Why?

  • There’s no engagement or conversion metrics.
  • You don’t have the rich secondary dimensions like “metro area” or “time of day” that you get in Analytics.
  • Not every keyword is shown (not even close).
  • A click in this report is technically different than a visit (session) in Google Analytics.
  • Historical data only goes back 3 months (a workaround is below).

Additionally, one should be aware of some caveats:

  • The Image search vertical gets many times more impressions than web search, due to many more listings per page. Never analyze CTR or impressions for “All” verticals simultaneously ― always analyze Web, Image, Mobile, or Video separately.
  • I’ve seen instances where multiple listings for a single keyword appear to multiply impressions, driving CTR down unnaturally (for example, having two totally unique listings for a keyword may double impressions, cutting CTR in half). Sitelinks do not appear to multiply impressions, though.
  • The expected CTR varies widely depending on the scenario, so take care in benchmarking.
  • Outliers. All the metrics can be prone to unexpected results. For instance, Avg. position can be massively impacted by uncommon personalization of results. Also, I am unaware whether there is any accounting for multiple clicks or views by the same user, or whether bots are taken into consideration. That said, the more clicks a keyword has, the less you need to worry about outliers, generally speaking.

New Search Queries report in development ― Google announced January 27 that it is working on an early alpha version of a new Search Queries report. If you want to be a guinea pig, you can request to preview it here.

Cool Insights with Search Queries

Despite the caveats, there’s a ton you can do with Search Queries. Obviously, it’s very good to know which keywords people are Googling to get to your site.

And you probably know how to make use of rankings data. By the way, GWT “avg. position” data has been demonstrated to be relatively consistent with other ranking-checking methods.

Below are a few other fun insights.

Non-HTML Pages

I find that many, if not most, web marketers have never looked at search engine traffic data on PDFs and other downloads.

Out of the box, Google Analytics can only pull data on HTML pages. Well, one method (here’s more) to get more data on non-HTML pages is the Search Query report. And this is the best way to get keyword-level data on non-HTML pages. Just bust out your ctrl+f  and look for the filetype extension (.pdf, .doc, etc…) in the URL.
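If you’d rather not eyeball the export, the same ctrl+f idea can be scripted. Here’s a hedged Python sketch that pulls the non-HTML rows out of a downloaded report; the assumption that the page URL sits in the first column is mine, so adjust the index to match your actual export.

```python
import csv

# File extensions that indicate a non-HTML landing page.
NON_HTML_EXTS = (".pdf", ".doc", ".docx", ".ppt", ".xls")

def non_html_rows(csv_lines):
    """Return rows whose first column (assumed to be the URL) is a download."""
    reader = csv.reader(csv_lines)
    next(reader, None)  # skip the header row
    return [row for row in reader
            if row and row[0].lower().split("?")[0].endswith(NON_HTML_EXTS)]

# Usage sketch: non_html_rows(open("top_pages.csv", newline=""))
```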

Image and video SEO

google image query top pages

It can be useful to look at web-only (regular) queries, image-only queries, and video-only queries. This data can explain weird things you might see in Analytics.

For example, a quick look at image-only queries revealed why we (still) get a lot of low-quality traffic to a random old blog post about naming the then new office pet.


Rest in peace, Link Bait

Image and video data can also help you determine the need for and effectiveness of SEO for images and SEO for video.

A very important thing to remember is that clicks do not equal visits ― especially for images. Whenever you compare GWT image clicks to image-referred visits in Analytics, you get wildly different results. Image clicks will be waaay more than image visits. Below, you’ll see those 4,000+ image clicks resulted in only 132 sessions on our site (these sessions do not include visits to only the image file URL; these are only sessions on HTML pages).

google image traffic

To find Google image traffic in Analytics:

  1. Go to Acquisition -> All Traffic.
  2. Set the advanced filter to Source Contains Google and Referral Path Matching RegExp images|mgres|imagedetail.
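If you want to sanity-check that regex before trusting a filtered report, here’s a quick Python sketch using the same pattern; the sample referral paths are made up for illustration.

```python
import re

# The same pattern used in the GA advanced filter above.
IMAGE_REFERRAL = re.compile(r"images|mgres|imagedetail")

referral_paths = [
    "/imgres?imgurl=...",           # classic image-result referral
    "/images/search?q=office+pet",  # image search path
    "/search?q=office+pet",         # regular web search - should not match
]
hits = [p for p in referral_paths if IMAGE_REFERRAL.search(p)]
```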

image traffic

AJ Kohn has more details on tracking image search in Analytics.

I haven’t verified 100%, but I’m almost positive that GWT counts any click on the image SERP, not just clicks through to your site. On a related note, a change in the image SERPs in 2013 drastically decreased Google image traffic for everyone.

Another thing to note is that the image data has incredibly high impressions and low CTR compared to the other verticals, so it can really skew your data if you are viewing All search queries.

Mobile vs. Web

Mobile users often do different kinds of searches than non-mobile. For example, mobile users are more likely to be looking for a business near them. Use the Search Query data to get insight on how mobile and non-mobile Google users search differently to wind up on your site.

Another question to ask is “Are rankings drastically different for the same keyword on mobile vs. non-mobile?” If you rule out mobile image or video results skewing the data, then it’s possible your page ranks lower on mobile. While it’s probably them (Google) and not you, make sure you’re not making any big mobile SEO mistakes.

CTR Analysis

Looking at click-thru rate can reveal a number of opportunities and insights.

First, CTR data will help you understand the relationship between rank position and clicks.

Second, CTRs can also help you understand the SERPs for your niche. Often the CTR is highly dependent on external factors such as competition, number of  advertisers, and amount of specialized results (like rich snippets, local carousels, images, etc…). Understanding which search queries tend to have lower CTR in your niche can help inform your future keyword research and SEO strategy.

Third, CTR may help tell you if your page is what people are looking for. For example, we ranked #1 for “link bait” despite having vastly inferior backlink metrics to other articles on the topic. It appears our ranking was driven by our CTR being well above average. My theory is that most people Googling “link bait” just want to know what the term means, and that our page title looks to users like the most straightforward answer.

Fourth, sometimes the organic CTR is something you have some direct control over; and you want to find opportunities to directly improve CTRs. Below-average click-thrus may indicate an opportunity to employ rich snippets or tweak Meta descriptions.

Another important insight is the CTR of searches for your brand. While it will never be near 100%, you usually want to get it as high as it will go. See if you should try to win more real estate in the SERPs.

Finally, combine CTR analysis in both GWT and AdWords to gauge total CTR. This can aid in AdWords decisions, such as bidding.

Benchmarking CTR is required for the above analyses. You could look to the varying results of external studies for a figure on average CTR. If you ask me, my take-with-a-grain-of-salt go-to “average CTR” for the #1 position is 30%, but I’m sure you’d get a dozen answers if you asked a dozen SEOs.

You could also take the average CTR for your data. First, export the search query data into Excel. Then isolate a bucket of queries for a given rank (all queries for position 1, for example). Then take the average for the bucket.

A third benchmarking method is simply to note current CTR and aim to improve upon it.
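The export-and-average method above can be scripted instead of done by hand in Excel. Below is a sketch that averages CTR by rounded position; the column names (“Query”, “Impressions”, “Clicks”, “Avg. position”) are assumptions about the export format, so adjust them to match your actual download.

```python
import csv
from collections import defaultdict

def avg_ctr_by_position(csv_lines):
    """Average CTR per rounded rank position from a GWT query export."""
    totals = defaultdict(lambda: [0, 0])  # position -> [clicks, impressions]
    for row in csv.DictReader(csv_lines):
        pos = round(float(row["Avg. position"]))
        totals[pos][0] += int(row["Clicks"])
        totals[pos][1] += int(row["Impressions"])
    # CTR = total clicks / total impressions for each position bucket.
    return {pos: clicks / imps
            for pos, (clicks, imps) in totals.items() if imps}

# Usage sketch: avg_ctr_by_position(open("search_queries.csv", newline=""))
```

Pooling clicks and impressions before dividing (rather than averaging per-query CTRs) keeps high-volume queries from being drowned out by long-tail noise.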

Search Queries Hacks

Integrate GWT into GA

Viewing GWT Search Query data in Google Analytics (GA) is super easy. All it requires is for the admin of both GA and GWT to log into GA and, in the left nav, go to Acquisition -> Search Engine Optimization -> Landing Pages.

If you’ve never connected GA and GWT, you’ll see a screen that states “This report requires Webmaster Tools to be enabled.” Simply click the set-up button and follow the easy instructions.

But there are limits.

One limitation of connecting the accounts is that you can only connect one GWT account to one GA account, and a GWT account can only cover one subdomain. So, if you have multiple subdomains, an individual GA view will only display some of your GWT query data. Another limitation is that you can’t view Search Query data and landing page data together in GA. These limitations can be overcome by viewing the data directly in GWT.

Export keyword by landing page

Viewing keyword data without landing page data is like having chocolate without the more chocolate. Unfortunately, GWT doesn’t let you download the search query data by landing page without clicking on every landing page in the report.

Well, LunaMetrician Noah has created a great bookmarklet that will automatically “click” on every landing page to reveal the search queries and then download it. So now you can have double chocolate. (OMG.)

Automatic exports
One problem with the Search Queries report is that it only goes back 90 days. That’s no good if you love historical data like I do. The obvious solution is to export it periodically, but this is a pain to constantly do manually. Fortunately, you can automate downloads: here’s a PHP method and a Python method.
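If a fully scheduled script is overkill, even stitching manual exports together beats losing the history. Here’s a minimal Python sketch under a couple of assumptions of mine: each export is saved as queries_YYYY-MM-DD.csv, and the query sits in the first column.

```python
import csv
import glob

def build_history(pattern="queries_*.csv", out="query_history.csv"):
    """Merge dated GWT exports into one file, deduped on (date, query)."""
    seen, rows = set(), []
    for path in sorted(glob.glob(pattern)):
        # Recover the export date from the assumed filename convention.
        date = path.split("_")[-1].rsplit(".", 1)[0]
        with open(path, newline="") as f:
            reader = csv.reader(f)
            next(reader, None)  # skip header
            for row in reader:
                key = (date, row[0])
                if key not in seen:
                    seen.add(key)
                    rows.append([date] + row)
    with open(out, "w", newline="") as f:
        csv.writer(f).writerows(rows)
    return len(rows)
```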

Other GWT Reports

Change of Address

change of address

GWT help article here.

Unlike the rest of the reports described below, the Change of Address tool is not located in the left menu. It can be found in the top right. There are three main things to know:

  1. If you’re changing your domain name, submitting a change of address here with Google is essential (likewise for Bing).
  2. Never, ever submit a Change of Address unless you are actually changing the domain name for your entire site.
  3. Carefully follow all the steps Google gives you on the Change of Address page.

(More major migration tips here, btw).

Structured Data

GWT help article here.

Structured Data

Hopefully, you know by now how Google uses structured data markup to generate the rich snippets (recipes, reviews, and much more) displayed in the search results pages. By implementing the right markup, you can hope to trigger the data on your site to display as these rich snippets and dramatically improve click-through rates for a notable bump in search traffic.


If structured data matters much to you, the GWT Structured Data report is essential.

By viewing stats on structured data for your site as a whole and by type of data, you can verify that Google is picking up structured data.

You can also get nice details on the individual data pieces being picked up and also on errors.

Structured markup detail

If the numbers and data don’t match what you expect, start diagnosing by looking for errors. Then, find a page that should be triggering a rich snippet but isn’t and test it on GWT’s handy-dandy Rich Snippets Testing Tool.

Data Highlighter

Excellent GWT help articles here. Nice article by Portent here.

The Data Highlighter is a tool that basically tells Google the same things markup would. The Data Highlighter is very user-friendly and can be used to tag at least 9 types of data, and every tag corresponds with markup (for example, using the highlighter for Events is equivalent in Google’s eyes to marking the page up with the Event schema).

I haven’t used the Data Highlighter much myself. Whenever feasible, I prefer getting markup actually coded into a page’s HTML, because the Data Highlighter is only seen by Google, and does not help Bing, Yahoo, and other search engines. It’s also not as robust as hard-coded markup and is known for being a little quirky.

That said, you should definitely familiarize yourself with the Highlighter’s supported data types. If hard-coding schema is not practical, take the Data Highlighter for a spin. It’s a great way to win rich snippets with little initial effort without a developer or plugin.

HTML Improvements

The HTML improvements section can not only help you improve the appearance of your SERP listings, but also help you find opportunities to address keyword optimization and duplicate content issues.

HTML improvements

Find title tags and Meta descriptions that need to be fixed.

The HTML Improvements report will do a good job of flagging pages that don’t conform to the following best practices for Title tags and Meta descriptions:

  • Have a unique one for each page.
  • Don’t make it too long or it will get truncated.
  • Be informative.

Sniff out duplicate content.
As you likely know, it is generally a bad practice to have pages that do not contain content unique to that page. The first step in dealing with duplicate content problems is identifying them, and GWT offers one way of doing so that is too simple to ignore ― simply check for duplicate Title tags and Meta descriptions.

Find out which pages share which Title tags and, if there’s a lot of duplicate Titles, download the data so you can play around with it in Excel. There’s a good chance the URLs are duplicates. While there are other good ways of finding duplicate content (like with Screaming Frog), this method’s benefit is that it will show you some duplicate content Google has indexed.
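Once you’ve downloaded the data, grouping duplicate Titles is a few lines of script instead of an Excel session. A sketch, assuming a simple two-column (title, URL) input:

```python
import csv
from collections import defaultdict

def duplicate_titles(csv_lines):
    """Group URLs by Title tag; return only titles shared by 2+ URLs."""
    by_title = defaultdict(list)
    for title, url in csv.reader(csv_lines):
        by_title[title.strip()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Usage sketch: duplicate_titles(open("titles.csv", newline=""))
```

Each group of URLs sharing a title is a candidate duplicate-content cluster worth eyeballing.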

Caveat on non-indexable content data

I’ve worked on countless sites with content that appears not to be indexed or even properly read by search engines, yet that is not reflected in the “non-indexable content” data. I really have no idea what a page has to do to be flagged here (I’d love to hear insights if anyone knows). I almost always see GWT say “we didn’t detect any issues with non-indexable content” ― even when it seems that would be incorrect. So use caution.



Sitelinks

Google your brand. Now do it in private browsing. Assuming you have sitelinks, do you like them? Occasionally, sitelinks can link to pages that convert poorly or offer suboptimal UX. If you don’t like a sitelink, you can demote it to reduce the chances of it appearing.

If you have a lot of branded traffic and a crappy sitelink or two, this is a big and easy win. The most common big win scenario I see is when a site was getting a lot of traffic to a page that has suddenly become dated (for example, a seasonal or out-of-stock product).

Just make sure this is the right thing to do. Factor in the possible impact of personalization, location, and device on the sitelinks you observe ― what you see may not be what everyone sees, and what you don’t want to see may be something some people do want to see. Also, if it happens that the vast majority of Google traffic to a page is coming through a sitelink (which you can determine by analyzing the page in Search Queries and noting how many clicks are from branded queries), then you can estimate conversion and engagement for the sitelink from a Google Analytics landing page report filtered for only Google organic traffic.

Links to Your Site

Links to your site

Links to Your Site gives you data on who links to which pages on your site. Since links remain the most important component of Google’s algorithm, understanding the backlinks to your site is important to understanding how to improve your ranking capabilities.

I don’t use Links to Your Site much these days, because paid link data tools have more actionable insights. We use Open Site Explorer by Moz. ahrefs is another link data tool. Majestic SEO is a third option, and has the largest database among the premium link tools.

If you don’t have a paid tool, then GWT is very much worth your while. You also should examine the free limited versions of the three above-mentioned tools. You should check out the Inbound Links report of Bing Webmaster Tools, which I believe has higher limits on how many links it will report (or, at the least, documentation on its limits).

Who links the most

Nobody crawls the web as deep as Google does. GWT might have data on some links when the other tools don’t. That said, GWT doesn’t always display all the links Google knows about (I’m not sure what the quantity cap actually is, but you may be able to get more than 1,000 domains if you download the data). You can download the linking domains and check to see if a given domain is linking to you. This can be useful if you really want to see if a specific site links to you or if you just want to see what the other link tools are missing (typically the dirty underbelly of the interwebs).

You might also note if the quantity of linking domains is growing from month to month.

The data from “download more sample links” or “download latest links” is very noisy; I find I need to scrub out links from the same subdomain in Excel to get any use out of it.
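That scrubbing can be scripted too. Here’s a sketch that keeps one link per linking host, so each remaining row represents a distinct subdomain (mirroring the Excel cleanup above):

```python
from urllib.parse import urlparse

def one_per_host(links):
    """Keep the first link seen from each linking host, drop the rest."""
    seen, kept = set(), []
    for link in links:
        host = urlparse(link).netloc.lower()
        if host and host not in seen:
            seen.add(host)
            kept.append(link)
    return kept
```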

Your most linked content
There’s a solid chance you can use this report to find linked-to pages you won’t find elsewhere. Seeing which pages pull in the most links and why is my favorite part of analyzing link-winning strategy. I don’t use GWT for this much, but it can help if you have a site that doesn’t get a ton of backlinks and every little link matters.

How your data is linked
While anchor text isn’t as critical to rankings as it used to be, it’s still worth looking at now and then. Unfortunately, the GWT report only lists up to 200 phrases.

Mobile Usability

GWT help article here.

mobile usability error report

The year of mobile is no longer next year. As you may have heard, in 2014 mobile internet usage exceeded that of desktop in the U.S., and mobile is the most popular form of media worldwide.

In January 2015, it was reported that Google had been sending many mobile usability warnings to webmasters; that article also noted the many signs that a new mobile ranking algorithm is incoming (count me among the bandwagon riders who feel mobile UX will be a ranking factor). Certainly, Google has been making a serious effort to communicate mobile SEO best practices.

This all underscores the importance of the new Mobile Usability report, which Google announced in late October 2014. It lists mobile UX issues, with links to Google’s associated best-practices documentation.

The report lists URLs that contain a given error. The list does not appear to be comprehensive ― that is, not every URL is reported ― but there should be more than enough reported errors for diagnostics.

Index Status

GWT help article here.

index status

Index bloat is one of the most common problems SEOs deal with. When Google has way more pages indexed than deserve to be organic landing pages, the consequent dissipation of link juice and constrained crawl budget can have a significant impact on SEO traffic.

The converse of index bloat is when pages that should be indexed are not indexed, and this is an equally important problem. There’s no shortage of horror stories of a site’s organic traffic dying because indexation was blocked via a problem with something like robots.txt, Meta robots, rel=canonical, or nofollow attributes. Often, when these issues are in their early stages, the impact on traffic is not yet apparent.

Check the Advanced Index Status report and examine total pages indexed, the number of pages removed, and the number of pages blocked by robots.txt. If any numbers have moved in a way you wouldn’t expect, investigate immediately.

For more information on crawling and indexation metrics, read this.

Crawl Errors

GWT help article here. (links to articles on specific types of errors on right)


404 Errors

A 404 is the HTTP status code for Page Not Found. This error occurs whenever there is no page for the URL requested. Webmaster Tools reports 404 errors whenever Google’s spider crawls a link to a URL that has no actual page associated with it. Common causes of 404s include typos in the destination URL of a link and failure to redirect the URL of a page that was moved or deleted. Both causes of 404s can be detrimental to both the user experience and your SEO endeavors.

Note that many GWT 404s are outdated, “false alarms”, or triggered by bad links from insignificant pages no one ever visits. These may not represent any significant inconvenience to your users or waste of link juice, but many 404s will be problematic. If you suspect a 404 may be a problem, click on the URL to see which pages link to it.

Resolve problem 404s by 301 redirecting to the appropriate page, by changing the destination URL of the inbound link, or by restoring content to the 404, depending on what is most practical and most beneficial to your users.

Note that if you utilize the “MARK AS FIXED” button, you will have more up-to-date data.

This post explains the right perspective on 404s. While the GWT report is super useful for trends and is a great data point, I often look to other data points like Google Analytics for taking action on 404s (see #9 in SEO Measurement Mistakes Part 3: Crawling and Indexation Metrics).

Soft 404s and Other Crawl Errors

Google Webmaster Tools crawl_errors

404s get a lot of attention, but there are other crawl errors that can impact user experience and SEO. For example, 403s, 500s, and 503s all leave a URL un-crawlable, and other “not followed” URLs, like redirect loops, may not be crawlable either. Google Webmaster Tools reports on all of these.

Soft 404s are a user-experience and SEO issue, and GWT can be the best way to find them non-manually (though some might not actually be soft 404s).

However, GWT does not report on crawl issues like misplaced meta robots tags or 302 redirects.

Crawl Stats

Crawl stats

The data in the Crawl Stats may not be as rich as server log file data, but it’s better than not looking at any spider activity reports at all.

Crawl Stats has pretty volatile graphs, but do look for big, weird spikes and distinct trends. For example, Crawl Stats can tell you:

  1. If you have an increase in the number of pages sucking up crawl budget ― pages crawled goes up, but kilobytes downloaded does not;
  2. If page load times are sucking up crawl budget ― time spent downloading a page goes up and the number of pages crawled goes down; or
  3. If crawl budget increases or decreases ― kilobytes downloaded per day will trend, and pages crawled will likely follow.
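A quick derived metric makes those patterns easier to spot: average kilobytes per crawled page. The figures below are invented purely for illustration.

```python
def kb_per_page(pages_crawled, kb_downloaded):
    """Average download size per crawled page."""
    return kb_downloaded / pages_crawled

# Hypothetical month-over-month Crawl Stats figures:
last_month = kb_per_page(pages_crawled=5000, kb_downloaded=150000)  # 30.0 KB/page
this_month = kb_per_page(pages_crawled=9000, kb_downloaded=153000)  # 17.0 KB/page
# Pages crawled jumped but KB per page fell: a hint that thin or
# near-empty URLs (pattern 1) may be soaking up crawl budget.
```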

Fetch as Google

GWT help articles here

old screenshot of Fetch as Google

Ensure Google can read your page

Until October 2014, I didn’t use Fetch as Google nearly as much as I should have. Then Google’s Pierre Far explained to me at PubCon that other tools did not reliably show what Google can see with total accuracy.

In my opinion, it’s wise to second-guess some Fetch as Google results as I don’t feel it always paints the full picture on SEO readability, but I certainly would not audit a site without Fetch as Google.

Fetch as Google is an essential tool in making sure your pages are SEO-friendly (or at least Google-friendly). I recommend requesting a Fetch and Render on every template you have and every critical SEO landing page.

If a page’s status is not “complete”, then you need to analyze the page to see if all important content is Google-readable. Google has a list of every Fetch & Render status and its description here.

Partial Render by Fetch as Google

Partial Render

The screenshot above shows a page that couldn’t be fully digested by Google due to the robots.txt file blocking several scripts. Using Fetch as Google on many sites has shown me just how often this happens. (Incidentally, Pierre Far also told conference attendees that the biggest SEO error he sees is accidentally blocking Google from crawling an entire website.)

GWT’s robots.txt tester can be used to see if a URI is blocked by robots.txt. However, I prefer this robots.txt tester, because you can analyze multiple URLs.
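If you want to batch-test URLs offline, Python’s standard library ships a robots.txt parser that applies the same matching rules. A quick sketch; the rules and URLs below are made up.

```python
from urllib.robotparser import RobotFileParser

# Paste your own robots.txt contents here.
rules = """\
User-agent: *
Disallow: /scripts/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

urls = [
    "http://example.com/scripts/menu.js",
    "http://example.com/blog/post",
]
for url in urls:
    # can_fetch() answers: may this user-agent crawl this URL?
    print(url, "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED")
```

Note this checks only the generic rules you paste in; it won’t catch Googlebot-specific behavior beyond standard robots.txt matching.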

Submit URLs for indexation

submit to index

The other use of Fetch as Google is to tell Google you want it to crawl a URL and include it in its index of URLs eligible for inclusion in search engine results pages.

When you hit “Submit to index”, Google gives you an option to ask it to crawl just the page submitted or that page and all the links on that page.

Fetch as Google is not a replacement for crawl-friendliness best practices (like a good robots.txt, minimal duplicate content, ping plugins, Sitemaps, and good internal linking).

However, Fetch should often be used for site upgrades, URL migrations, breaking important news, and launching batches of new content.

Note that hitting “Submit to index” does not guarantee a URL will get indexed, but does help get content in the SERPs faster.




Sitemaps

These are the kind of Sitemap numbers you look into.

An XML Sitemap is an opportunity to tell Google and the other search engines which pages on your site you want crawled and indexed. For a large site or a site with frequently updated content, a Sitemap is pretty important. The search engines don’t guarantee they will abide by the Sitemap, but anecdotal evidence has shown time and time again that XML Sitemaps help increase the chance your pages are found, and found fast (especially if the Sitemap is up-to-date and “clean”).

Sitemaps can get tricky — especially when you have a large site or when you use special Sitemaps for images, video, news, mobile, or source code. To ensure you’re doing your Sitemaps right and getting the most of them, always submit them with GWT’s Sitemap feature.

It is recommended that you always validate your Sitemaps before going live. And what better way to validate than through the eyes of Google? Simply click the big red “Add/Test” button and test away.

Once you’ve submitted a valid sitemap to Google, you should not ignore it, however.

XML Sitemap Test Fail in GWT

Check in regularly to see if there are any errors or warnings. Often, a sitemap error will reveal a larger problem with your site. Here is the list of possible Sitemap errors.

In addition, pay attention to the number of URLs (or images, videos, etc.) indexed versus the number of URLs or items submitted. It is not uncommon for there to be a discrepancy here, but one of your SEO goals is to get the search engines to index everything you want indexed.

The tricky part is seeing which pages are not indexed (in fact, this topic could warrant its own article), but this may be possible with Google site search and Analytics landing page reports.  It’s very time consuming manually, but can be automated with technical hacks.

If the pages not indexed are important to you, there are a few things you can do to improve indexation. For example, you could add or adjust tags in Sitemap: the <priority> tag tells the search engine how important a URL is, and the <changefreq> tag indicates how frequently the page is updated (for example with links to new pages). Also, unindexed pages may be a red flag that those pages lack inbound links or lack content perceived by engines to be unique.
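For reference, here’s what those tags look like in a generated Sitemap entry. This sketch uses Python’s standard-library ElementTree; the URL and the specific priority/changefreq values are hypothetical.

```python
import xml.etree.ElementTree as ET

# The standard Sitemap protocol namespace.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

urlset = ET.Element("urlset", xmlns=NS)
url = ET.SubElement(urlset, "url")
ET.SubElement(url, "loc").text = "http://example.com/fresh-page/"  # hypothetical URL
ET.SubElement(url, "changefreq").text = "daily"  # page updated often
ET.SubElement(url, "priority").text = "0.8"      # relatively important URL

print(ET.tostring(urlset, encoding="unicode"))
```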

On a related note, I wrote about building Sitemaps here.

URL Parameters

GWT help article here.


When you first go into this report, you’ll see a doorway page that states “Use this feature only if you’re sure how parameters work. Incorrectly excluding URLs could result in many pages disappearing from search.” Heed that warning and don’t change the parameter settings unless you know what you’re doing.

But even if you kinda don’t, there’s some useful data in here.

When I’m doing an SEO audit, I like to look at the most commonly used parameters, see how the site is using them, see if they’re tied to uninformative URL names, and see if any are causing duplicate content or acting as a major drag on crawl budget. Don’t worry about utm parameters, which are related to Google Analytics and well understood by Google.

One way to find the URLs with the parameter is to Google “inurl:?yourparameter=”. This will give you an indication of which parameters are getting indexed. The other method is to look for the parameters in the results of a site crawl.

This analysis may lead you to identify non-canonical URLs; if so, you’ll want to apply duplicate content fixes. You can also configure the parameters in GWT, but this should typically be a band-aid rather than a permanent fix, and (as noted) should always be done with much caution.
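As a rough illustration of this kind of parameter audit, here’s a short sketch that tallies query parameters across a list of crawled URLs. The URLs are made up; in practice you’d feed in a site crawler’s export:

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

def parameter_counts(urls):
    """Count how many URLs use each query parameter."""
    counts = Counter()
    for url in urls:
        query = urlparse(url).query
        for param in parse_qs(query, keep_blank_values=True):
            counts[param] += 1
    return counts

# Hypothetical crawl export
crawled = [
    "http://www.example.com/shoes?color=red&sort=price",
    "http://www.example.com/shoes?color=blue",
    "http://www.example.com/about",
]
print(parameter_counts(crawled).most_common())
# [('color', 2), ('sort', 1)]
```

Parameters that show up thousands of times on a modest site are the ones worth investigating for duplicate content and crawl-budget waste.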




This guide was last updated at noon on February 9, 2015. It was originally published in March of 2014. Sections with notable updates since then include Links to Your Site and Search Queries. New sections include Change of Address, Mobile Usability, and Fetch as Google.

I really hope this Webmaster Tools guide is useful to you. GWT really is incredibly powerful and underutilized. My favorite reports are Index Status, Sitemaps, Fetch as Google, and ― of course ― Search Queries. What are yours?

Are there any reports or functions of GWT you’d like me to cover? Got a hot Webmaster Tools tip you want to share? Leave a comment. We’ll continue to update and improve this guide in the future.

Reid Bandremer is a former LunaMetrician and contributor to our blog.

  • Just a terrific post on what is, surprisingly to me, still an underused resource for solid technical SEO. Exhaustive, insightful, and useful. Many thanks, Reid. I will be passing this one around.

    • Reid Bandremer

      Thanks Peter. Really glad you like it.

  • Hi, I have a problem with Webmaster Tools. It seems my sitemap submitted 84 posts, but only 50 are indexed. I keep resubmitting, but the indexed count stays at 50. How can I get all 84 indexed?

    • Reid Bandremer

      There’s no one-size-fits-all solution.

      Do note that resubmitting is not going to help. The big factors are whether every URL in the Sitemap is worth indexing and whether you have earned sufficient crawl budget (search engines crawl sites with higher authority/PageRank more). Specific things to consider:

      • Did you submit multiple Sitemaps? What’s getting indexed more and less?
      • Are you including pages that have a lot of similar content (like blog tag pages, archive pages, scraped content, etc.)? These are less likely to get indexed.
      • Are you including only canonical, status code 200 URLs? Redirects, 404s, and duplicate URLs all won’t be indexed.
      • Are your posts unique enough? Are there other posts out there like yours? (You can test via Copyscape).

      Also, indexing can take time, especially on a less established site. Good luck.

  • Just thought it should be mentioned that if you want to export data from GWT, it’s not actually the Webmaster API that is being used; it is a web-interface hack that is promoted by Google (your Python link).

    Also, if you don’t want to struggle through the Python or PHP programming, has an Excel add-in that can do it for you. I think has also recently announced their Google Analytics product can download GWT data as well.

    • Reid Bandremer

      Thanks for pointing that out Mike. I updated that section in the post.

      Also, we’ll be checking out those tools.

  • Hello Reid
    Great post. I use GWT and get some useful info from it. I have one metric I cannot understand: the average position. All looks OK, but one of my keywords, “grandfather clock repair”, is at 200 and dropping. Recently I embarked on a program to improve my web site’s organic ranking, including an AdWords campaign and web site improvements to keep customers on my page longer (currently about 53 sec). All my metrics are improving except the most important, “grandfather clock repair”, at average 200 and dropping. I might add that when this keyword was at 180 I could find my entry at page 18; now at 200 it is nowhere to be seen. Thanks for your time in reading this post.


  • Hey Reid, since you dug so deeply into Webmaster Tools, thought I’d share a recent discovery: a nofollow link from LunaMetrics blog comment of mine translated into ranking and impressions numbers for me…but it wasn’t my site in the search results — LunaMetrics was!

  • Thanks Reid for this comprehensive and a very very long article :p but really informative

    If you don’t mind, I’d like to ask something about GWT. In most of my articles I put my “ultimate” keyword, but according to the GWT report, Google is not indexing this keyword. How could this possibly happen? I already made the keyword an H3 on every page (maybe this is the mistake?)

    thanks before

    • Reid Bandremer


      Sorry I’m just getting back.

      Were you talking about the Content Keywords report? If so, my guess is that it just isn’t showing you all the keywords it has indexed. I looked at that report for several different sites, and it never shows more than 200 keywords.

      Let me know if that answers your question or if you were talking about a different report.

      Also, the use of h3 tags doesn’t help (or hurt) much, and I’m wondering if you’re engaging in “keyword cannibalism” – this can happen if you’re targeting the keyword heavily on every page. This won’t stop your ultimate keyword from being indexed, but it can stifle rankings in general.

  • Useful article. In meantime, I have a problem with url parameters as my started to appear in Google instead of the root domain only ( for several keywords. Do you think URL Parameters in GWT will solve my issue? Will that link juice still flow even if I configure redirect (parameter) to be ignored?
    Thanks, Darko

  • Robert

    Thank you very much. How do I add location which is UK, but my hosting service is in U.S.A. How can I do that ? I want to respect the rules.
    Thank you very much and have a great weekend 🙂

    • Reid Bandremer

      Hi Robert,

      I’m not sure exactly what you’re asking, but you can set the geographic target wherever you want, regardless of where you host, as long as you have a generic top-level domain (like .com) as opposed to a ccTLD (like .us or .ru). Just be wary that setting a geographic target of U.K. could hurt traffic from other English-speaking countries (like the USA) – if you’re targeting multiple countries, you shouldn’t set a geographic target.

  • Hi Robert, thank you very much for the detailed article on the Webmaster tool. I have one question: how many times a week or month can we fetch the same URL in Google? Does it help with ranking?

    • Reid Bandremer


      Can the Fetch tool help with ranking? Sort of – it can help pages get crawled, and if the content wasn’t already crawled, then it can help. But assuming the content has been crawled, then the Fetch tool doesn’t help with rankings at all.

      I rarely use fetch to submit to index myself – for the sites I work on (which get crawled pretty well), manual submissions are often too time consuming to be worth the effort.

      However, for sites that aren’t getting crawled – maybe a site with little PageRank/link-equity/authority (which means less crawl budget), a brand new site, or a site that had some crawl blockage recently fixed – using submit to index may help. Keep in mind it won’t guarantee indexation; it is just a strong hint to Google that you want the URL crawled – if the URL hasn’t been crawled yet, then that’s the first step to getting traffic for that URL. Note that if you have issues getting pages indexed, it’s best to solve the root issue first and follow SEO best practices, then maybe use the Fetch tool to speed things along.

      By the way, I’m not sure how many times you can submit the same URL – just that “You can submit up to 500 URLs a week.” I’d guess you could submit the same URL 500 times, but there’s no reason to.

  • Hi,
    Found your article and thought it was interesting and easy to read.
    I have a question: what do you mean by:
    “One very important thing to remember when looking at image data is that clicks do not equal visits ― especially for images”? Sorry, but I didn’t find too many references on this – the “especially for images” part, I mean!

  • Nelson

    Great post — thanks for laying it all out so clearly.

    Noah’s bookmarklet is extremely useful, and worked perfectly for me. Double chocolate, indeed.

    It’s enormously helpful to be able to see KW clicks & impressions tied back to specific landing pages, and be able to download them all so easily.

    One question concerning this: Why do the total clicks and impressions in the downloaded .tsv not match the totals I see in GWT Top Landing Pages view? I apologize if this has been answered elsewhere… I feel like I’ve seen it and just can’t find it now.

    Thanks again — really excellent resource you have put together.

  • Hey Nelson,

    I’m going to jump in here. Glad you found it so helpful! I checked the bookmarklet and I’m still getting matching numbers (in the downloaded TSV and in GWT).

    Could you provide some more information on what exactly does not match for you? Are the numbers shifted? Are they close? Seemingly randomly different?



  • Nelson

    Hi Noah,
    Thanks very much for your response, and bear with me if I miss something obvious (as I am not an expert GWT user 😉

    I retraced my steps, and I think the issue has to do with how the TSV file is handling the data in my Top Pages Report.

    As an example, I’m linking below to a screenshot of the data for a page that did not get many visits this month — 42 impressions and 12 clicks total. When expanded to see the KW data, the impressions and clicks only add up to 17 and 2, respectively. And it’s these numbers the TSV seems to be downloading when I click on the bookmarklet.

    Is there some reason why the traffic data for all KWs that brought visits to a landing page should not add up to the total listed next to the url in the Top Pages report?

    In case it helps, here is the info I am seeing in my the Top Pages report:
    306,488 impressions (displaying 141,783) | 6,654 clicks (displaying 6,654)
    The downloaded TSV shows 72,055 impressions and 3,387 clicks.

    Your advice is much appreciated!
    – Nelson

    • Reid Bandremer

      Hey Nelson. Noah is our top expert on this so he’ll confirm, but I just wanted to chime in and say:

      • glad you find this work helpful
      • the comment got stuck waiting for approval – wordpress decided not to auto-approve for some reason
      • The sum of the reported clicks rarely adds up to the real number of total clicks – certainly never on high-traffic sites. This is because GWT samples the data, so not every click on every keyword is reported. I think (though I’m not sure) it’s mostly due to a daily cap on the number of keywords that can be reported – GWT may only report cumulative data from the top 2,000 keywords each day by number of impressions. So the knowledge gap is worst for keywords that get few clicks/impressions on average. Noah goes over the nitty-gritty of how to get a little more keyword click data in his post.
    • Reid Bandremer

      Looking more into this – I remain uncertain about the details of reported search query data. I’ve seen the number 2,000 pop up in a bunch of literature, but I can’t get my own data to prove there is a daily cap of 2,000 keywords. So maybe the data is a representative sample of the total population, or maybe it’s skewed to be accurate for high-impression (or high-click) keywords – I don’t know right now (but am hungry to figure it out).

      I just know they don’t show all the data. If you look at the fine print under the total impressions and clicks, they say how many clicks and impressions they display – we just don’t know exactly what they display.

  • Nelson


    I posted a reply with some more info, but it seems to have disappeared. Let me know if you’d like me to repost.


  • Nelson

    Oops, sorry — it’s there now.

  • Thanks Reid,

    That’s exactly what’s going on. Spot-on explanation. And just to clarify further, the bookmarklet mirrors what is accessible in GWT. So the 2,000-keyword limit means many keywords are omitted from the Top Pages report. As Reid mentions, depending on site size, this keyword data may change often from day to day. And pages that had less-popular search queries may display fewer keywords in the report.

  • Nelson

    Perfect — and thank you both very much for taking the time to explain.

    I imagine this might be why it’s worth automating a download of GWT info daily, rather than monthly.

    Thanks again,

  • Hi, great blog. This GWT is hard work for the uninitiated. What does it mean when my GWT keywords show words like “inc” or “msgid” or “core” or “plugins”? Thanks for your help.

    • Reid Bandremer

      Hi Jane.

      I’m not sure without more info. If you are seeing unexpected keywords in your GWT report, it could mean something unexpected is going on with your site. I briefly did a Google site: search for those words – for example, “msgid” – and I saw some weird pages in Google’s index. Perhaps you have pages being indexed you are unaware of?

  • Jane

    Hi Reid,

    Thanks for the reply. We were just told that a malware opened our directory permissions on all our folders, and pages that should not have been, were indexed. How frustrating. Have you heard of this?

    • Reid Bandremer

      I don’t think I’ve heard of that one Jane. There’s no shortage of various hacks and malware, that’s for sure. Good luck.

  • Jim

    Hi there. Great article and definitely a plus to have as a reference for anyone doing SEO. Thank you so much for putting so much time into it.

    I have an oddball question. If you don’t want your site to be found by Google for SEO purposes, does having a GWT account make that impossible?

    Here’s the situation. I want to promote a product using Adwords, and eventually I will be building a brand site for this product and others. The site I DON’T want indexed is a very direct sale site that really isn’t intended for anything other than traffic that I direct there from Adwords, Radio, Print and Outdoor campaigns. I don’t want Google to think the Direct Response site (which has the word “buy” in front of the brand name) is the Official Site of the brand. Thus we have noindexed it and also added nofollow in the code and blocked it in the robots.txt file.

    Once we have a few other resources completed (such as clinical trials and attorney’s language for the site OK’ed), we will be putting up the “official brand site,” and for that we want to get as much organic/SEO’d traffic as possible. But that won’t be for several months, and we want to move product now through a Direct Response strategy.

    We are using GA for metrics, but need to know: will establishing a GWT account for this site automatically make it so Google will index the site?

    Thanks in advance.

    • Reid Bandremer

      Hey Jim,

      Missed your question till today. I don’t think (though I can’t verify) that just setting up GWT for a site will impact whether or not you get indexed. There are specific features within Google Webmaster Tools that can impact this, however. But certainly, GWT will let you know if it’s getting indexed. And Google Analytics will let you know if it’s got enough Google organic visibility to send Google organic traffic.

  • Hi Reid!

    Thanks for the comprehensive article!

    I’m wondering if you ever find GWT’s ranking data to be VERY OFF on any occasion?

    I’ve been looking at my Google ranking closely, using GWT as well as paid tools (AuthorityLabs, Brightlocal, etc). GWT’s data is very different, and as far as I can tell, very inaccurate.

    For example, for the keyword “black bow tie”, GWT says my average position is 5, with a total of 4 impressions the past week.

    There are many things wrong with this picture. First, “black bow tie” has a global search count of 12,100/month so to be ranked #5 and receive 4 impressions in a week, that’s impossible. Second, I’m not ranked #5 for this keyword – not locally nor globally; not in images nor ads.

    It’s hard to trust the rest of GWT’s data, as much as I want to use it.

    Your thoughts on this appreciated!

    • Reid Bandremer

      Hi Tanya,

      Sometimes some of the search query data looks pretty whacky. 4 impressions is a very low amount, and with low numbers like that, the data could easily be skewed by weird outliers. Perhaps one person triggered your result somehow a few times based on personalization from IP address/geolocation, their google+ network, Google Now, or search history. Or someone used some kind of whacky advanced search that triggered your result. Who knows?

      But it’s generally pretty accurate. I’d look at larger data sets before dismissing all the data.

  • Hi Reid,

    I’m sure every word of your article was great but I got lost at about the fourth paragraph! I could really do with someone taking a look at our site and advising me of improvements that should be made to increase traffic and user engagement.

    Can you have someone contact me and give an outline of the likely cost?



  • Hi Reid,
    thanks for the very informative article.
    I have one question about how often is the number of incoming links displayed on GWT updated? Is there any way to influence it?

    • Reid Bandremer


      Short answer (guess): seems like several times per month.

      I can’t give you a confident answer to either question. I never consistently checked GWT links more than monthly, so I can only be sure that they are regularly updated at least once a month – though it would appear they’re probably updated multiple times per month. I do know there have been complaints about accuracy and freshness in the past, but GWT has been improving and I haven’t seen complaints this year (not that I’ve looked super hard).

      I’m fairly sure there’s no way to influence the number of reported incoming links (“Total links”) other than winning more actual links.

      P.S. Were you talking about the number of links you can download? If so, there are articles on seeing more links; basically, you can create multiple GWT accounts to be able to export more total links.

  • Hi Reid,
    Thanks for this terrific and awesome post on webmaster tools.
    I have been a quiet follower of your posts to learn more from you.
    I have a question – what’s the normal CTR for any website when it comes to search impressions and clicks?


    • Reid Bandremer


      Thanks. I can’t give a simple answer – CTR depends on many factors. Non-controllable factors include your ad competition, type of SERP (traditional, or blended with local carousel, images, news, etc.), purpose of search (branded, casual vs. important, intent), mobile vs. desktop, competitor rich snippets, and competitor organic listing attractiveness. Controllable factors include ranking and listing attractiveness (rich snippets, title tag, meta description, sitelinks).

      That said, benchmarking – though complicated – can still be useful. There’s a great article on a new CTR study – also note its discussion of some of the CTR factors and its table of previous CTR studies.

  • Hello Reid,
    Thanks for this post, Just wanted to know my website is getting CTR of near 5% from the search engines.
    How could i improve the CTR ?


    • Reid Bandremer


      I’ll refer you to my comment to Amaltas. Controllable CTR factors for an individual page include rich snippets, sitelinks, title tag, and meta description. Note also the impact of high rankings and of having the content users are truly searching for.

  • Kalu Charan Parida

    Hello Reid,
    I must say this is a very helpful post.
    It is really great help if you could help me finding the solution of the below problem.

    I see a sudden increase of blocked URLs in GWT for my ecommerce website. When I checked URL parameters, I can see one parameter is monitored 3,000 times by Google, and this parameter is associated with another parameter which is defined in my robots.txt. Is this the reason for the increase in blocked URLs?

    How should I handle this parameter, and how can I decrease the number of blocked URLs back to normal?

    • Reid Bandremer

      Hi Kalu.

      Thanks. If I understand your question properly, I do not have enough information to answer. If the number of URLs “blocked by robots” in Index Status has increased, then removing the robots.txt line blocking those URLs will indeed decrease the number of blocked URLs.

      However, I don’t know if this is the best action to take. Are you blocking URLs that you want to receive traffic to (overblocking)? Or are you blocking URLs that should be blocked (underblocking)? Underblocking is probably more common, but overblocking is more dangerous and can totally destroy SEO.

      The best first step is to figure out exactly what is being blocked.

      Use the GWT robots.txt tester; try different URLs to see what gets blocked, including URLs with the parameter you mentioned.

      You might also want to use the fetch and render in GWT’s Fetch as Google tool to find any resources that are blocked inside a given page.

      You can also paste in batches of URLs to see if any of them are blocked.
      To get a really big list of URLs blocked by robots.txt, you can crawl the site with Screaming Frog. (To do this, you’ll need the paid version. Run two crawls, one that obeys robots.txt and one that ignores robots.txt, then compare the results.)

      Also, be sure to understand how robots.txt works; Google’s documentation has a great table of example path matches near the bottom.
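      If it helps, batch-checking URLs against robots.txt rules is also easy to script. Here’s a sketch with Python’s standard library — the rules and URLs are hypothetical, and note that the stdlib parser only handles simple prefix rules, not Google’s wildcard extensions:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules (simple path prefixes only)
robots_txt = """\
User-agent: *
Disallow: /search
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Hypothetical batch of URLs, e.g. pasted from a crawl export
urls = [
    "http://www.example.com/products/widget",
    "http://www.example.com/search?q=widget",
]
for url in urls:
    status = "allowed" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, status)
```

      For rules using Google’s wildcard syntax (* and $), rely on GWT’s own tester instead, since it matches the way Googlebot actually interprets the file.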

      Good luck.

  • Hi Reid,

    This is very useful, you almost covered all important parts of Webmasters.

    Reid, which software do you suggest to check duplicate content? Screaming Frog doesn’t show duplicate content.

    thank you, looking forward to hearing from you.

    • Reid Bandremer

      Screaming Frog can be used to find duplicate content (it just takes a bit of extra work): the link I provided shows how to find duplicate titles. You can also look at duplicate hashes (which, although rarer, are never a false alarm) and duplicate meta descriptions to flag potential duplicate content.

      For diagnostics, I also frequently use the GWT method I described. I also use Copyscape, especially for finding duplicate content on other sites.

      When I initially audit a site, I use all these tools, plus I do a lot of manual checks. There’s no substitute for knowing the common causes and effects of duplicate content and investigating those.

      I also use Moz campaigns (paid, but it flags pages as duplicate content or not), mainly for monitoring changes in duplicate content status.
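      For what it’s worth, the “duplicate hash” check is easy to sketch yourself: hash each crawled page’s body and group URLs whose content is byte-for-byte identical. The pages below are hypothetical stand-ins for a real crawl:

```python
import hashlib
from collections import defaultdict

def find_exact_duplicates(pages):
    """pages: dict of url -> html body. Returns groups of identical pages."""
    by_hash = defaultdict(list)
    for url, body in pages.items():
        digest = hashlib.md5(body.encode("utf-8")).hexdigest()
        by_hash[digest].append(url)
    # Only hashes shared by 2+ URLs indicate exact duplicates
    return [urls for urls in by_hash.values() if len(urls) > 1]

# Hypothetical crawl: a print parameter serving the same body
pages = {
    "http://www.example.com/page": "<html>same body</html>",
    "http://www.example.com/page?print=1": "<html>same body</html>",
    "http://www.example.com/other": "<html>different</html>",
}
print(find_exact_duplicates(pages))
```

      Exact-hash matches never false-alarm, but they also miss near-duplicates, which is why the title and meta description checks are still worth running.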


  • Thanks for sharing the great article, very useful.

  • Ace Advertising

    Hi Reid ,

    I’ve read your post about submitting our “valid” XML sitemap and your instructions on how to submit to Google.
    I have tried your ideas 3 to 4 times with Google; we waited and waited for them to index our 636 URLs, but Google never indexes them. What can you suggest?

    • Reid Bandremer

      Hi Ace,

      First, apologies for the delayed response – we had switched our commenting platform and I didn’t see this until now.

      Do note that an XML Sitemap only helps get the submitted URLs crawled; beyond helping Google find your URLs, the Sitemap has no further impact on whether or not the URLs are indexed. Once the submitted URLs are crawled, there could be many reasons why Google might decide not to index some URLs in a Sitemap that Webmaster Tools accepts (even when no Sitemap errors are shown). Below are four big reasons:

      • the non-indexed URLs are duplicate content
      • the non-indexed URLs’ unique content is not search engine readable, and Google thinks it is duplicate content
      • you are blocking indexation, perhaps with a meta noindex tag
      • your Sitemap is “dirty” – i.e., the submitted URLs are not status code 200 URLs

        Hope that helps. Feel free to let me know what you find out.



  • John Smith

    Hi, I have a Thailand website. When I download keyword data from Webmaster Tools, I can select a location filter. If I filter traffic from everywhere vs. from Thailand only, impressions/clicks can be significantly different. What is the reason for this? Which data is more relevant to analyze, the filtered or non-filtered? BTW, with the country filter it shows that 25% of traffic is coming from outside of Thailand — is this normal as a benchmark?

    • John,

      Short answer = it’s all relative.

      The Country filter shows how much Google organic traffic comes from an individual country.
      Which data is more relevant to analyze, the filtered or non-filtered? That totally depends on your marketing objectives. If your target audience is only in Thailand, then filtered is more relevant. If you also care about traffic coming from a few other places, then non-filtered is more relevant.

      I don’t have an answer to “with the country filter it shows that 25% of traffic is coming from outside of Thailand, is this normal as a benchmark?” because I have never worked on a website targeting Thailand (plus I don’t know anything about your site) – “normal” is completely relative. I can say that it is not uncommon to have a significant amount of traffic from outside your target geographic region. For example, that report shows 34% of LunaMetrics’s traffic to be from the United States, though this is of course not a relevant benchmark for your site.


  • Very nice article, can we use it in our blog as a source? You can find our blog here

    • Thanks. Of course – I’m never one to turn down a link or mention.

  • One of the best google webmaster tutorial I can say.


