
3 More SEO Tricks in Google Webmaster Tools

Previously, I’ve written about 3 other ways you can use Google Webmaster Tools for SEO – diagnosing 404 errors, examining your inbound links, and seeing how your keywords perform in the SERPs. I’ve also written about ways you can use Bing Webmaster Tools for SEO.

Today, I present you with 3 more ways to use Google Webmaster Tools for your SEO endeavors: XML Sitemaps, sniffing out duplicate content, and checking structured data. What can I say? I’m a thrifty dude, and I love a good free SEO tool.

XML Sitemaps

XML Sitemap main screen in Google Webmaster Tools


In my other piece on Google Webmaster Tools, I made the case that GWT is an absolutely essential part of the SEO toolkit, and that its capabilities for monitoring and diagnosing 404 errors are arguably the biggest reason to start using it. One could also make a compelling argument that the XML Sitemap feature is the big reason every SEO and webmaster needs to be using Google Webmaster Tools. An XML Sitemap is something every website should have; it is an opportunity to tell Google and the other search engines which pages on your site you want crawled and indexed. The search engines don’t guarantee they will abide by the Sitemap, but anecdotal evidence has shown time and time again that XML Sitemaps help ensure your pages are found, and found faster (especially if the Sitemap dynamically updates to include your new pages).

Sitemaps can get tricky, especially when you have a large site or when you use special Sitemaps for images, video, news, mobile, or source code. To ensure you’re doing your Sitemaps right and getting the most out of them, submit them with GWT’s Sitemap feature, which can be found either through the dashboard or in the left navigation, under “Optimization.”

It is recommended that you always validate your Sitemaps to make sure they are built correctly and are fully readable by the engines. And what better way to validate than through the eyes of Google? Simply click the big red “Add/Test” button and test away.

XML Sitemap Test Fail in GWT

Once you’ve submitted a valid Sitemap to Google, however, you should not ignore it. Check in regularly to see if there are any errors or warnings; often, a Sitemap error will reveal a larger problem with your site. In addition, pay attention to the number of URLs (or images, videos, etc.) indexed versus the number submitted. It is not uncommon for there to be a discrepancy here, but one of your SEO goals is to get the search engines to index everything you want indexed. The tricky part is seeing which pages are not indexed (in fact, this topic could warrant its own article), but this may be possible with tools such as Google site: searches, Analytics landing page reports, and scrapers like OutWit. If the pages not indexed are important to you, there are a few things you can do to improve indexation. For example, you could add or adjust tags in the Sitemap: the <priority> tag tells the search engines how important a URL is, and the <changefreq> tag indicates how frequently the page is updated (for example, with links to new pages). Unindexed pages may also be a red flag that those pages lack inbound links, or lack content the engines perceive as unique.
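For reference, here is a minimal sketch of what Sitemap entries with these tags can look like, per the sitemaps.org protocol (the URL and values are hypothetical):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- hypothetical example URL -->
    <loc>http://www.example.com/blog/new-post/</loc>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Keep in mind that <priority> is relative to the other URLs on your own site (it ranges from 0.0 to 1.0, with 0.5 as the default), and that the engines treat both tags as hints rather than commands.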

Sniffing out Duplicate Content

Speaking of content that is not unique, Google Webmaster Tools can help you find duplicate content. As you likely know, it is generally bad practice to have pages that do not contain content unique to that page. The first step in dealing with duplicate content problems is identifying them, and GWT offers one way of doing so that is too simple to ignore: simply check for duplicate Title tags and Meta Descriptions.

HTML Improvements GWT

To do this, go to the HTML Improvements section, found under “Optimization.” Here, GWT will let you know if your Title tags or Meta Descriptions are too long, too short, or duplicated. This is all helpful for discovering opportunities to improve the effectiveness of your Titles and Metas, but it is the duplication reporting that I often turn to for a quick look at duplicate content red flags. Find out which pages share which Title tags and, if there are a lot of duplicate Titles, download the data so you can play around with it in Excel. You’ll be on your way to identifying duplicated sections of your site in no time.
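To give a hypothetical illustration (the URLs and the Title are made up, and the layout is only approximate), an entry in that duplicate Title data often points straight at a URL-parameter problem:

Duplicate title tags: "Blue Widgets | Example Store" (2 pages)
  http://www.example.com/products/blue-widgets
  http://www.example.com/products/blue-widgets?sort=price

When the same Title shows up on a clean URL and a parameterized variant like this, you are usually looking at true duplicate content rather than a simple tagging oversight.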

Checking on Structured Data Markup

Structured Data markup is, like, totally the big thing on the web right now. All the cool kids are doing it, and you should be too. Google builds many varieties of rich snippets from microdata, microformats, and RDFa markup and displays them in the search results pages: recipes, reviews, and much more. By implementing the right markup, you can hope to trigger rich snippets for your pages and dramatically improve click-through rates for a notable bump in search traffic. Well, less than two weeks ago, Google announced it would add a Structured Data dashboard to Webmaster Tools (also found in the “Optimization” drop-down). This exciting new tool enables you to see everything Google knows about your structured data markup. Frankly, it makes me very happy inside.
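To make that concrete, here is a minimal sketch of what review markup can look like in schema.org microdata (the product, reviewer, and rating below are all made up for illustration):

<div itemscope itemtype="http://schema.org/Review">
  <!-- hypothetical product, reviewer, and rating, for illustration only -->
  <span itemprop="itemReviewed" itemscope itemtype="http://schema.org/Product">
    <span itemprop="name">Blue Widget</span>
  </span>
  reviewed by
  <span itemprop="author" itemscope itemtype="http://schema.org/Person">
    <span itemprop="name">Jane Doe</span>
  </span>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span> stars
  </div>
</div>

If markup along these lines validates, Google may (there are no guarantees) display the star rating beneath your listing in the search results.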

Rich Snippets Dashboard in GWT


By viewing stats on structured data for your site as a whole and by type of data, you can verify that Google is picking up new structured data. If the numbers don’t seem to match what you expect, find a page that should be triggering a rich snippet but isn’t, and test it with GWT’s handy-dandy Rich Snippets Testing Tool.

GWT rich snippets tool screenshot

In addition to detecting and diagnosing structured data shortcomings, I really recommend just surfing the structured data tool. Get in there, have some fun, and explore. I think it’s a great way to understand how users see your data through the Google filter, and it can really help you come up with ideas for improving the interactive experience for your brand.

There’s a lot of awesome in Google Webmaster Tools just waiting to make webmasters’ and SEOs’ lives easier, and (even in two blog posts) I feel like I’ve only been able to cover a small portion of the GWT SEO tricks. What are some of your favorite SEO tricks in Google Webmaster Tools?


About Reid Bandremer

Reid Bandremer is a Search Analyst. He brings strong analytical abilities, a penchant for strategy, and a robust business background highlighted by an MBA from Robert Morris and experience in eCommerce marketing. Contrary to popular theory, Reid is not homeless – he just likes staying at the office late because he is passionate about increasing organic search traffic to clients’ sites.

http://www.lunametrics.com/blog/2012/08/13/seo-google-webmaster-tools-2/

21 Responses to “3 More SEO Tricks in Google Webmaster Tools”

Great post! It would be really helpful though to have links to larger versions of your images when using screen captures w/ a lot of detail.

Kevin Ekmark says:

I would have to agree about using GWT for sitemap submission. If you were to do anything at all for your site in GWT, that is the thing I personally find the most value in for monitoring the overall health of a site.

And on a side note… Nothing wrong with being thrifty Reid! :)

Remember the good old days of Yahoo Site Explorer? (sigh)

Reid Bandremer says:

Thanks Kevin! We have one vote for Sitemaps as GWT’s MVP, folks.

Also, have you checked out Bing’s new Link Explorer? It’s like the reincarnation of Yahoo Site Explorer.
http://www.lunametrics.com/blog/2012/07/02/4-ways-bing-webmaster-tools-seo/
http://www.bing.com/community/site_blogs/b/webmaster/archive/2012/06/27/reviewing-link-explorer-and-fetch-as-bingbot.aspx
It’s available in Bing Webmaster Tools: http://www.bing.com/toolbox/webmaster

Happy backlink surfing!

Reid Bandremer says:

To “PPC Scotland”:
Dude, seriously????

Reid Bandremer says:

Thanks for the catch Annie. Noted and followed up on.

Eddie says:

I never thought the Sitemaps feature in Google Webmaster Tools was so useful, and I never checked it regularly, actually. Thanks for your information Reid!

Reid Bandremer says:

My pleasure Eddie!

I use WMT and Analytics to make decisions about my new posts and the keywords I want to optimize.

Sir, you said something about the HTML Improvements section. I went through it and found many errors, mostly page errors like this: http://www.Example.com/2012/01/page2 or page 13, etc. So I used robots.txt to Disallow: *page= and
Disallow: *?page=.
Is that the right way to do it?

Thanks

Reid Bandremer says:

Sunny,

Sorry I’m just getting back to you now – your question kind of slipped thru the cracks during the holiday.

Anyways, the robots.txt can indeed be used to fight duplicate content by disallowing search engines from crawling certain URLs in the manner you indicated. This is a good article on the semantics: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449&from=40364&rd=1.

There are two important things you need to carefully consider in your case, though:
1. You are preventing the search engines from crawling ALL URLs containing the string page= (see the sketch after this list). Be sure these are all duplicate pages you do not want crawled at all.
2. Paginated pages typically have similar content, but they are not purely duplicate. Not every URL that has duplicate Title tags or Meta Descriptions as indicated in the HTML Improvements section is duplicate content that needs to be blocked. Perhaps instead the URLs containing page= simply lack unique Title tags, or they could be addressed in a different manner. This article might help: http://www.seomoz.org/ugc/mitigating-mixed-signals-effectively-consolidating-paginated-urls. You might decide in the end to continue using robots.txt to exclude those URLs, but make sure that is really what you want to do.
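For reference, here is a sketch of the complete robots.txt your rules imply (wildcard patterns like these are honored by Google and Bing, but they are not part of the original robots.txt standard):

User-agent: *
# block any URL containing "page=" anywhere after the first slash
Disallow: /*page=
Disallow: /*?page=

Note that to Google, the first rule already matches everything the second one does, since the * wildcard matches the ? character as well.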

Leona says:

With having so much written content, do you ever run into any issues of plagiarism or copyright infringement?
My site has a lot of exclusive content I’ve either written myself or outsourced, but it appears a lot of it is popping up all over the internet without my agreement. Do you know any techniques to help prevent content from being ripped off? I’d definitely appreciate it.

Reid Bandremer says:

Sometimes our (and our clients’) content gets scraped, but I have yet to have a huge issue where the copied content outranks the originals, so I’m not much of an expert on the matter. We do a bit of plagiarism monitoring with Copyscape, but again, I wouldn’t worry about content being scraped as long as your originals are on top in the rankings. Make sure you have internal links and brand mentions in your content, and scrapers may actually help by giving you backlinks and brand mentions. In addition, make sure you’re linking to new content from high-authority pages to ensure it will outrank the scraped versions during that important period when it is brand new and Google needs to figure out who the originator is. Also, this article is good: http://www.wpbeginner.com/beginners-guide/beginners-guide-to-preventing-blog-content-scraping-in-wordpress/. Good luck!

Chris says:

I had no idea about the Structured Data Markup section – thanks for the tip Reid!

Reid Bandremer says:

NP Chris. Thanks for commenting!

Hi, are you using WordPress for your blog platform? I’m new to the blog world, but I’m trying to get started and set up my own. Do you require any coding expertise to make your own blog?
Any help would be greatly appreciated!

Jenny B. says:

Excellent article. I just wish Google would add a detailed list of blocked or problematic pages; after all, they usually profit from all our content pages. J.

Reid Bandremer says:

Hi Savan. Well, if it says “We didn’t detect any content issues with your site.” then you don’t have any HTML issues Google can find. So long as Google can find all your content, that particular message is a good sign.

What was the exact message?

Mathukutty says:

My site is in Drupal, and I use the xmlsitemap module. This module automatically submits the updated sitemap to Google and Bing. Is it OK, or should I submit directly thru Webmaster Tools? I found that even after correcting the HTML improvements found in GWT, old warnings are still showing. When will they update this?

Reid Bandremer says:

Hi Mathukutty,

“Is it OK, or should I submit directly thru Webmaster Tools?” You definitely should also submit thru Webmaster Tools – not so much to ensure the Sitemap gets crawled, but for the very valuable insight you can gain thru the Webmaster Tools Sitemaps report.

“I found that even after correcting the HTML improvements found in GWT, old warnings are still showing. When will they update this?” I can’t provide too much insight without actually working with the site and GWT, but... it is typical for GWT to have a lag of up to a few days in updating many of its reports. In addition, it can take a while for crawling, indexing, and purging outdated pages/content from the index. It really depends on how much the site is crawled – higher-traffic sites with more authority tend to get crawled and updated more frequently. My advice is to keep a close eye on things and make sure you did correct everything – that there is no duplicate content hanging around causing the warnings to stick.

Hope that helps.

Reid Bandremer says:

BEL19VE,

Structured data issues can get pretty technical and complex, so I can only scratch the surface.

One thing you need to do to troubleshoot is to plug pages into the structured data testing tool.

If a type of structured data you want to be picked up does not show up in the testing tool, then the code is not there. That would mean you didn’t fill in the data fields in the schema plugin, or the plugin may not be working.
Another scenario: a piece of desired structured data shows up in the tool but displays errors.
In a third scenario, the data shows up in the testing tool with no errors, but it still may not be following Google’s required guidelines; you’ll need to ensure conformance in order for the structured data to be picked up. If this is the scenario, read up on the Google Webmaster Tools Help info corresponding to the rich snippet you’re going for. And if you feel you’re doing the markup exactly as Google intended, there’s also a chance it is not displaying because Google does not trust the data.

Note that Google is especially finicky on reviews: https://support.google.com/webmasters/answer/146645.

Hope that helps.

BEL19VE says:

Reid,

Thanks for the suggestions. I’ve been using the structured data tool, and I just saw 3 errors on it:

Error: Missing required field “entry-title”.
Error: Missing required field “updated”.
Error: Missing required hCard “author”.

I’ve been dealing with those for the past 24 hours and still can’t resolve them. I found some online tutorials, but nothing seems to do the trick so far.

Otherwise, all the pages are shown well in the structured data tool, exactly as they should be. However, Webmaster Tools still does not pick them up.

Will keep trying. Thanks again!