Archive for the ‘Miscellaneous’ Category
Making the Right Choice – Part 3 of 3
In the first two parts of this three-part series, we discussed the issues facing us as we evaluate potentially outdated content, and we investigated options for handling that content. In part 3, we discuss how to pick the right options.
Matching Option to Scenario(s)
By now, you should have answers to important questions like, “How much effort is this worth?”, “What are my SEO needs?”, and “What are my UX issues?”
You can now use the table below, which shows the impact of your options for handling old content on labor costs, SEO, and UX.
Options for Dealing with Old Content
This post is part of a series on how to handle old and outdated content. Part 1 focused on your internal resources and the reasons you may want to update old content. Part 2 focuses on the six types of options you have for updating old content, and Part 3 will help you make the right decisions.
As you identify problem pages, whether they’re outdated, incorrect, or no longer relevant, you can also start thinking about the best way to fix these pages.
In October 2014, the Google Tag Manager team announced a new version of their popular tool, complete with easier workflows, a brighter design, and many other wonderful features. Most things work in a familiar fashion, with a few name changes.
Macros are now called Variables, and the Lookup Table Variable works exactly as we would expect it to. Sadly, there is still no support for CSV upload, so there still exists a need for a tool that people can use to quickly copy and paste from Excel or Google Drive.
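To illustrate the kind of transformation such a copy-and-paste tool performs, here is a minimal Python sketch (not the actual tool, and not GTM’s container-export format — the output shape is a simplified illustration). Rows copied out of Excel or Google Drive paste as tab-separated text, which parses cleanly into the key/value pairs a Lookup Table Variable expects:

```python
import csv
import io

def rows_to_lookup_table(pasted: str):
    """Parse tab-separated rows (as pasted from a spreadsheet)
    into key/value pairs for a lookup table."""
    reader = csv.reader(io.StringIO(pasted.strip()), delimiter="\t")
    table = []
    for row in reader:
        # Skip blank or single-column rows
        if len(row) >= 2 and row[0].strip():
            table.append({"key": row[0].strip(), "value": row[1].strip()})
    return table

# Example: two columns copied straight out of a spreadsheet
pasted = "example.com\tUA-1111-1\nexample.org\tUA-2222-2"
print(rows_to_lookup_table(pasted))
```

The hostnames and tracking IDs here are made up for the example; in practice you would paste your own two columns and map the result into the lookup table’s input/output fields.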
I created a clunky workaround for Version 1, and at the request of many, I’ve now created an updated version that works with the new interface. As the GTM team continues to improve the design and functionality of Version 2, this tool may eventually stop working, and will hopefully become unnecessary.
The year was 2004, and I was unemployed.
So I networked, networked, networked, and then I wanted to thank all those people who took time out to have coffee with me. Often, I heard myself saying, “Why don’t you let me evaluate your web analytics?” Most of those companies had some crummy server-side analytics, and somehow I found insight in all those metrics.
Gradually, I found a business for myself and landed a few gigs. I was an early user of Google Analytics, and I kept trying to figure out how to make this interesting tool more powerful. What is that setVar thing, I kept wondering, and what is so regular about those expressions?
Even though I really didn’t understand why _udn should be equal to none, I still knew more about analytics than I did about SEO. So I hired my first search employee, Taylor Pratt, in 2006. By March 2007, I had written enough about Regular Expressions that Google asked our company – all two of us – to become a Google Analytics Certified Partner. I found out that we had been listed on the Google Analytics Partner page when Sirius/XM called us for GA consulting.
Every discussion about the importance of specialization in marketing comes with a disclaimer: it cannot come with the risk of total tunnel vision. Digital marketers must maintain a broad understanding of each channel in their marketing mix, applying lessons and strategies from one to the others.
This article provides a brief overview of public relations (PR) and the three things that we should all learn from publicists: Personalize, Evolve, and Provide Value.
Here’s a quick tutorial on how to use Excel to analyze the keywords that have more than one of your site’s pages ranking in Google organic search results.
Your site may have plenty of keywords with more than one landing page ranking, for a variety of reasons. For example, when someone googles “Google Analytics Training”, there are many different LunaMetrics pages that might appear, based largely on where the user is located.
Let’s look at how we can break these out and analyze them further.
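The same breakout can be sketched outside Excel. Assuming a simple export of (keyword, landing page) rows — the rows below are hypothetical stand-ins for real ranking data — this Python snippet flags keywords where more than one distinct page ranks:

```python
from collections import defaultdict

# Hypothetical (keyword, landing page) rows, e.g. from a rankings export
rows = [
    ("google analytics training", "/training/pittsburgh"),
    ("google analytics training", "/training/boston"),
    ("google tag manager", "/blog/gtm-lookup-tables"),
    ("google analytics training", "/training/pittsburgh"),  # duplicate row
]

# Collect the distinct set of ranking pages per keyword
pages_by_keyword = defaultdict(set)
for keyword, page in rows:
    pages_by_keyword[keyword].add(page)

# Keep only keywords where more than one distinct page ranks
multi_page = {kw: sorted(pages)
              for kw, pages in pages_by_keyword.items()
              if len(pages) > 1}
print(multi_page)
```

Using a set per keyword deduplicates repeated rows automatically, which mirrors what a pivot table with a distinct count would give you in Excel.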
Raise your hand if you’ve heard a co-worker say “Ugh, I’ve gotta jump on a call”! Most people don’t look forward to phone calls with clients. There’s the inherent fear that you’re not prepared (it’s hard to imagine the audience in their underpants when you’re only calling one person across the country), or that you don’t have the right report or solution lined up.
If you work in the Search & Analytics fields like we do at our office, it’s quite possible that you have not and will not meet certain clients face-to-face due to distance, so building rapport can be a challenge. You just don’t get to shoot the breeze on the phone like you might during an on-site visit or lunch with your client.
In fact, relationship building is my favorite part of working with clients. Helping them succeed and meet their objectives helps me succeed and meet mine, so I invest in good client relationships wherever I can.
If you don’t share my excitement over client calls, I’ve assembled the following presentation to help you ease any fears when preparing for and executing your next client call.
Did you ever want micro-level geographic information inside Google Analytics? What if you really need “street-level” knowledge about your users, like where they are and what neighborhoods they’re in? Often, when we talk and write about Google Analytics, we’re thinking about the big picture: national or even international traffic, filtering by country, comparing one region to another. We’re thinking macro, not micro.
I wrote previously comparing DMA areas to gain insight, but that’s really only helpful if you have a true national or bigger presence. What if you’re just a local Seattle business, and don’t really have much call for looking at traffic outside the Seattle-Tacoma metro area?
Well, the first thing you should do is consider taking our Seattle Google Analytics, AdWords, and Tag Manager Training (shameless plug). Second, read on…
Seattle is actually ahead of the game when it comes to data, which is the real reason I’m using it as an example. The city has a Chief Technology Officer, and data.seattle.gov was launched in 2010 as a central hub for all local Seattle data. In fact, a number of businesses have said that using this local data helped their operations.
How so? Well, if you’re a local business then the traffic from, and information about, the Queen Anne neighborhood of Seattle might be more important to you than Downtown or Riverview.
But how can you use Google Analytics to help you at this sort of granular level? And what if you DO care about national-level data, but you also care about it at a very granular local level, maybe looking for interest in your brand to help place billboards or expand your franchising? The truth is that you can’t, at least not right out of the box. But with a few very easy additions, you can start getting great local data that lets you make street-level decisions about your business in Google Analytics.
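As a sketch of one such addition: if you can capture a visitor’s approximate latitude and longitude (for example via a geolocation API, with consent), you could bucket it into a neighborhood label before sending it along as a custom dimension. The bounding boxes below are purely illustrative, not real neighborhood boundaries; real polygons could come from a source like data.seattle.gov:

```python
# Illustrative bounding boxes: (min_lat, max_lat, min_lng, max_lng).
# These are NOT real neighborhood boundaries, just a toy lookup.
NEIGHBORHOODS = {
    "Queen Anne": (47.62, 47.65, -122.37, -122.34),
    "Downtown":   (47.60, 47.62, -122.35, -122.32),
}

def neighborhood_for(lat: float, lng: float) -> str:
    """Map a coordinate to the first neighborhood box containing it."""
    for name, (lat_min, lat_max, lng_min, lng_max) in NEIGHBORHOODS.items():
        if lat_min <= lat <= lat_max and lng_min <= lng <= lng_max:
            return name
    return "(other)"

# A point inside the illustrative Queen Anne box
print(neighborhood_for(47.63, -122.36))
```

The returned label is exactly the kind of low-cardinality string that fits a Google Analytics custom dimension, letting you segment reports by neighborhood instead of just city or region.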
The news that Google plans to drop Authorship photos from the search results was unexpected, but probably not shocking to many SEOs. The change is intended to provide a better mobile experience by decluttering the results, said Google’s John Mueller.
All speculation and mourning aside, we are curious how this affects Google users that have grown accustomed to Authorship photos and Circle information.
In my role here at LunaMetrics, I talk a lot about blogs. I love advising our clients on creating engaging blogs with unique, clever content. Lately, our team has been having lots of discussions about our own site, since everyone at our company contributes to our blog.
When I came across Matthew Barby’s awesome method for scraping websites to identify link prospects, I immediately wondered what trends I could identify from our own blog using this method. In this post, I’ll examine our blog with third-party tools to extract some actionable insights.
Scraping websites is an awesome way to collect data (provided you’re not violating anyone’s Terms of Service). In this example, I used Screaming Frog to crawl an entire website (yours!), SEO Tools for Excel to crawl elements of the site’s pages, and an API to identify social shares. When it comes to competitor research or building a list of potential press outlets and authors to contact, this technique can’t be beaten, because you don’t even need Google Analytics access to amass this data. We’re taking it automatically, right from the page.
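The page-element extraction step can be approximated in plain Python without any of those tools. This standard-library-only sketch (not Screaming Frog or SEO Tools for Excel) pulls the title and an author meta tag out of a page’s HTML; the sample page and author name below are hypothetical stand-ins for a crawled blog post:

```python
from html.parser import HTMLParser

class PageInfoParser(HTMLParser):
    """Collects the <title> text and any <meta name="author"> value."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.author = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name") == "author":
            self.author = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        # Accumulate only text that appears inside <title>…</title>
        if self.in_title:
            self.title += data

# Hypothetical page standing in for one crawled blog post
html = ('<html><head><title>Scraping for Link Prospects</title>'
        '<meta name="author" content="Jane Blogger"></head>'
        '<body>Post body…</body></html>')
parser = PageInfoParser()
parser.feed(html)
print(parser.title, "|", parser.author)
```

Run over every URL in a crawl list, this yields the same kind of page-level table (URL, title, author) you would then enrich with social-share counts from an API.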