
(Screenshots: Google Analytics and Google Analytics Premium tracking code from TransUnion's News section)

Background: Web analytics is a technically sophisticated tool used by marketers to gather, calculate and display advanced statistical data. Businesses depend on accurate data from web analytics to make informed business decisions about interactive marketing initiatives and to measure ROI. For web analytics to collect data and report accurately, analytics tracking code must be installed in each page and the site architecture must support accurate reporting.

To accurately measure traffic from organic search engines, analytics tracking code must be properly implemented in pages seen by users in search results.

Experiment: To determine whether Google's new Analytics Premium tracking code is properly implemented at URLs indexed in search results.

Procedure: Query using Google's advanced 'site:' operator for both Google Analytics Premium test case sites. Inspect the first 25 URLs shown in search results to be certain they contain functional analytics tracking code.
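The inspection step above can be partly automated. Here is a minimal sketch that scans fetched page source for common Google Analytics markers; the marker patterns (classic ga.js/_gaq and UA-style account IDs) and the sample page sources are assumptions for illustration, not taken from the sites tested.

```python
import re

# Markers that commonly indicate a Google Analytics installation:
# the classic _gaq queue, the ga.js script URL, and a UA-XXXXX-Y account ID.
GA_MARKERS = [
    re.compile(r"\b_gaq\b"),
    re.compile(r"google-analytics\.com/ga\.js"),
    re.compile(r"\bUA-\d{4,10}-\d{1,4}\b"),
]

def has_ga_snippet(html: str) -> bool:
    """Return True if the page source contains a recognizable GA snippet."""
    return any(marker.search(html) for marker in GA_MARKERS)

# Hypothetical page sources fetched for the top URLs from a site: query.
tracked = '<script src="http://www.google-analytics.com/ga.js"></script>'
untracked = "<html><body>No tracking here.</body></html>"

print(has_ga_snippet(tracked))    # True
print(has_ga_snippet(untracked))  # False
```

A string check like this only proves the snippet is present, not that the JavaScript actually executes, so it complements rather than replaces manual inspection.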

Results: TransUnion.com
- Two of TransUnion.com's top three URLs currently shown to users in Google search results don't have Google Analytics implemented properly for accurate reporting. One of the top three results uses a 302 redirect to a page tracked by Omniture. Other pages with improperly implemented analytics for tracking organic search traffic include TransUnion's Newsroom and its child pages, approximately 400 pages in all (e.g., News Releases, Press Kit, Social Media Releases).

Results: Travelocity.com
- Two of Travelocity.com's top three URLs currently shown to users in Google search results don't have Google Analytics implemented properly. Other URLs indexed in Google search results without properly implemented analytics for tracking search traffic include Travel for Good and the Green Hotel Directory, as well as their child pages, approximately 50 in all.

Conclusion: Google Analytics Premium is not implemented properly to track organic search traffic. In order to track organic search traffic, analytics must be implemented in pages indexed in search results and must be functional. No matter which analytics package you use, how much it costs or who installs it, be sure analytics tracking code is properly implemented in pages seen by users in organic search results.

To ensure analytics is properly implemented in pages indexed in search results, cross reference URLs from Google Webmaster Tools with URLs reported for Google in analytics.
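The cross-reference described above boils down to a set difference. A minimal sketch, with hypothetical URL lists standing in for a Webmaster Tools export and an analytics organic-traffic report:

```python
# Hypothetical URL lists: one exported from Google Webmaster Tools,
# one pulled from the analytics report for Google organic traffic.
gwt_urls = {
    "/newsroom/",
    "/newsroom/press-kit.html",
    "/products/credit-report.html",
}
analytics_urls = {
    "/products/credit-report.html",
}

# URLs Google shows in SERPs that analytics never reports are the ones
# most likely to have missing or broken tracking code.
untracked = sorted(gwt_urls - analytics_urls)
print(untracked)  # ['/newsroom/', '/newsroom/press-kit.html']
```

Any URL in the Webmaster Tools list but absent from analytics deserves a hands-on look at its tracking code.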

One of the biggest Super Bowl XLV TV commercial fumbles, in terms of search engine marketing, was GoDaddy's ad for GoDaddy.co featuring Joan Rivers and Jillian Michaels. It took several minutes for the URL to even appear in Google search results, a delay probably due in part to the JavaScript redirect employed. Once indexed, the URL provided no meta description and as a result appeared without a snippet. Snippets in search results help users and increase click-through rates. To add insult to injury, when GoDaddy.co finally appeared in search results it did so directly under a competitor's ad with no GoDaddy ad in sight.

Over the holidays, Google rolled out a pretty major update to Webmaster Tools. This latest update provides much more detail in terms of data and reporting. So much, in fact, that some folks now seem confused about the difference between Google Webmaster Tools and Google Analytics. The big difference for SEO is that Google Webmaster Tools shows Google's own data for URLs in Google SERPs and doesn't track web pages the way Google Analytics does. In addition to the key difference in reporting, Google Webmaster Tools requires no installation. While it's difficult to say for sure, this update should force folks to abandon the "ignorance is bliss" mentality about analytics reporting once and for all.

BUT before diving in, here is a little background...

In 2005, with a little help from a Googler named Vanessa Fox, Google launched Google Sitemaps. This program has since evolved into what we know as Google Webmaster Central. Around the same time, Google bought Urchin and shortly after made Google Analytics free to everyone. Back then, small- to medium-sized sites that couldn't afford enterprise analytics relied primarily on ranking reports to measure search visibility.

Ranking reports are created with software that emulates users and sends automated queries to search engines. The software then records data about positioning within organic search results by specific keywords and URLs. Ranking reports don't provide bounce rates, but they do provide an important metric for measuring SEO ROI directly from Google SERPs. That being said, automated queries from ranking software are expensive for search engines to process, and as a result they are a direct violation of search engine guidelines.

In 2010 Google introduced personalization features in organic search engine results. These personalized results are based on the user's history, previous query, IP address and other factors determined by Google. Over the past two years, Google's personalized search results have rendered ranking reporting software nearly useless.

Enter analytics… Without accurate ranking reports, analytics may seem like a decent alternative tool for measuring SEO ROI by URL, but is that really the case? If analytics were enough, why did Google recently update Google Webmaster Tools? These are a couple of the questions that I hope to answer.

To start off, let's establish a few laws of the land...

Google Webmaster Tools Update Case Study: Redirects

Experiment: To compare 301 and 302 reporting accuracy between Google Analytics and Google Webmaster Tools

Hypothesis: Google Analytics incorrectly attributes traffic when 301 and/or 302 redirects are present.

Background: Google ranks pages by URL; for that reason, accurate reporting by specific URL is critical. In order for Google Analytics to record activity, a page must load and the Google Analytics JavaScript must execute. Google Analytics reports based on page views, not URLs. While most "pages" have URLs, not all URLs result in page views. This is the case when 301 and/or 302 server-side redirected URLs appear in search results.

Procedure: For this comparison, I created apples.html and allowed it to be indexed by Google. I then created oranges.html and included a noindex meta tag to prevent indexing until the appropriate time. After apples.html ranked in Google's SERPs, it was 301 redirected to oranges.html and the results were recorded.
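The attribution problem at the heart of this experiment can be shown with a toy model. This is a sketch under stated assumptions: the redirect map and URLs are hypothetical, and "the page Google Analytics credits" is modeled simply as the final page in the chain, since that is the only page that loads and fires the tracking JavaScript.

```python
# Toy model of the experiment: a map from requested URL to
# (HTTP status, destination). URLs absent from the map serve a page directly.
redirects = {"apples.html": (301, "oranges.html")}

def final_page(url):
    """Follow redirects to the page that actually loads and fires GA JS."""
    while url in redirects:
        status, url = redirects[url]
    return url

# The user clicks apples.html in the SERP...
landing = "apples.html"
# ...but only the final page loads, so Google Analytics credits it instead:
print(final_page(landing))  # oranges.html
```

Webmaster Tools, by contrast, reports apples.html itself, because that is the URL Google actually showed in the SERP.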

Result:
According to Google Analytics, oranges.html is driving traffic from Google SERPs via "apples"-related keywords. Google Webmaster Tools, on the other hand, reports each URL individually by keyword and notes the 301 redirect.

Conclusion: Google Analytics reports that oranges.html is indexed by Google and ranks in Google SERPs for apples.html-related keywords. However, reporting that data to clients would be a lie. Oranges.html hasn't been crawled by Google and isn't actually indexed in Google SERPs. Secondly, until Google crawls and indexes the URL oranges.html, it is impossible to determine how or if it will rank in Google search results. In addition, this data becomes part of the historical record for both URLs and is calculated into bounce rates for URLs not shown in SERPs.

(Google's Caffeine has improved the situation for 301 redirects, as the time between discovery and indexing is reduced.)

Google Webmaster Tools Update Case Study: Complex redirects

Experiment: To compare differences in tracking via multiple redirects from SERPs ending on off-site pages.

Hypothesis: Multiple redirects ending off-site are invisible to Google Analytics because there is no page load.

Background: Google ranks pages by URL; for that reason, accurate reporting by URL is critical. In order for Google Analytics to record activity, a page must load and the Google Analytics JavaScript must execute. While most "pages" have URLs, not all URLs render pages. In most cases 301 issues are resolved by engines over time; however, 302 issues will remain. The same is the case for multiple redirects ending off-site.

(For those who aren't aware, this is one way spammers try to trick Google into "crediting" their site with hundreds, thousands, and sometimes even hundreds of thousands of content pages that actually belong to someone else.)

Procedure: To test how Google Analytics handles multiple redirects, I created page1.html, which 302 redirects to page2.html, which 301 redirects to another-domain.com. Google indexes the content from another-domain.com, but SERPs show it as residing at the URL page2.html.
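The chain in this procedure can be traced with the same toy-model approach. The hostnames and the redirect map below are hypothetical stand-ins for the test pages:

```python
# Hypothetical redirect chain from the second experiment:
# page1.html --302--> page2.html --301--> another-domain.com
redirects = {
    "example.com/page1.html": (302, "example.com/page2.html"),
    "example.com/page2.html": (301, "another-domain.com/"),
}

def trace(url):
    """Return every hop in the redirect chain, ending at the final page."""
    chain = [url]
    while url in redirects:
        status, url = redirects[url]
        chain.append(f"{status} -> {url}")
    return chain

for hop in trace("example.com/page1.html"):
    print(hop)
# example.com/page1.html
# 302 -> example.com/page2.html
# 301 -> another-domain.com/

# The only page that loads (and could fire tracking code) is off-site,
# so analytics on example.com records nothing for these URLs.
```

Webmaster Tools sees the first two hops because Google crawled them; analytics never can, because no page on the site ever loads.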

Result: Despite being ranked in SERPs, Google Analytics has no data for these URLs. Google Webmaster Tools reports the first two URLs and notes the redirects.

Conclusion: Google Webmaster Tools recognizes the existence of the URLs in question while Google Analytics doesn't at all, and that is a major problem. For SEO reporting these URLs are critical: the content is real, and it's impacting users as well as Google.

Google Webmaster Tools Update Case Study: Installation

Experiment: To compare tracking without Google Analytics tracking code installed.

Hypothesis: Google Analytics won't track if the tracking code is not installed properly on each page within a site architecture that supports analytics.

Background: In order for Google Analytics to record data, it must be implemented correctly in each page and be able to communicate with Google. Legacy pages without the Google Analytics tracking code often rank in SERPs but go unnoticed because they're invisible to analytics. In addition to this issue, there are various other situations where untracked content appears in Google's index. Even when implemented properly, analytics tools are often prevented from reporting due to architectural problems.

Procedure: To test how Google Analytics works without proper installation, I set up an account but DID NOT implement the Google Analytics tracking code snippet in any pages.

Result: Google Analytics reports that there has been no traffic and that the site has no pages, but Google Webmaster Tools reports, as usual, impressions by keyword and URL, CTR and other metrics.

Conclusion: In order to function properly, Google Analytics must be implemented in each and every page and be supported by the site architecture. Google Analytics requires extensive implementation in many cases, which is an extra obstacle for SEO. Google Webmaster Tools data comes directly from Google, requires no implementation, and verification is easy.

Google Webmaster Tools Update Case Study: Site Reliability

Experiment: To see how Google Analytics tracks pages when a website goes offline.

Hypothesis: Google Analytics will not track site outages.

Background: In order for Google Analytics to record data, it must be properly implemented, supported by the site's architecture and able to communicate back and forth with Google.

Procedure: To test how Google Analytics reports when a site goes offline, I turned off a website with Google Analytics installed.

Result: Google Analytics reports no visitors or other metrics but suggests nothing about the real cause. Google Webmaster Tools reports errors suggesting the site was down.

Conclusion: Google Analytics does not report site outages or outage error URLs whereas Google Webmaster Tools does. For SEO, site uptime is critical.
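Since analytics can't flag outages, monitoring has to come from outside the site. A minimal sketch of an external uptime check, with the HTTP fetcher injected as a parameter so the example stays self-contained; here the fetchers are simulated rather than real network calls:

```python
# Minimal sketch of an external uptime check -- the kind of signal
# analytics cannot provide. `fetch` is any callable that takes a URL
# and returns an HTTP status code (or raises OSError on failure).

def check_uptime(url, fetch):
    """Return 'up' on HTTP 200, otherwise 'down' (errors count as down)."""
    try:
        status = fetch(url)
    except OSError:
        return "down"
    return "up" if status == 200 else "down"

# Simulated fetchers standing in for a real HTTP client:
print(check_uptime("http://example.com/", lambda url: 200))  # up
print(check_uptime("http://example.com/", lambda url: 503))  # down
```

In practice the fetcher would be a real HTTP client run on a schedule from a machine outside the site's own infrastructure.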

Final thoughts...

As illustrated above, analytics will report keywords for URLs that aren't indexed and won't report keywords for URLs that are indexed in SERPs. Analytics is unaware of redirected URLs, even those indexed by Google and seen by users worldwide. Analytics can't tell the difference between a brief lack of visitors and periods of site downtime. It's possible for analytics tracking code to fire without pages loading, and for pages to load without firing tracking code. Analytics doesn't know framed content is indexed, or about legacy pages without tracking, alternative text versions of Flash pages, how long pages take to load, and on, and on, and on....

In fairness, the tool is doing what it is designed to do; folks using it just don't understand the limitations. Oftentimes, they aren't aware that data is fragmented and/or missing, or that site architecture impacts reporting ability. For some reason, checking Google to see whether SERPs jibe with reports never seems to occur to them.

I've been kvetching about these issues for years, to anyone and everyone who would listen. If you can't tell, few things F R U S T R A T E me more.

The case studies above represent just a few ways in which analytics data is skewed due to bad and/or missing data. Believe it or not, a substantial amount of analytics data is bogus. According to one Google Analytics Enterprise partner, 44% of pages with analytics have analytics errors. On average, analytics only tracks about 75% of traffic. Analytics is a weird beast: when something goes wrong, nothing happens in analytics, and sometimes it happens on invisible pages. :)

Bad data attacks like a virus from various sources, polluting reporting exponentially, silently and undetected over time. Sadly, very few folks, including most "analytics experts," have the experience or expertise to track down issues like these by hand. Until now there has been no tool to report on analytics not reporting. The recent Google Webmaster Tools update empowers webmasters by providing them with the best data available. This update exposes analytics issues. It also places the burden of proving data measurement accuracy back on the folks responsible for it.

Oh yeah, HAPPY NEW YEAR!

In case you missed it, Logitech finally released Revue for Google TV. While I'm really excited about the Revue launch and Google TV, I'm a little concerned by Google TV's website. I know, Google doesn't have the best "track record" when it comes to things like SEM, but Google.com/tv is especially bad.

The new Google TV minisite has two duplicate "homepages", one in Flash and one not in Flash. This kind of duplicate content acts to thin both keyword relevancy and PageRank. In addition to thinning, Google filters non-malicious duplicate content from SERPs. Neither one of Google TV's homepages provides a properly formatted meta description; in fact, the meta description for the non-Flash version is empty, and it has a different TITLE element than the Flash version. As far as content, neither one of Google TV's homepages provides accurate "alternative" textual content for users without Flash and/or JavaScript. For tracking, they seem to be using a customized version of Google Analytics, possibly even with heat mapping functionality, but it fires before the onload event, which hurts page speed and degrades user experience.
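Auditing a page for a missing or empty meta description is easy to script. A minimal sketch using the standard library's HTML parser; the sample page source below is hypothetical, merely shaped like the empty-description case described above:

```python
from html.parser import HTMLParser

class HeadAudit(HTMLParser):
    """Collect the <title> text and meta description from page source."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = None  # None means no description tag at all
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page with an empty meta description.
page = ('<html><head><title>Google TV</title>'
        '<meta name="description" content=""></head></html>')
audit = HeadAudit()
audit.feed(page)
print(audit.title)              # Google TV
print(audit.description == "")  # True: empty description means no snippet
```

Run against every indexed URL, a check like this quickly surfaces pages that will appear in SERPs without a usable snippet.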

Matt Cutts first mentioned speed publicly as a potential ranking signal in November 2009, but speed has always been important at Google. Google's homepage, for example, is intentionally sparse so that it loads quickly. Larry Page recently said he wants to see pages "flip" online. Clearly the concept of speed and its importance at Google is nothing new. Robert Miller actually conducted the first research in this area over 40 years ago. According to Miller, when a computer takes more than one tenth of a second to load, the user feels less and less in control. When Google and Bing conducted their own tests in 2008, the results were similar to what Miller had predicted: Bing experienced a 1.8% reduction in queries when slowed by 2.0 seconds, and Google experienced a 0.59% reduction in queries when slowed by 400 milliseconds. Bottom line: fast pages help marketers because users are far less likely to abandon fast-loading sites. Users expect pages to load quickly!

Page Speed was introduced as a ranking signal for English queries executed via Google.com nearly a month ago. It's defined as "the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser." In their first major update to Webmaster Guidelines in over a year, Google recommends webmasters monitor site performance and optimize page load times on a regular basis.

Because Page Speed data is missing from Analytics and other tools, it's best to use the Site Performance feature in Google Webmaster Tools for regular monitoring and optimization. To view Site Performance data in Google Webmaster Tools, you'll need to add and verify your site first. In Google Webmaster Tools, Site Performance data is processed in aggregate and without personally identifiable information. "Example" URLs are derived from actual Google Toolbar user queries, and as a result query parameters are removed. "Suggestions" are based on URLs crawled by Google and are not truncated. "Average" site load times are weighted by traffic. Site Performance data accuracy is based on the total number of data points, which can range from 1-100 data points (low) up to 1,000+ data points (high). As mentioned earlier, this data comes from Google Toolbar users with the PageRank feature enabled. While Google hasn't provided Toolbar PageRank user demographic information, this data seems fairly reliable. If anything, Toolbar PageRank bias would point to more savvy users, and as a result Google Webmaster Tools Site Performance data might report faster load times than users actually experience.

During our "Speed" session at SMX, Vanessa Fox and Maile Ohye (Google) seemed to agree that less than 4.0 seconds was a good rule of thumb, but still slow. According to Google AdWords, "the threshold for a 'slow-loading' landing page is the regional average plus three seconds." No matter how you slice it, this concept is fairly complex. Page Speed involves code, imagery, assets and a lot more, not to mention user perceptions about time. For example, the user's internet connection (Google Fiber), their browser (Chrome), device (Nexus One), its processor (Snapdragon), the site's host and other issues all impact user perceptions about speed. Don't worry though; according to most expert estimates, 80% to 90% of page response time is spent on the front end. At the end of the day, relevance is critical, but speed follows quickly behind.
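The AdWords rule of thumb quoted above is simple enough to express directly. A sketch with hypothetical load times and a hypothetical regional average, since AdWords doesn't publish the actual regional figures:

```python
# AdWords rule of thumb: a landing page is "slow-loading" when its load
# time exceeds the regional average plus three seconds.
SLOW_MARGIN_SECONDS = 3.0

def is_slow(load_time, regional_average):
    """True if the page crosses the 'slow-loading' threshold."""
    return load_time > regional_average + SLOW_MARGIN_SECONDS

# Hypothetical numbers: a 2.8s regional average puts the threshold at 5.8s.
print(is_slow(load_time=6.5, regional_average=2.8))  # True
print(is_slow(load_time=3.9, regional_average=2.8))  # False
```

Note this is the AdWords landing-page rule, not the organic ranking signal; for organic, the Site Performance data discussed earlier remains the better guide.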

Speed Resources: