
It's becoming increasingly clear that ranking reports are no longer reliable. Users are noticing personalized SERPs and catching on to obvious inaccuracies generated by traditional ranking report software. These inaccuracies are caused by differences in query IP, query data, account status, web history, personalized settings, social graph and other factors. As a result, there is a growing shift away from rank report software toward analytics for accurate SEO measurement.

Prior to personalized search results, SEO relied heavily on ranking reports to measure SEO campaign performance. SEOs create "ranking reports" with software that submits automated queries directly to search engines, a.k.a. "scrapes search engine results." Despite the fact that automated queries are against Google Webmaster Guidelines, waste energy and cost Google millions of dollars each year to process, scraping search engine results is still a popular practice. Obviously it's in the engines' best interest to take steps to prevent these queries.

Analytics software, on the other hand, works independently of search engines. It relies heavily on code embedded within pages as well as human interpretation of data. Until recently, analytics software was used only to "tell a story," not for the precise measurement SEO requires. Site analysis focuses on trending and establishing a "comfort level" with data the analytics specialist deems "good enough." Analytics platforms are designed for anyone to use, specialist and non-specialist alike. In many cases, analytics specialists themselves have little analytics experience, expertise, knowledge about how search engines work or understanding of searcher intent. How can we expect anything different, when the WAA itself still doesn't teach things like transactional queries?

"To optimize scent trails, make sure that when the intent is transparent, the scent trail on any chosen term matches that intent. It doesn't matter if the trail starts with PPC (pay-per-click) or organic search. Prospects usually hope to find one of two things: the answer they seek or a link that takes them to the answer."

- The Web Analytics Association "Knowledge Required for Certification" (also available in non-www version)

Analytics tracking code is usually implemented by URL without consideration for user path, intent, source or origination. In most cases the implementation is performed by someone other than the analytics specialist interpreting the data. According to some estimates, as many as 45% of pages implemented with Google Analytics contain errors. Conversions from organic SERPs are the most difficult to track back to the original referrer. To compound the problem, site issues often prevent even flawless analytics implementations from reporting. Analytics failures are costly and often go unnoticed because NOTHING is in place to report when analytics doesn't report.

Quick examples & thoughts:
- Even if Avinash himself implements Omniture and Google Analytics tracking code on every page of your site, users entering from SERPs via 301 or 302 redirect won’t be attributed as “Organic.” According to Google, "If your site uses redirects, the redirecting page becomes the landing page's referrer. For example, if a user searches for a specific keyword, such as 'used books' on Google, there will not be any referring data showing that this keyword was searched on Google. Instead, the referral will be shown as the redirecting page."

- High-traffic pages that convert well, or blank pages that send users to a competitor? Either way, nobody will ever know, because these pages lack analytics tracking code. URL naming conventions for most sites follow a specific pattern. Use the site: operator to quickly spot-check for URLs that seem out of the ordinary, and confirm they include analytics tracking code and aren't redirected. It's pretty common to find legacy pages from older versions of sites still indexed.
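The spot-check described above can be partially automated. Here's a minimal sketch: `audit_page` and the `TRACKING_MARKERS` strings are hypothetical names I'm using for illustration, and the markers assume classic Google Analytics snippets (urchin.js / ga.js); adjust for whatever platform the site actually uses. Feed it the status code and body from any HTTP client of your choice.

```python
# Hypothetical helper: flag pages that redirect or lack analytics tracking code.
# Marker strings assume classic Google Analytics snippets; adjust as needed.
TRACKING_MARKERS = ("urchin.js", "ga.js", "_gaq.push", "urchinTracker")

def audit_page(status_code: int, html: str) -> list:
    """Return a list of problems found for one fetched URL."""
    problems = []
    if status_code in (301, 302):
        # Redirecting pages become the referrer, so "Organic" attribution is lost.
        problems.append("redirects (%d); referrer data may be lost" % status_code)
    elif not any(marker in html for marker in TRACKING_MARKERS):
        problems.append("no analytics tracking code found")
    return problems

# A legacy page with no tracking snippet gets flagged:
print(audit_page(200, "<html><head></head><body>legacy page</body></html>"))
```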

SEO Analytics

- If users are quick evaluators, analytics tracking code might not execute before a new page loads, and the SEO conversion might be credited somewhere else. Analytics won't measure landing page load time even though it's a highly important metric for users. Flash or otherwise, pages like these always have issues when it comes to tracking organic conversions.


- If your site goes down chances are you'll never know because analytics reporting goes down as well. Using a website monitoring service is always a good idea, just to be sure that conversions really are down and not your entire site.
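A monitoring check along those lines is simple to sketch. This is illustrative only: `site_is_up` is a made-up name, and the fetcher is injectable purely so the logic can run without a network; in real use you'd pass something like `lambda url: urllib.request.urlopen(url).status`.

```python
# Minimal sketch of an independent uptime check, so a site outage isn't
# mistaken for a drop in conversions. `fetch_status` is injected so this
# can be demonstrated offline; swap in a real HTTP call in production.

def site_is_up(url, fetch_status):
    """Return True if the URL answers with an HTTP 2xx status."""
    try:
        return 200 <= fetch_status(url) < 300
    except OSError:  # DNS failure, refused connection, timeout, etc.
        return False

# Simulated healthy response:
print(site_is_up("http://example.com/", lambda url: 200))  # True
```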

Takeaways, until SEO expectations are more clear to the analytics community, SEOs should insist on performing SEO analytics audits as usual. When hiring analytics specialists, look for applicants who are willing to address websites from the user perspective and outside of analytics. Folks willing to question data accuracy and those able to identify analytics obstacles are highly desired. Key being, SEO is as concerned with what analytics is tracking as it is about what analytics should be tracking.

As you know, Google recently caused havoc for some analytics platforms by including #anchors (a.k.a. pound signs, fragment identifiers) in SERP URLs. Just a quick post today to mention they're back! I noticed #anchors in my SERPs this morning and have been seeing them on and off all day.
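The reason #anchors cause havoc is that fragments are resolved client-side: browsers strip everything after the # before sending the HTTP request, so server logs and server-side analytics never see it. A quick demonstration with Python's standard URL parser (the example URL is illustrative, not an actual Google SERP URL):

```python
from urllib.parse import urlsplit

# The fragment (#anchor) never reaches the server: browsers strip it
# before sending the request, so only client-side JavaScript can read it.
url = "http://www.google.com/search?q=seo#sclient=psy"
parts = urlsplit(url)

request_line = parts.path + "?" + parts.query  # what the server actually sees
print(request_line)    # /search?q=seo
print(parts.fragment)  # sclient=psy
```

Anything a search engine moves into the fragment is therefore invisible to log-based tracking unless JavaScript on the page reports it.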

Update - Matt Cutts confirmed this issue last week, but I hadn't seen any hints of the new testing until now. Hat tip to Barry for the 411! :) Thanks to Ionut Alex Chitu for emailing to point out that Google's new Wonder Wheel feature, launched earlier this week, always uses AJAX URLs with #anchors. It will be interesting to see what impact this change has on various tools; something to be aware of for sure....

A friend of mine recently emailed to ask how TinyURL impacts SEO. It's a good question, and one many folks can't answer, so I thought I'd blog my answer to his question!

For anyone not familiar with TinyURL, in layman's terms it's a tool where users enter long URLs to get a shortened version. TinyURLs are often used where long URLs might wrap and therefore break, such as in email or social media web applications like Twitter. In more technical terms, TinyURLs are short, dynamically created URLs that redirect users to the intended URL via 301 redirect. Because TinyURLs "301," or permanently redirect, search engines should not index the TinyURL itself but instead should index, and pass PageRank to, the actual URL.
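Here's a toy model of how that 301 resolution works. The redirect map and URLs below are made up for illustration; a real check would issue an HTTP HEAD request and read the status code and Location header. The point is that engines follow the permanent redirect and index (and credit PageRank to) the final URL, not the short one.

```python
# Toy model of TinyURL-style 301 resolution. REDIRECTS is fabricated data
# standing in for real HTTP responses (status code + Location header).
REDIRECTS = {
    "http://tinyurl.com/abc123": (301, "http://example.com/long-article-url"),
}

def resolve(url, redirects=REDIRECTS, max_hops=10):
    """Follow permanent redirects until a final URL is reached."""
    for _ in range(max_hops):
        hop = redirects.get(url)
        if hop is None or hop[0] != 301:
            return url  # final destination: this is what gets indexed
        url = hop[1]
    raise RuntimeError("too many redirects; possible loop")

print(resolve("http://tinyurl.com/abc123"))  # http://example.com/long-article-url
```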

It is important to note that using TinyURLs to pass PageRank to paid links is a violation of Google Webmaster Guidelines, and that sites like Twitter use nofollow techniques to prevent spam.

On their own, TinyURLs can be search engine friendly from a technical perspective. At the same time, I wouldn't suggest replacing your site's navigation with TinyURLs and would point out that tracking TinyURLs via analytics might be difficult.

In case you missed it, I recently posted that Google's "Average Number of Words Per Query have Increased!" In response, JohnW raised a very good question about how Google calculated this number and whether or not it's a rounded number.

Avinash Kaushik, Analytics Evangelist for Google and author of the book "Web Analytics: An Hour A Day," responded to my request for clarification on this topic via email earlier today. I asked Avinash how Google calculated this figure and whether Google rounded up, for example whether 3.?? was rounded to 4, or whether the actual value is equal to or greater than 4.

Here is what Avinash said:
"I believe that the right term is "average number of words in a query" are 4. I did not get enough clarity if it is 3.6 or 4.2. But none the less it was a movement from 3 to 4."

So in answer to JohnW's question, it does seem likely that this number is rounded. Either way, the number is officially 4!
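Avinash's answer illustrates exactly why JohnW's question matters: both of the values he mentions round to the same whole number, so "4" alone can't distinguish them.

```python
# Both candidate averages from Avinash's reply round to 4, so a reported
# "4 words per query" is compatible with either underlying value.
candidates = [3.6, 4.2]
print([round(x) for x in candidates])  # [4, 4]
```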

Thanks Avinash & JohnW!