
Why don't analytics PageSpeed scores match the PageSpeed tool?

According to Google Analytics, the PageSpeed score for the page above is 88/100, but in reality the PageSpeed score for this page is 64/100 for mobile users and 77/100 for desktop users. Don't be a sucker for analytics data dung! The only source of accurate, up-to-date PageSpeed data is Google's NEW AND IMPROVED PageSpeed Insights tool. PageSpeed data from analytics and other sources is not always accurate or up to date.

How fast should pages load?

As Matt Cutts recently pointed out, websites perform differently in different parts of the world. Ideally, pages should load faster than the median load time in the country or region they target.

Where do I find the median page load time for my country or region of the world?

2013 Median Page Load Times: North America
- US 2.4 seconds desktop / 2.6 seconds mobile
- Canada 2.4 seconds desktop / 3.6 seconds mobile
- Mexico 3.8 seconds desktop / 4.5 seconds mobile
- Cuba 17.5 seconds desktop / 4.5 seconds mobile
- Bahamas 3.3 seconds desktop / 4.5 seconds mobile

2013 Median Page Load Times: Europe
- Czech Republic 1.6 seconds desktop / 3.4 seconds mobile
- Netherlands 1.8 seconds desktop / 3.1 seconds mobile
- Sweden 1.8 seconds desktop / 3.2 seconds mobile
- Russia 2.4 seconds desktop / 4.8 seconds mobile
- Germany 2.5 seconds desktop / 3.0 seconds mobile
- UK 2.5 seconds desktop / 3.6 seconds mobile
- Poland 2.7 seconds desktop / 4.7 seconds mobile
- Italy 3.3 seconds desktop / 5.0 seconds mobile
- Spain 3.2 seconds desktop / 5.3 seconds mobile

2013 Median Page Load Times: Asia
- South Korea 1.4 seconds desktop / 1.7 seconds mobile
- Japan 1.8 seconds desktop / 3.0 seconds mobile
- Russia 2.4 seconds desktop / 4.8 seconds mobile
- China 2.5 seconds desktop / 3.7 seconds mobile
- Viet Nam 2.5 seconds desktop / 4.5 seconds mobile
- Thailand 3.7 seconds desktop / 5.8 seconds mobile
- Indonesia 7.4 seconds desktop / 5.1 seconds mobile
- India 5.1 seconds desktop / 5.8 seconds mobile
- Saudi Arabia 4.0 seconds desktop / 6.7 seconds mobile
- Pakistan 6.4 seconds desktop / 8.0 seconds mobile
- Iraq 5.5 seconds desktop / 5.9 seconds mobile
- Iran 6.1 seconds desktop / 9.5 seconds mobile
- Syria 8.1 seconds desktop / 9.1 seconds mobile

2013 Median Page Load Times: South America
- Chile 4.0 seconds desktop / 5.5 seconds mobile
- Brazil 4.7 seconds desktop / 7.7 seconds mobile
- Peru 4.3 seconds desktop / 8.5 seconds mobile
- Argentina 5.3 seconds desktop / 7.3 seconds mobile

2013 Median Page Load Times: Australia
- Australia 3.5 seconds desktop / 4.4 seconds mobile

2013 Median Page Load Times: Africa
- Morocco 3.5 seconds desktop / 5.0 seconds mobile
- South Africa 4.8 seconds desktop / 5.3 seconds mobile
- Algeria 5.1 seconds desktop / 7.8 seconds mobile
- Egypt 5.9 seconds desktop / 7.7 seconds mobile
- Kenya 7.7 seconds desktop / 11.4 seconds mobile

The lists above provide Google's most recent median page load times as of 2013.

How do I compare pages with the same PageSpeed score to see which loads faster?

PageSpeed Insights is a great general-purpose litmus test for improving PageSpeed, but it only considers the network-independent aspects of page performance. To get down and dirty with network performance and other speed-related issues, you need to experience load times from the user's perspective. To actually time pages via different browsers from various locations, use tools like WebPageTest.org or Pingdom. For instance, let's compare real load times, PageSpeed and Speed Index numbers for pages with Google Analytics, an optimized version of Google Analytics and Google Tag Manager.

Load Time / Speed Index / PageSpeed
- empty page 0.461 seconds / 400 / 100/100
- custom analytics 0.619 seconds / 600 / 100/100
- standard analytics 0.808 seconds / 800 / 100/100
- tag manager 0.881 seconds / 900 / 100/100

Result: All of the URLs tested above have the same PageSpeed score of 100/100. However, from a user perspective the empty page has the best Speed Index score and loaded fastest.
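
If you want to gather similar numbers for your own pages, the browser's Navigation Timing API exposes the same kind of user-perceived timings that tools like WebPageTest report in more detail. Here is a minimal sketch in plain browser JavaScript; it assumes nothing beyond Navigation Timing support, which most modern browsers have:

    window.addEventListener('load', function () {
      // loadEventEnd is only populated after the load event finishes,
      // so read the timings on the next tick.
      setTimeout(function () {
        var t = window.performance.timing;
        var loadTime = t.loadEventEnd - t.navigationStart;  // full page load, in milliseconds
        var ttfb = t.responseStart - t.navigationStart;     // time to first byte, in milliseconds
        console.log('Page load: ' + loadTime + ' ms, TTFB: ' + ttfb + ' ms');
      }, 0);
    });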

Why are pages with asynchronous JavaScript slower than pages without JavaScript?

1. Not all browsers support the async attribute.

2. When asynchronous scripts arrive during page load, browsers have to stop rendering the page in order to parse and execute them.

Even small things, like white space and HTML comments, decrease page performance and increase load times. Scripts with the async attribute, such as social media buttons and analytics tracking code, still increase load times from the user's perspective. It is always best to avoid including any unnecessary code or scripts, even if they carry the async attribute. Asynchronous scripts still impact performance.
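
For reference, this is the kind of asynchronous loading pattern in question: a sketch of dynamic script injection with the async flag, similar in spirit to the asynchronous Google Analytics snippet. The script URL below is a placeholder, not a real tracker:

    (function () {
      var s = document.createElement('script');
      s.async = true;                            // download without blocking HTML parsing
      s.src = 'https://example.com/tracker.js';  // placeholder third-party script
      var first = document.getElementsByTagName('script')[0];
      // The browser still has to parse and execute the script when it arrives,
      // so asynchronous loading is not "free" from a performance standpoint.
      first.parentNode.insertBefore(s, first);
    })();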

 

Google Encrypted Search

Some marketers were caught by surprise in 2011 when Google rolled out secure, encrypted search results for users signed in to a Google account. By encrypting search queries and other Google traffic, intermediaries with network access can't intercept meaningful information in transit between users and Google. In addition to preventing third-party eavesdropping, encrypted search prevents a minor percentage of keyword-level data that was previously reported by analytics from being captured. That said, Google personalizes search results differently when users are signed in to a Google account, so keyword data for signed-in users should never have been reported by analytics as organic in the first place.

Contrary to statements from Google, however, some marketers are reporting double-digit losses of keyword data. If true, this could be a major problem for individuals and agencies with business models based on performance. Because I like to verify other people's claims for myself, and even help out when I can, I have been digging deep into sites reporting more than a 10% loss of keyword data since Google launched encrypted search. Over the past several months I have looked at various sites, ranging from leading SEO and analytics industry expert sites with millions of pages indexed in Google search results to a mom-and-pop site with 50 pages indexed. So far, I have found many sites with numerous pages lacking analytics tracking code and/or site issues which prevent tracking.

Findings like these are no real surprise. According to Google Analytics, on average 37% of traffic is "direct" traffic. "Direct" is actually a bit of a misnomer; in many cases it is where you will find traffic from other sources patiently waiting to be discovered. Some analytics companies provide tools that crawl sites to check pages for analytics tracking code. These tools are not effective for ensuring that pages indexed by search engines contain analytics tracking code. Unlike analytics crawling tools, search engines crawl hundreds of billions of pages and often find links to pages outside of the current site architecture. Sites with indexed pages that have no analytics tracking code are not measuring organic search performance. Currently there is no tool to check URLs indexed in Google search results for analytics tracking code.

Marketers who are trying to make up for losses in keyword conversion data due to Google encrypted search should focus on decreasing direct traffic to 25% or less. They should also focus on ensuring that all pages indexed in search results are tracked and reported by analytics. To ensure that pages are tracked properly, webmasters should cross-reference URLs tracked by analytics with URLs reported in Google Webmaster Tools. People can say what they want, but Google Webmaster Tools provides all of the searches for 96% of websites. Each site is different, but mathematically, for a lot of sites it is possible to decrease the percentage of keyword data lost to Google encrypted search results by increasing the overall number of pages tracked. Either way, this is a short-term technique to make up for keyword data that is currently being lost. Those impacted by this loss of keyword data should adjust and prepare for further losses in the future.
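
As a starting point for that cross-reference, here is a minimal sketch in Node.js. It assumes you have exported two plain-text lists of URLs, one from Google Webmaster Tools and one from analytics; the file names are placeholders:

    var fs = require('fs');

    function readUrls(path) {
      // One URL per line; blank lines are ignored.
      return new Set(
        fs.readFileSync(path, 'utf8')
          .split('\n')
          .map(function (line) { return line.trim(); })
          .filter(Boolean)
      );
    }

    var indexed = readUrls('webmaster-tools-urls.txt'); // URLs Google reports (placeholder file)
    var tracked = readUrls('analytics-urls.txt');       // URLs analytics reports (placeholder file)

    // URLs Google knows about that analytics has never seen --
    // candidates for missing or broken tracking code.
    indexed.forEach(function (url) {
      if (!tracked.has(url)) {
        console.log('Untracked: ' + url);
      }
    });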

With the "year of mobile" finally behind us, you can expect 2012 to be the "year of encryption". Google voice search is part of the rationale behind Google defaulting signed in users to encrypted search. Unlike desktop computers and Chromebooks however, mobile users can be signed in to services like GMail via GMail's mobile application for example, without being signed in to Google search on their mobile browser. While not enabled by default, the mobile version of encrypted search is already available by accessing https://encrypted.google.com on a mobile device. Users with security concerns can already set Google encrypted search as their default mobile browser homepage via browser > settings > homepage. Newer Google Motorola Android devices like the Android Bionic by Motorola, actually allow users to encrypt the entire device including all contents. Defaulting signed in users to encrypted search across the board, during the busy holiday shopping season could be highly problematic for a number of reasons. Don't expect to see encryption for signed in users fully rolled out to all devices in 2011 but expect it in 2012.

Over the holidays, Google rolled out a pretty major update to Webmaster Tools. This latest update provides much more detail in terms of data and reporting. So much, in fact, that some folks now seem confused about the difference between Google Webmaster Tools and Google Analytics. The big difference for SEO is that Google Webmaster Tools shows Google's own data for URLs in Google SERPs and doesn't track web pages the way Google Analytics does. In addition to that key difference in reporting, Google Webmaster Tools requires no installation. While it's difficult to say for sure, this update should force folks to abandon the ignorance-is-bliss mentality when it comes to analytics reporting once and for all.

BUT before diving in, here is a little background...

In 2005, with a little help from a Googler named Vanessa Fox, Google launched Google Sitemaps. That program has since evolved into what we know as Google Webmaster Central. Around the same time, Google bought Urchin and shortly after made Google Analytics free to everyone. Back then, small to medium-sized sites that couldn't afford enterprise analytics relied primarily on ranking reports to measure search visibility.

Ranking reports are created with software that emulates users and sends automated queries to search engines. The software then records data about positioning within organic search results by specific keywords and URLs. Ranking reports don't provide bounce rates, but they do provide an important metric for measuring SEO ROI directly from Google SERPs. That being said, automated queries from ranking software are expensive for search engines to process, and as a result they are a direct violation of search engine guidelines.

In 2010 Google introduced personalization features in organic search engine results. These personalized results are based on the user's history, previous query, IP address and other factors determined by Google. Over the past two years, Google's personalized search results have rendered ranking reporting software nearly useless.

Enter Analytics… Without accurate ranking reports, analytics may seem like a decent alternative tool for measuring SEO ROI by URL, but is that really the case? If analytics were enough, why did Google recently update Google Webmaster Tools? These are a couple of the questions that I hope to answer.

To start off, let's establish a few laws of the land...

Google Webmaster Tools Update Case Study: Redirects

Experiment: To compare 301 and 302 reporting accuracy between Google Analytics and Google Webmaster Tools

Hypothesis: Google Analytics incorrectly attributes traffic when 301 and/or 302 redirects are present.

Background: Google ranks pages by URL, so accurate reporting by specific URL is critical. In order for Google Analytics to record activity, a page must load and the Google Analytics JavaScript must execute. Google Analytics reports are based on pages, not URLs. While most "pages" have URLs, not all URLs result in page views. This is the case when 301 and/or 302 server-side redirected URLs appear in search results.

Procedure: For this comparison, I created apples.html and allowed it to be indexed by Google. I then created oranges.html and included a noindex meta tag to prevent indexing until the appropriate time. After apples.html was ranking in Google's SERPs, it was 301 redirected to oranges.html and the results were recorded.
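
As a quick sanity check before comparing reports, you can confirm the redirect's status code and target without following it. This is a minimal sketch assuming Node.js 18+ (where fetch is available globally); the domain is a placeholder:

    async function checkRedirect(url) {
      // redirect: 'manual' returns the redirect response itself instead of following it.
      var res = await fetch(url, { redirect: 'manual' });
      console.log(url + ' -> ' + res.status + ' ' + res.headers.get('location'));
    }

    checkRedirect('https://example.com/apples.html'); // placeholder domain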

Result: According to Google Analytics, oranges.html is driving traffic from Google SERPs via "apples"-related keywords. Google Webmaster Tools, on the other hand, reports each URL individually by keyword and notes the 301 redirect.

Conclusion: Google Analytics reports that oranges.html is indexed by Google and ranks in Google SERPs for apples.html-related keywords. However, reporting that data to clients would be a lie. Oranges.html hasn't been crawled by Google and isn't actually indexed in Google SERPs. Secondly, until Google crawls and indexes the URL oranges.html, it is impossible to determine how, or if, it will rank in Google search results. In addition, this data becomes part of the historical record for both URLs and is calculated into bounce rates for URLs not shown in SERPs.

(Google's Caffeine has improved the situation for 301 redirects, as the time between discovery and indexing is reduced.)

Google Webmaster Tools Update Case Study: Complex redirects

Experiment: To compare differences in tracking when multiple redirects from SERPs end on off-site pages.

Hypothesis: Multiple redirects ending off-site are invisible to Google Analytics because there is no page load.

Background: Google ranks pages by URL, so accurate reporting by URL is critical. In order for Google Analytics to record activity, a page must load and the Google Analytics JavaScript must execute. While most "pages" have URLs, not all URLs render pages. In most cases 301 issues are resolved by engines over time; however, 302 issues will remain. The same is true for multiple redirects ending off-site.

(For those who aren't aware, this is one way spammers try to trick Google into "crediting" their site with hundreds, thousands, and sometimes even hundreds of thousands of content pages that actually belong to someone else.)

Procedure: To test how Google Analytics handles multiple redirects, I created page1.html, which 302 redirects to page2.html, which in turn 301 redirects to another-domain.com. Google indexes the content from another-domain.com but SERPs show it as residing at the URL page2.html.
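
To make a chain like this visible, you can follow the Location headers hop by hop rather than letting the client follow redirects silently. A minimal sketch assuming Node.js 18+ (global fetch); the starting URL is a placeholder:

    async function walkChain(url, maxHops) {
      maxHops = maxHops || 10; // guard against redirect loops
      for (var hop = 0; hop < maxHops; hop++) {
        var res = await fetch(url, { redirect: 'manual' });
        console.log(res.status + ' ' + url);
        var next = res.headers.get('location');
        if (!next) return;             // no Location header: the chain has ended
        url = new URL(next, url).href; // resolve relative redirect targets
      }
    }

    walkChain('https://example.com/page1.html'); // placeholder starting URL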

Result: Despite being ranked in SERPs, Google Analytics has no data for these URLs. Google Webmaster Tools reports the first two URLs and notes the redirects.

Conclusion: Google Webmaster Tools recognizes the existence of the URLs in question, while Google Analytics doesn't at all, and that is a major problem. For SEO reporting these URLs are critical; the content is real and it's impacting users as well as Google.

Google Webmaster Tools Update Case Study: Installation

Experiment: To compare tracking without Google Analytics tracking code installed.

Hypothesis: Google Analytics won't track pages if the tracking code is not installed properly on each page, even within a site architecture that supports analytics.

Background: In order for Google Analytics to record data, it must be implemented correctly in each page and be able to communicate with Google. Legacy pages without the Google Analytics tracking code often rank in SERPs but go unnoticed because they're invisible to analytics. In addition to this issue, there are various other situations where untracked content appears in Google's index. Even when implemented properly, analytics tools are often prevented from reporting due to architectural problems.

Procedure: To test how Google Analytics works without proper installation, I set up an account but DID NOT implement the Google Analytics tracking code snippet on any pages.

Result: Google Analytics reports that there has been no traffic and that the site has no pages, but Google Webmaster Tools reports, as usual, impressions by keyword and URL, CTR and other metrics.

Conclusion: In order to function properly, Google Analytics must be implemented in each and every page and be supported by the site architecture. Google Analytics requires extensive implementation in many cases, which is an extra obstacle for SEO. Google Webmaster Tools data comes directly from Google, requires no implementation, and verification is easy.
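
One practical way to catch indexed pages that are missing the snippet is to fetch each indexed URL (for example, from a Webmaster Tools export) and search the HTML for the tracking code. A minimal sketch assuming Node.js 18+; the file name and the marker strings are assumptions to adapt to whatever tracking code your pages actually use:

    var fs = require('fs');

    var urls = fs.readFileSync('indexed-urls.txt', 'utf8') // placeholder export, one URL per line
      .split('\n')
      .map(function (line) { return line.trim(); })
      .filter(Boolean);

    // Strings that indicate the tracking code is present -- adjust to your own snippet.
    var markers = ['google-analytics.com/ga.js', '_gaq.push'];

    (async function () {
      for (var i = 0; i < urls.length; i++) {
        var url = urls[i];
        try {
          var html = await (await fetch(url)).text();
          var found = markers.some(function (m) { return html.indexOf(m) !== -1; });
          if (!found) console.log('No tracking code found: ' + url);
        } catch (err) {
          console.log('Could not fetch ' + url + ': ' + err.message);
        }
      }
    })();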

Google Webmaster Tools Update Case Study: Site Reliability

Experiment: To see how Google Analytics tracks pages when a website goes offline.

Hypothesis: Google Analytics will not track site outages.

Background: In order for Google Analytics to record data, it must be properly implemented, supported by the site's architecture and able to communicate back and forth with Google.

Procedure: To test how Google Analytics reports when a site goes offline, I turned off a website with Google Analytics installed.

Result: Google Analytics reports no visitors or other metrics but suggests nothing about the real cause. Google Webmaster Tools reports errors suggesting the site was down.

Conclusion: Google Analytics does not report site outages or outage error URLs whereas Google Webmaster Tools does. For SEO, site uptime is critical.
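
If analytics is your only monitoring, outages look identical to quiet periods. A tiny availability check run on a schedule (cron, for example) records status codes independently of analytics, so downtime shows up somewhere. A minimal sketch assuming Node.js 18+; the URL is a placeholder:

    async function checkUptime(url) {
      var started = Date.now();
      try {
        var res = await fetch(url, { redirect: 'manual' });
        console.log(new Date().toISOString() + ' ' + res.status + ' ' + (Date.now() - started) + ' ms');
      } catch (err) {
        // Network errors (DNS failure, refused connection, timeout) mean the site is unreachable.
        console.log(new Date().toISOString() + ' DOWN ' + err.message);
      }
    }

    checkUptime('https://example.com/'); // placeholder URL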

Final thoughts...

As illustrated above, analytics will report keywords for URLs that aren't indexed and won't report keywords for URLs that are indexed in SERPs. Analytics is unaware of redirected URLs, even those indexed by Google and seen by users worldwide. Analytics can't tell the difference between a brief lack of visitors and a period of site downtime. It's possible for analytics tracking code to fire without pages loading, and for pages to load without firing tracking code. Analytics doesn't know framed content is indexed, or about legacy pages without tracking, alternative text versions of Flash pages, how long pages take to load, and on, and on, and on....

In fairness, the tool is doing what it is designed to do; folks using it just don't understand the limitations. Oftentimes, they aren't aware data is fragmented and/or missing, or that site architecture impacts reporting ability. Checking Google to see if SERPs jibe with reports never occurs to them for some reason.

I've been kvetching about these issues for years, to anyone and everyone who would listen. If you can't tell, few things F R U S T R A T E me more.

The case studies above represent just a few ways in which analytics data is skewed due to bad and/or missing data. Believe it or not, a substantial amount of analytics data is bogus. According to one Google Analytics Enterprise partner, 44% of pages with analytics have analytics errors. On average, analytics only tracks about 75% of traffic. Analytics is a weird beast: when something goes wrong, nothing happens in analytics, and sometimes it happens on invisible pages. :)

Bad data attacks like a virus from various sources, polluting reporting exponentially, silently, undetected and over time. Sadly, very few folks, including most "analytics experts", have the experience or expertise to track down issues like these by hand. Until now there has been no tool to report on analytics not reporting. The recent Google Webmaster Tools update empowers webmasters by providing them with the best data available. This update exposes analytics issues. It also places the burden of proving data measurement accuracy back on the folks responsible for it.

Oh yeah, HAPPY NEW YEAR!