
Hacking is on the rise! It seems like there are a lot of newly hacked pages unintentionally cloaking. These pages contain keyword-rich links and redirect users to other websites. The sites being impacted aren't the typical mom-and-pop sites either. This week I've personally reached out to one major university, an international religious organization and an NFL franchise to let them know their sites were hacked, cloaking links and redirecting users. First, though, I reported each site behaving badly via Google's new spam report tool.

Dr Evil

Dr. Evil courtesy of http://www.stefanosalvadori.it

Google does a great job of trying to notify webmasters when it thinks a site is compromised, but documentation about cloaking is sparse. Sure, I get it: Google doesn't want to give bad guys detailed instructions for doing bad things. I've intentionally left some facets out of this post for exactly that reason. My point, however, is that better information could help empower good guys who are unknowingly the victims of evil. Some of the webmasters I talk with can't identify the issue even after it is laid out in front of them. (For those who aren't aware, cloaking is against Google's webmaster guidelines and can cause your site to be banned. So please don't try this at home!)

According to Google, cloaking is "presenting different content or URLs to users and search engines." "Presenting" different content isn't, by itself, the key takeaway, though. For example, it is not considered cloaking if you "present" an image containing human-readable text (jpg, gif, png) that differs from its ALT attribute. Flash, on the other hand, is different because "alternative content" is in most cases served up from the server. Cloaking is specifically when a site returns different content from the server.

Cloaking comes in several different flavors, but in order to identify any of them you'll need to understand each type as well as how they all work together. You won't find identifiable traces of cloaking via the free version of Google Analytics or other paid programs. Google's documentation talks about "serving up different results based on user agent," but doesn't really go into detail about the other types of cloaking: by referrer and by IP. Believe it or not, it's actually possible to cloak by user-agent, referrer, IP or any combination of the three.

Here are a few simple ways to determine whether or not a site is cloaking:

Fetch as Googlebot - If you think your site may be hacked and is currently cloaking, test it with "Fetch as Googlebot" in Google's Webmaster Tools. Fetch as Googlebot uses the Googlebot user-agent and comes from Google's IP range, so the information it provides is as close as you'll get to what Google actually "sees." Compare the page source code from "Fetch as Googlebot" with the source code of your live page and be sure they match.

Google's snippet - If you think a page is cloaking and you don't have access to Fetch as Googlebot in Google's Webmaster Tools, go to Google's preferences, turn off Google Instant and increase your number of results to 100. Then perform an advanced operator query for the site. Browse through several pages of search results looking for anything odd, especially in Google's snippet of the page TITLE element. If you don't see anything odd in the search results, click a few links to be sure you end up on the correct page.

Google's cache and cached text versions - After checking Google's snippets for anything out of the ordinary and clicking links to be sure the destination URLs are correct, check out Google's cached copy and cached text version of your pages. Look for links, JavaScript, rel=canonical attributes or anything else that doesn't belong, especially near the bottom of the page in the cached text version. Also, be sure Google has cached the correct URL in its cached page.

User-Agent Switcher - This Firefox add-on allows you to change your browser's user-agent to Googlebot. While browsing with the Googlebot user-agent, compare the source code of your pages against a normal fetch and look for any differences. Important note: User-Agent Switcher changes your user agent but doesn't give you access via a Google IP like Fetch as Googlebot does. If a site is cloaking by both user-agent and IP, you may not be able to see it using this add-on.
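The same comparison can be scripted. Here's a minimal sketch (my own, not from any Google tool) that fetches a page twice, once with a normal browser user-agent and once claiming to be Googlebot, then diffs the two responses; any difference is worth investigating. Keep in mind that, like User-Agent Switcher, this still comes from your own IP, so IP-based cloaking will remain invisible to it.

```python
import difflib
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1; rv:1.9.2) Firefox/3.6"

def fetch(url, user_agent):
    """Fetch a URL while sending a specific User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

def cloaking_diff(html_as_browser, html_as_googlebot):
    """Return the unified diff between two fetches of the same page.

    An empty list means the server returned identical markup to both
    user-agents; any '+'/'-' lines are candidates for cloaked content.
    """
    return list(difflib.unified_diff(
        html_as_browser.splitlines(),
        html_as_googlebot.splitlines(),
        fromfile="browser", tofile="googlebot", lineterm=""))

# Usage (against a site you control):
# diff = cloaking_diff(fetch(url, BROWSER_UA), fetch(url, GOOGLEBOT_UA))
```

Dynamic pages will produce some noise (timestamps, ad markup), so look specifically for injected links or redirects rather than treating any diff as proof of cloaking.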

There has been lots of talk about the 8GB card in HTC's new EVO (two different cards were provided to Google I/O attendees). It will be interesting to see how the HTC EVO is received Friday when the device finally hits store shelves. I've been using my EVO for a few weeks now and really like it. I've tried to replicate the SD card issue mentioned by others, but so far haven't been able to on my EVO no matter what I try. Maybe it's Windows-related, I'm not sure, but either way I wouldn't be too concerned ;) .


The EVO is the first 4G mobile device and sports a supersonic 1GHz Snapdragon processor. It's so fast it needs a kickstand, and has one built into the back. While the EVO is a phone and not a tablet (though nobody can provide a clear definition of either), it's pretty large; its battery, in fact, is the largest I've seen in an HTC yet. I like the hotspot feature and look forward to having Android 2.2 up and running on my EVO. My favorite feature is the EVO's camera setup, which can take photos from either side of the phone. If you need a new phone, check out the EVO.

HTC EVO Back Cover Removed

It's becoming more and more clear that ranking reports are no longer reliable. Users are noticing personalized SERPs more and more, and they're catching on to obvious inaccuracies generated by traditional ranking-report software. These inaccuracies are caused by differences in query IP, query data, account status, web history, personalized settings, social graph and other factors. As a result, there is a growing shift away from rank-report software toward analytics for accurate SEO measurement.

Prior to personalized search results, SEO relied heavily on ranking reports to measure SEO campaign performance. SEOs create "ranking reports" with software that submits automated queries directly to search engines, a.k.a. "scraping search engine results." Despite the fact that automated queries are against Google's Webmaster Guidelines, waste energy and cost Google millions of dollars each year to process, scraping search engine results is still a popular practice. Obviously it's in the engines' best interest to take steps to prevent these queries.

Analytics software, on the other hand, is different: it works independently of search engines. Analytics relies heavily on code embedded within pages as well as human interpretation of data. Until recently, analytics software has been used only to "tell a story," not for the precise measurement SEO requires. Site analysis focuses on trending and establishing a "comfort level" with data the analytics specialist determines to be "good enough." Analytics platforms are designed for anyone to use, specialist and non-specialist alike. In many cases, analytics specialists themselves have little analytics experience, expertise, knowledge of how search engines work or understanding of searcher intent. How can we expect anything different when the WAA itself still doesn't teach things like transactional queries?

"To optimize scent trails, make sure that when the intent is transparent, the scent trail on any chosen term matches that intent. It doesn't matter if the trail starts with PPC (pay-per-click) or organic search. Prospects usually hope to find one of two things: the answer they seek or a link that takes them to the answer."

- The Web Analytics Association, "Knowledge Required for Certification"

Analytics tracking code is usually implemented by URL without consideration for user path, intent, source or origination. In most cases the implementation is performed by someone other than the analytics specialist interpreting the data. According to some estimates, as many as 45% of pages implemented with Google Analytics contain errors. Conversions from organic SERPs are the most difficult to track back to the original referrer, and to compound that problem, site issues often prevent even flawless analytics implementations from reporting. Analytics failures are costly and often go unnoticed and undetected, because NOTHING is in place to report when analytics doesn't report.

Quick examples & thoughts:
- Even if Avinash himself implements Omniture and Google Analytics tracking code on every page of your site, users entering from SERPs via a 301 or 302 redirect won't be attributed as "Organic." According to Google, "If your site uses redirects, the redirecting page becomes the landing page's referrer. For example, if a user searches for a specific keyword, such as 'used books' on Google, there will not be any referring data showing that this keyword was searched on Google. Instead, the referral will be shown as the redirecting page."

- High-traffic major converters, or blank pages that send users to a competitor? Either way, nobody will ever know, because these pages lack analytics tracking code. URL naming conventions for most sites follow a specific pattern. Use the site: operator to quickly spot-check for URLs that seem out of the ordinary, to be certain they include analytics tracking code and aren't redirected. It's pretty common to find legacy pages from older versions of sites still indexed.
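That spot check can be partly automated. Below is a minimal sketch of the idea, not a definitive audit tool: the patterns match the ga.js/urchin.js-era Google Analytics snippets, so you'd adjust them for whatever analytics package your site actually uses.

```python
import re

# Markers for the classic urchin.js / ga.js Google Analytics snippets.
# These are examples; swap in patterns for your own analytics vendor.
GA_MARKERS = (
    re.compile(r"google-analytics\.com/(ga|urchin)\.js"),
    re.compile(r"_gaq\.push|urchinTracker|pageTracker\._trackPageview"),
)

def has_ga_snippet(html):
    """True if the page source appears to contain a GA tracking snippet."""
    return any(pattern.search(html) for pattern in GA_MARKERS)

def pages_missing_tracking(pages):
    """Given {url: html}, return the URLs whose source lacks tracking code."""
    return sorted(url for url, html in pages.items()
                  if not has_ga_snippet(html))
```

Feed it the source of every indexed URL (from a crawl or your CMS export) and anything it returns is a page converting, or leaking users, with no analytics record at all.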

SEO Analytics

- If these folks are quick evaluators, the analytics tracking code might not execute before a new page loads, and this SEO conversion might be credited somewhere else. Analytics also won't measure landing page load time, even though it's a highly important metric for users. Flash or otherwise, pages like these always have issues when it comes to tracking organic conversions.

SEO Analytics

- If your site goes down, chances are you'll never know, because analytics reporting goes down with it. Using a website monitoring service is always a good idea, just to be sure it's really conversions that are down and not your entire site.
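If a third-party monitoring service feels like overkill, even a cron-driven check like this sketch (hypothetical, not from any monitoring product) will catch the total outages analytics can never report on:

```python
import urllib.request
import urllib.error

def status_ok(code):
    """Treat any 2xx HTTP status as 'up'."""
    return 200 <= code < 300

def site_is_up(url, timeout=10):
    """Return True if the site answers with a 2xx status within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return status_ok(resp.getcode())
    except (urllib.error.URLError, OSError):
        # DNS failure, refused connection, timeout: the site is effectively down.
        return False

# Run from cron every few minutes and alert however you like:
# if not site_is_up("http://www.example.com/"): send_alert()
```

The point isn't the specific script; it's that something outside both your site and your analytics stack has to be watching, because analytics can't report its own absence.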

Takeaways: until SEO expectations are clearer to the analytics community, SEOs should insist on performing SEO analytics audits as usual. When hiring analytics specialists, look for applicants who are willing to approach websites from the user's perspective and from outside of analytics. Folks willing to question data accuracy, and those able to identify analytics obstacles, are highly desirable. The key being that SEO is as concerned with what analytics is tracking as with what analytics should be tracking.