News

The screenshots below are from a video produced by Yelp.com and TripAdvisor.com. They show what Google local search results would look like if powered by local review sites. According to the study published at FocusOnTheUser.eu (FOTU), users prefer local search results powered by local review sites by 23%. What do you think?

Here is an example of FOTU's proposed local results for the query [hotel bilbao spain].
Yelp TripAdvisor EU proposed results

As you can see, local results powered by local review sites lack important details local searchers want. Because these results are produced by general relevancy algorithms instead of localized algorithms, the wrong kinds of businesses, and businesses in the wrong locations, will appear frequently.

Here is another example from the FOTU website. In this case, the screenshot shows Google local results for [pediatrician nyc] powered by Zocdoc.com.
ZocDoc FocusOnTheUser.eu results

Clearly the results in the screenshots above are not better for local searchers than Google local search results. How did FocusOnTheUser.eu reach the opposite conclusion?

The study is based on results from previous studies. One study was debunked earlier this year by SearchEngineLand.com's Founding Editor Danny Sullivan. The other study found no significant increases when local results were set up like the ones in the FOTU video.

Usability experts and Google agree: when you know the answer to a user's query, it is best not to make users "click anywhere." That is the idea behind Google "instant" search results and why Google does not make users click local results when it "knows" what the user wants. When users mouse over local search results at Google, the local information users want is displayed on the right side of the page.

Google Local SERPS

It is impossible to compare Google local results without testing page functionality. FOTU did not address, test or measure Google local page functionality. The site claims to "preserve" a way for users to find information without having to click, but provides no guidance for finding missing details. Even after clicking the links from the video, I could not find phone numbers or other important details. Instead of testing how Google local search results actually work, FOTU tested something else.

Instead of measuring success with user-focused metrics, FOTU measured success as increased click-through rates. The tools they used will tell you what users did, but not what they intended to do. Users typically do not click away from information they intend to find, so clicks are not always the best measure of utility.

At 5:50 into the video, FocusOnTheUser.eu presents the results of its study. According to the video, the site proved local searchers prefer the proposed results over Google's by 23%. These results are based on a "significant increase" in "click engagement," but where?

A number of clicks went to links created by the tool used for the study. These links do not exist in real world local search results.

Some clicks went to a cafe in a city 18 miles away. Other clicks went to a cafe clearly marked as being CLOSED. Still other clicks went to a duplicate of the first listing with different ratings and reviews. When you add all of these clicks together and add clicks caused by missing map pins you come up with almost 23%.

According to the site, ratings and reviews are especially important to users. Ratings and reviews are important, but only when they are for things users want to find. FOTU claims that having reviews and ratings exclusively provided by Google raises critical questions. That said, none of the data provided appears to reflect any clicks for ratings or reviews. In fact, most if not all of the clicks shown went to local business websites.

FOTU claims "Google promotes search results drawn from Google+ ahead of the more relevant ones you would get from using Google's organic search algorithm." I will talk about that more in a minute, but the FOTU widget itself proves that Google does not promote Google+ ahead of other results. The FOTU widget does, however, exclude all sites other than Google and local review websites.

FOTU wants Google to use standard relevancy algorithms for localized search results, but Google local search results are based primarily on "relevance, distance and prominence," not just relevance. FOTU provides a widget to demonstrate the differences between Google's search algorithm and the Google Maps algorithm used for local search queries. I used the widget for the screenshots below. As shown below, using standard algorithms for localized search queries is not in the best interest of users.
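The difference between the two approaches can be illustrated with a toy scoring function. This is a hypothetical sketch only: the weights, the 50 km normalization, and the sample cafes are invented for illustration, and Google's actual local algorithm is not public.

```javascript
// Toy illustration of why a localized ranking (relevance + distance +
// prominence) differs from a pure-relevance ranking. All weights and
// data below are invented for this sketch.

function rankLocal(businesses) {
  return businesses
    .map((b) => ({
      ...b,
      // Blend relevance, distance, and prominence (all roughly 0..1).
      // Distance is inverted so closer businesses score higher.
      score:
        0.4 * b.relevance +
        0.3 * (1 - b.distanceKm / 50) +
        0.3 * b.prominence,
    }))
    .sort((a, b) => b.score - a.score);
}

function rankByRelevanceOnly(businesses) {
  return [...businesses].sort((a, b) => b.relevance - a.relevance);
}

const cafes = [
  { name: "Cafe Nearby", relevance: 0.7, distanceKm: 1, prominence: 0.6 },
  { name: "Cafe 18 Miles Away", relevance: 0.9, distanceKm: 29, prominence: 0.8 },
];

// Relevance-only ranking puts the distant cafe first...
console.log(rankByRelevanceOnly(cafes)[0].name); // "Cafe 18 Miles Away"
// ...while the localized blend prefers the nearby one.
console.log(rankLocal(cafes)[0].name); // "Cafe Nearby"
```

Any ranking that ignores distance entirely will happily serve the "cafe in a city 18 miles away" described above, which is exactly the failure mode the FOTU widget reproduces.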

FOTU results

Yelp.com, TripAdvisor.com and other local review websites are not local businesses and do not have local business addresses. For that reason these sites probably should not rank ahead of true local businesses. The FOTU widget essentially excludes all local business websites from appearing in local search results.

At the end of the day, these local sites want a "single conspicuous" link "directly to" their site from Google local results. It is important to remember that these are the same sites that blocked Google in the past and forced Google to invest billions of dollars in local search. Now they want Google to change things for them, even though nothing stops them from doing the same thing again.

Before governments force companies to change things based on allegations from potential competitors, I think it is important that an unbiased investigation be conducted.

Be sure to check out Steve Souders's latest blog post. In it, he stresses the importance of deferring JavaScript until after a page has rendered, and describes all the work that still needs to be done when it comes to high-performance JavaScript.

Google made "asynchronous" the marketing industry's buzzword for 2010 when it rolled out an asynchronous version of Google Analytics. Asynchronous scripts are still just scripts, after all, and not bulletproof. Asynchronous Google Analytics isn't an open license to do as you please. Over the past year, I've noticed a major increase in the number of "mavericky" asynchronous Google Analytics implementations. When implemented properly, Google Analytics is a great tool, but implementation is critical.

Simply adding an ASYNC attribute doesn't "make" a script asynchronous. The "_gaq" command queue is actually what makes the asynchronous Google Analytics syntax possible. Unfortunately, few browsers support the ASYNC attribute. Either way, ASYNC scripts are executed upon response arrival rather than deferred, which can result in blocking. DEFER attributes, on the other hand, can block the onload event and also decrease PageSpeed. Another point to consider when trying to get content in front of users more quickly: "If asynchronous scripts arrive while the page is loading, the browser has to stop rendering in order to parse and execute those scripts."
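The _gaq mechanism is worth seeing in miniature. The sketch below simulates it in plain JavaScript: before ga.js arrives, _gaq is just an array, so the page can queue commands immediately; the `simulateGaJsLoad` helper is my stand-in for what ga.js does when it finally loads (the tracking ID and event names are placeholders).

```javascript
// Minimal sketch of the _gaq command-queue pattern behind the
// asynchronous Google Analytics syntax. Before ga.js loads, _gaq is a
// plain array, so push() simply queues commands.
var _gaq = _gaq || [];
_gaq.push(["_setAccount", "UA-XXXXX-X"]); // placeholder tracking ID
_gaq.push(["_trackPageview"]);

// Stand-in for ga.js arriving: drain the queued commands, then swap in
// a push() that executes future commands immediately.
function simulateGaJsLoad(queue, execute) {
  queue.forEach(execute); // replay everything queued so far
  queue.push = function (cmd) {
    execute(cmd); // from now on, commands run right away
    return 0;
  };
}

const executed = [];
simulateGaJsLoad(_gaq, (cmd) => executed.push(cmd[0]));

_gaq.push(["_trackEvent", "video", "play"]); // runs immediately now
console.log(executed); // ["_setAccount", "_trackPageview", "_trackEvent"]
```

Because commands are buffered rather than executed, the page never blocks waiting for ga.js, which is the whole point of the asynchronous snippet.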

Bottom line: "mavericky" implementations can actually have a negative impact on user experience. Even worse, this data can be missing from both analytics and the Google Webmaster Tools site performance tab, depending on how onload event firing is impacted. Oh yeah, and don't forget rankings! Matt Cutts said Google Analytics doesn't impact rankings because, when properly implemented, it waits to load scripts until after the onload event, but that may not be the case when it is implemented improperly. Maile Ohye has confirmed that one of the ways Google calculates performance is via the onload event. According to Google, "To ensure the most streamlined operation of the asynchronous snippet with respect to other scripts," the asynchronous snippet should be placed either just before the close of the HEAD tag or just before the close of the BODY tag in your (X)HTML document. I'd suggest testing and not taking any chances this holiday season, because this year speed is more important than ever.

Adobe recently submitted a US patent application that relates to SEO for Flash / Flex, titled "EXPOSING RICH INTERNET APPLICATION CONTENT TO SEARCH ENGINES." Believe it or not, the application claims "shadowing" techniques like SWFObject and SOFA are at an "obvious disadvantage" for search. According to Adobe, shadowing textual content in rich Internet applications with textual content in (X)HTML results in duplication and other issues. For those not aware, duplicate content thins keyword relevancy and PageRank (Google's secret sauce), and requires a "duplication of effort" in producing "the actual rich Internet application as well as the shadow HTML." The application claims site management time is also increased because "changes in the rich Internet application must also be made to the shadow HTML, if that HTML code is to remain consistent with the rich Internet application."
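For readers who have not seen the "shadowing" technique Adobe is criticizing, here is a minimal sketch using SWFObject 2's dynamic publishing method. The file names and copy are placeholders; the point is that the markup inside the target div is the shadow content crawlers index, while SWFObject replaces it with the movie at runtime.

```html
<!-- Shadow HTML: indexed by crawlers and shown to users without Flash.
     Any copy change here must also be made inside movie.swf, which is
     the "duplication of effort" the patent application describes. -->
<div id="flashContent">
  <h1>Hotel Bilbao</h1>
  <p>Rooms, rates, and booking information...</p>
</div>
<script src="swfobject.js"></script>
<script>
  // SWFObject swaps the shadow HTML for the Flash movie at runtime.
  swfobject.embedSWF("movie.swf", "flashContent", "800", "600", "9.0.0");
</script>
```

Keeping the div's text in sync with the movie's text is exactly the maintenance burden Adobe's proposed "translation module" is meant to eliminate.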

To address these and other issues, Adobe's application proposes an invention that returns different content to users and search engines. According to the patent application, content will be "available through a rich Internet application to search engine queries" via a "translation module" that interfaces "between a Web crawler and a rich Internet application." It seems this application isn't intended to provide alternative textual "eye wash" for users, but instead descriptions of the state, content and identifying URLs that are "important to Web crawler and/or search engines." According to Adobe, the translation module "may comprise pseudo HTML page code providing a description of the state which omits description of aspects of the state which are not useful to Web crawler and/or search engine." According to the application, "cached pages" will reflect a poorly formatted and quite likely only partially human-readable page.