Google made "asynchronous" the marketing industry's buzzword for 2010 when it rolled out an asynchronous version of Google Analytics. Asynchronous scripts are still just scripts, though, and they aren't bulletproof; asynchronous Google Analytics isn't an open license to do as you please. Over the past year, I've noticed a major increase in the number of "mavericky" asynchronous Google Analytics implementations. When implemented properly, Google Analytics is a great tool, but implementation is critical.
Bottom line: "mavericky" implementations can actually have a negative impact on user experience. Even worse, depending on how onload event firing is affected, data can go missing from both analytics and the Google Webmaster Tools site performance tab. Oh yeah, and don't forget rankings! Matt Cutts has said Google Analytics doesn't impact rankings because, when properly implemented, it waits to load scripts until after the onload event, but that may not hold when it's improperly implemented. Maile Ohye has confirmed that one of the ways Google calculates performance is via the onload event. According to Google, "To ensure the most streamlined operation of the asynchronous snippet with respect to other scripts," the asynchronous snippet should be placed just before the close of either the HEAD tag or the BODY tag in your (X)HTML document. I'd suggest not taking any chances this holiday season; speed is more important than ever this year, so test your implementation.
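For reference, Google's asynchronous snippet from this period looked roughly like the sketch below, placed just before the closing HEAD (or BODY) tag as quoted above. The account ID shown is a placeholder, and the exact snippet may differ from what Google currently publishes, so copy from Google's own documentation rather than from here:

```html
<script type="text/javascript">
  // Command queue: calls pushed here are replayed once ga.js loads
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder property ID
  _gaq.push(['_trackPageview']);

  (function() {
    // Create the script element with the async flag set, so it
    // downloads without blocking rendering or the onload event
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol
        ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>
```

The key detail is that the snippet injects ga.js dynamically with `async = true` instead of using a blocking script tag, which is what keeps a slow analytics server from delaying your page.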
Adobe recently submitted a US patent application relating to SEO for Flash / Flex, titled "EXPOSING RICH INTERNET APPLICATION CONTENT TO SEARCH ENGINES." Believe it or not, the application claims "shadowing" techniques like SWFObject and SWFAddress are at an "obvious disadvantage" for search. According to Adobe, shadowing textual content in rich Internet applications with textual content in (X)HTML results in duplication and other issues. For those not aware, duplicate content thins keyword relevancy and PageRank, Google's secret sauce, and requires a "duplication of effort" in producing "the actual rich Internet application as well as the shadow HTML." The application claims site management time is also increased because "changes in the rich Internet application must also be made to the shadow HTML, if that HTML code is to remain consistent with the rich Internet application."
To address these and other issues, Adobe's application proposes an invention that returns different content to users and search engines. According to the patent application, content will be made "available through a rich Internet application to search engine queries" via a "translation module" that interfaces "between a Web crawler and a rich Internet application." It seems this application isn't intended to provide alternative textual "eye wash" for users, but instead descriptions of the state, content, and identifying URLs that are "important to Web crawler and/or search engines." According to Adobe, the "translation module may comprise pseudo HTML page code providing a description of the state which omits description of aspects of the state which are not useful to Web crawler and/or search engine." According to the application, "cached pages" will reflect a poorly formatted and likely only partially human-readable page.
There has been lots of talk about the 8GB card in HTC's new Google EVO (two different cards were provided to I/O attendees). It will be interesting to see how the HTC EVO is received Friday when the device finally hits store shelves. I've been using my EVO for a few weeks now and really like it. I've tried to replicate the SD card issue mentioned by others, but so far I haven't been able to on my EVO no matter what I try. Maybe it's Windows related, I'm not sure, but either way I wouldn't be too concerned.
EVO is the first 4G mobile device and sports the supersonic 1GHz Snapdragon processor. It's so fast it needs a kickstand, and it has one built into the back. While EVO is a phone and not a tablet (though nobody can provide a clear definition), it's pretty large. EVO's battery, in fact, is the largest I've seen in an HTC yet. I like the hotspot feature and look forward to having Android 2.2 up and running on my EVO. My favorite feature is EVO's camera setup: it can take photos from either side of the phone. If you need a new phone, check out EVO.
Google is back in Atlanta, GA making Street View images for Google Maps, but this time they brought in the big guns. The cars here today are equipped with GPS, high-resolution panoramic cameras, and multiple SICK sensors. These sensors collect LiDAR data that can be used for 3D imaging and visualizations like those seen in Radiohead's recent "House of Cards" music video. Google Earth and SketchUp, Google's 3D virtual building maker for Maps, also use this type of data.
Last week Google announced the release of a plugin that gives users access to Google Earth imagery via Maps. As a result, it's now possible to view 3D images in Google Maps. The problem here is fairly obvious: Google Earth's aerial imagery is taken from above, and so not from the same perspective as users interacting with the data. Not to worry though, the Street View team has been working on these kinds of problems for some time. When it comes to Navigation, Maps, or Street View, earthbound LiDAR-enhanced imagery processed via SketchUp seems like a perfect complement to Google's existing view from above. Combining high-resolution imagery taken from the user's perspective with advanced 3D image technology presents some new possibilities, to say the least. Factor in new releases like business ads in Maps now being available in 3D on your mobile device, and it's pretty clear how SketchUp will be monetized.
Matt Cutts provided some interesting details about where the industry is headed last week at PubCon.
During the "Interactive Site Review" session, Matt suggested investigating the history of each domain name you own or plan to purchase. He suggested avoiding domains with a shady history and dumping domains that appear to have been burned in the past. To investigate the history of a domain, Matt suggests Archive.org. Matt said that blocking Archive.org via robots.txt is a strong indicator of spam when spam is already suspected.
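For context, keeping a site out of Archive.org's Wayback Machine is typically done by blocking the Internet Archive's crawler in robots.txt. A minimal example, assuming the crawler still honors the ia_archiver user-agent token, would be:

```
# Hypothetical robots.txt entry blocking the Internet Archive's
# crawler (ia_archiver), which keeps pages out of the Wayback Machine
User-agent: ia_archiver
Disallow: /
```

This is exactly the kind of directive Matt is describing: legitimate sites rarely have a reason to hide their history, so seeing it on an already-suspect domain is a red flag.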
Matt mentioned speed several times. During the "Interactive Site Review," Matt said that webmasters need to pay more attention to speed. He pointed out that landing page load time factors into AdWords Quality Score and said speed will be a big trend in 2010. During Matt's "State of the Index" presentation, he pointed out Google's tools for measuring page speed and even mentioned webpagetest.org, a third-party tool. According to Matt, Google is considering factoring page load speed into rankings. Matt said that Larry Page wants pages on the web to "flip" for users. He illustrated this point with Google Reader's reduction of page weight from 2MB to 185KB. Nothing official yet, but something to keep an eye on for sure!
During Q&A for "The Search Engine Smackdown" session, Matt explained Caffeine as being like a car with a new engine, an infrastructure overhaul rather than an algorithm change. Matt said Caffeine will help Google index in seconds and that it should be active within a few weeks on one data center. That said, Caffeine won't roll out fully until after the holidays. Matt pointed out that Google is built for load balancing, and for that reason isolating individual IPs for Caffeine testing access is difficult. Matt also mentioned that AJAX SERPs and Caffeine aren't related, but that Google will continue testing AJAX SERPs.