
Adobe recently submitted a US patent application relating to SEO for Flash / Flex, titled "EXPOSING RICH INTERNET APPLICATION CONTENT TO SEARCH ENGINES." Believe it or not, the application claims "shadowing" techniques like SWFObject and SOFA are at an "obvious disadvantage" for search. According to Adobe, shadowing textual content in rich Internet applications with textual content in (X)HTML results in duplication and other issues. For those not aware, duplicate content thins keyword relevancy, dilutes Google's secret sauce (PageRank), and requires a "duplication of effort" in producing "the actual rich Internet application as well as the shadow HTML." The application also claims site management time is increased because "changes in the rich Internet application must also be made to the shadow HTML, if that HTML code is to remain consistent with the rich Internet application."
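For context, here's a minimal sketch of the "shadowing" approach the application criticizes, using SWFObject's dynamic publishing: the page ships plain HTML that crawlers can index, and SWFObject swaps it for the Flash movie when the plugin is available. The file name "gallery.swf" and the element id "rich-content" are hypothetical.

```typescript
// Assumes the SWFObject 2.x library is loaded on the page.
declare const swfobject: {
  embedSWF(
    swfUrl: string,
    targetId: string,
    width: string,
    height: string,
    flashVersion: string
  ): void;
};

// The #rich-content div contains the "shadow" HTML copy of the movie's text.
// Crawlers index that HTML; Flash-capable browsers get the SWF in its place.
// Keeping the two in sync is the "duplication of effort" Adobe complains about.
swfobject.embedSWF("gallery.swf", "rich-content", "600", "400", "9.0.0");
```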

To address these and other issues, Adobe's application proposes an invention that returns different content to users and search engines. According to the patent application, content will be "available through a rich Internet application to search engine queries" via a "translation module" that interfaces "between a Web crawler and a rich Internet application." It seems this application isn't intended to provide alternative textual "eye wash" for users, but instead descriptions of the state, content and identifying URLs that are "important to Web crawler and/or search engines." According to Adobe, the "translation module may comprise pseudo HTML page code providing a description of the state which omits description of aspects of the state which are not useful to Web crawler and/or search engine." According to the application, "cached pages" will reflect a poorly formatted and quite likely only partially human-readable page.
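To make the concept concrete, here's a rough sketch of the translation-module idea, not Adobe's actual implementation: a server hands crawlers a stripped-down pseudo-HTML description of the application's current state, while ordinary visitors get the normal Flash-embedding page. The user-agent check, the describeState() helper and the URLs are all hypothetical.

```typescript
import * as http from "http";

const CRAWLER_PATTERN = /googlebot|bingbot|slurp/i; // hypothetical crawler check

// Hypothetical stand-in for the RIA state a real translation module would expose.
function describeState(path: string): string {
  return `<html><body><h1>Application state for ${path}</h1>` +
         `<p>Textual content the crawler should index.</p></body></html>`;
}

http.createServer((req, res) => {
  const userAgent = req.headers["user-agent"] ?? "";
  res.writeHead(200, { "Content-Type": "text/html" });
  if (CRAWLER_PATTERN.test(userAgent)) {
    // Crawler: pseudo HTML describing the state, omitting presentation
    // details that are "not useful" to a search engine.
    res.end(describeState(req.url ?? "/"));
  } else {
    // Regular visitor: the page that embeds the rich Internet application.
    res.end(`<html><body><object data="app.swf" type="application/x-shockwave-flash"></object></body></html>`);
  }
}).listen(8080);
```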

Google SSL search launched last week and provides users with an encrypted and secure connection when searching online via https://www.google.com. Secure Sockets Layer (SSL) is the same protocol used to secure a wide variety of Internet services, like secure shopping carts, and this new feature is designed to prevent search information from being intercepted in transit or by third parties. One side effect: when a user clicks from an encrypted results page through to a plain HTTP site, the browser omits the referrer data. Without referrer data, web analytics can't accurately track where those users came from.
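Here's a minimal client-side sketch of why that traffic is hard to see: with no referrer, a naive analytics tag files the visit as "direct" instead of "search." The classifyVisit() helper is hypothetical, not how any particular analytics package works.

```typescript
// Runs in the browser on an HTTP page.
function classifyVisit(): "search" | "referral" | "direct" {
  const ref = document.referrer;
  if (!ref) {
    // Visits from Google SSL (and any other HTTPS -> HTTP navigation) land here.
    return "direct";
  }
  return /google\.|bing\.|yahoo\./i.test(ref) ? "search" : "referral";
}

console.log("This visit would be logged as:", classifyVisit());
```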

Just because you don't see traffic from Google SSL doesn't mean it isn't there...

Matt Cutts first mentioned speed publicly as a potential ranking signal in November 2009, but speed has always been important at Google. Google's homepage, for example, is intentionally sparse so that it loads quickly. Larry Page recently said he wants to see pages "flip" online. Clearly the concept of speed and its importance at Google is nothing new. Robert Miller actually conducted the first research in this area over 40 years ago. According to Miller, once a computer takes more than one tenth of a second to respond, the user feels less and less in control. When Google and Bing conducted their own tests in 2008, the results were similar to what Miller had predicted: Bing saw a 1.8% reduction in queries when slowed by 2.0 seconds, and Google saw a 0.59% reduction in queries when slowed by 400 milliseconds. Bottom line, fast pages help marketers because users are far less likely to abandon fast-loading sites. Users expect pages to load quickly!

Page Speed was introduced as a ranking signal for English queries executed via Google.com nearly a month ago. It's defined as "the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser." In their first major update to Webmaster Guidelines in over a year, Google recommends webmasters monitor site performance and optimize page load times on a regular basis.
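You can approximate that definition yourself in browsers that support the Navigation Timing API. This is just a sketch for spot checks; Google's own data comes from the Toolbar, not from this API.

```typescript
// Measures "click to fully loaded and displayed" for the current page.
window.addEventListener("load", () => {
  // Defer one tick so loadEventEnd has been recorded.
  setTimeout(() => {
    const t = performance.timing;
    const pageLoadMs = t.loadEventEnd - t.navigationStart;
    console.log(`Full page load took ${pageLoadMs} ms`);
  }, 0);
});
```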

Because Page Speed data is missing from Analytics and other tools, it's best to use the Site Performance feature in Google Webmaster Tools for regular monitoring and optimization. To view Site Performance data, you'll need to add and verify your site first. In Google Webmaster Tools, Site Performance data is processed in aggregate and without personally identifiable information. "Example" URLs are derived from actual Google Toolbar user queries, and as a result query parameters are removed. "Suggestions" are based on URLs crawled by Google and are not truncated. "Average" site load times are weighted by traffic. Site Performance data accuracy depends on the total number of data points, which can range from 1-100 data points (low) up to 1,000+ data points (high). As mentioned earlier, this data comes from Google Toolbar users with the PageRank feature enabled. While Google hasn't provided Toolbar PageRank user demographic information, the data seems fairly reliable. If anything, Toolbar PageRank bias would point to more savvy users, so Google Webmaster Tools Site Performance data might skew faster than actual.
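A quick sketch of what "weighted by traffic" means for that average: pages that get more visits count for more. The numbers and the weightedAverage() helper below are made up for illustration, not Google's formula.

```typescript
interface PageSample {
  url: string;
  loadTimeSec: number;
  visits: number;
}

// Traffic-weighted average load time across sampled pages.
function weightedAverage(samples: PageSample[]): number {
  const totalVisits = samples.reduce((sum, s) => sum + s.visits, 0);
  const weightedSum = samples.reduce((sum, s) => sum + s.loadTimeSec * s.visits, 0);
  return weightedSum / totalVisits;
}

const samples: PageSample[] = [
  { url: "/", loadTimeSec: 1.2, visits: 9000 },        // fast, heavily trafficked
  { url: "/reports", loadTimeSec: 8.5, visits: 100 },  // slow, rarely visited
];

// A simple average would be 4.85 s; traffic-weighted it is about 1.28 s.
console.log(weightedAverage(samples).toFixed(2), "seconds");
```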

During our "Speed" session at SMX, Vanessa Fox and Maile Ohye (Google) seemed to agree that less than 4.0 seconds was a good rule of thumb but still slow. According to Google AdWords, "the threshold for a 'slow-loading' landing page is the regional average plus three seconds." No matter how you slice it, this concept is fairly complex. Page Speed involves code, imagery, assets and a lot more not to mention user perceptions about time. For example, the user's internet connection (Google Fiber), their browser (Chrome), device(Nexus One), its processor (Snapdragon), the site's host and other issues all impact user perceptions about speed. Don't worry though, according to most expert estimates, 80% to 90% of page response time is spent on the front-end. At the end of the day, relevance is critical but speed follows quickly behind.

Speed Resources: