
Malcolm Coles pointed out a new feature in Google SERPs and posed some interesting questions last week. I don't think Google is treating "brand names" as site: operator queries; site: operator queries only return results for a single site. Either way, both Malcolm's and Matt's examples appear to be navigational and/or what are referred to as "named entity" queries.

Queries provide numerous signals that engines can use for insight into user intent. They are for the most part informational, navigational or transactional (action) in intent, but some queries fall into more than one category. These queries are often classed as named entities. The problem is that it's difficult to surmise intent from a single query that may have multiple interpretations. Google already holds related patents and recently purchased Metaweb, a company specializing in this field. One aspect that I haven't seen mentioned elsewhere in plain English is that company names, product names, organization names, brand names and/or combinations thereof are named entities. Named entities are easy to extract online because they are often capitalized. If leveraged properly, they could provide a number of associative signals that are well worth considering.
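
To make that concrete, here's a minimal Python sketch of capitalization-based extraction. It's a naive illustration of the idea, not anything Google has published; the regex and function name are my own:

```python
import re

# Naive candidate extractor: treat runs of capitalized words as possible
# named entities (brands, companies, products). Real systems layer
# dictionaries, context and statistical models on top of cues like this.
CANDIDATE = re.compile(r"\b[A-Z][\w'&-]*(?:\s+[A-Z][\w'&-]*)*")

def extract_candidates(text):
    """Return capitalized word runs that may be named entities."""
    return [m.group(0) for m in CANDIDATE.finditer(text)
            if len(m.group(0)) > 1]  # drop stray single letters like "I"

print(extract_candidates("Google recently purchased Metaweb, and Adobe filed a patent."))
# ['Google', 'Metaweb', 'Adobe']
```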

All that said, I'm not sure that is what is happening today. When statistical probability significantly favors one site over all others in terms of user intent, it makes sense that engines would return multiple results for that site instead of just two. Google may have introduced named entity elements, or may simply be handling navigational queries in a way that seems... well, more logical.

Adobe recently submitted a US patent application that relates to SEO for Flash/Flex, titled "EXPOSING RICH INTERNET APPLICATION CONTENT TO SEARCH ENGINES." Believe it or not, this patent application claims "shadowing" techniques like SWFObject and SOFA are at an "obvious disadvantage" for search. According to Adobe, shadowing textual content in rich Internet applications with textual content in (X)HTML results in duplication and other issues. For those not aware, duplicate content thins keyword relevancy and PageRank (Google's secret sauce), and requires a "duplication of effort" in producing "the actual rich Internet application as well as the shadow HTML." The application claims site management time is also increased because "changes in the rich Internet application must also be made to the shadow HTML, if that HTML code is to remain consistent with the rich Internet application."

To address these and other issues, Adobe's application proposes an invention that returns different content to users and search engines. According to the patent application, content will be "available through a rich Internet application to search engine queries" via a "translation module" that interfaces "between a Web crawler and a rich Internet application." It seems this approach isn't intended to provide alternative textual "eye wash" for users, but instead descriptions of the state, content and identifying URLs that are "important to Web crawler and/or search engines." According to Adobe, the "translation module may comprise pseudo HTML page code providing a description of the state which omits description of aspects of the state which are not useful to Web crawler and/or search engine." According to the application, "cached pages" will reflect a poorly formatted and quite likely only partially human-readable page.
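
To illustrate the concept (and only the concept), here's a toy Python rendition of what such a translation module might do. Every name below is my own hypothetical invention for illustration; Adobe's application describes the idea in far more general terms:

```python
# Toy sketch of the patent's "translation module" concept -- not Adobe's
# implementation. Browsers get the actual rich Internet application;
# crawlers get pseudo HTML describing the application state, its content
# and an identifying URL.
CRAWLER_AGENTS = ("Googlebot", "bingbot", "Slurp")

def is_crawler(user_agent):
    return any(bot in (user_agent or "") for bot in CRAWLER_AGENTS)

def pseudo_html_for_state(state):
    """Bare-bones description of one application state, nothing more."""
    return ("<html><body><h1>{title}</h1><p>{text}</p>"
            "<a href=\"{url}\">{url}</a></body></html>").format(**state)

def handle_request(user_agent, state, swf_embed_html):
    if is_crawler(user_agent):
        return pseudo_html_for_state(state)  # crawler sees the description
    return swf_embed_html                    # users see the real application

state = {"title": "Product Tour", "text": "Slide 3 of 10...",
         "url": "http://example.com/app#slide3"}
print(handle_request("Googlebot/2.1", state, "<object>...swf...</object>"))
```

Whether engines would treat output like that as legitimate or as cloaking is, of course, the interesting question.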

Google SSL search launched last week and provides users with an encrypted and secure connection when searching online via https://www.google.com. Secure Sockets Layer (SSL) is the same protocol used for securing a wide variety of Internet services, like secure shopping carts, and this new feature is designed to prevent information from being intercepted in transit or by third parties. One side effect is that the user's browser sends no referrer data when moving from a secure page to an insecure one. Without referrer data, web analytics can't accurately attribute those visits.
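
As a rough sketch of the analytics problem, consider how a simple tool might bucket incoming visits; the function and category names below are hypothetical:

```python
from urllib.parse import urlparse

def classify_visit(referrer):
    """Bucket a hit the way a simple analytics package might.

    A visitor arriving from Google SSL search sends no Referer header,
    so the visit is indistinguishable from someone typing the URL in.
    """
    if not referrer:
        return "direct"  # Google SSL searchers silently land here
    host = urlparse(referrer).netloc
    if "google." in host:
        return "google search"
    return "referral"

print(classify_visit("http://www.google.com/search?q=seo"))  # google search
print(classify_visit(None))                                  # direct
```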

Just because you don't see traffic from Google SSL doesn't mean it isn't there...

Matt Cutts first publicly mentioned speed as a potential ranking signal in November 2009, but speed has always been important at Google. Google's homepage, for example, is intentionally sparse so that it loads quickly. Larry Page recently said he wants to see pages "flip" online. Clearly the concept of speed and its importance at Google is nothing new. Robert Miller actually conducted the first research in this area over 40 years ago. According to Miller, when a computer takes more than one tenth of a second to respond, the user feels less and less in control. When Google and Bing conducted their own tests in 2008, the results were similar to what Miller had predicted: Bing experienced a 1.8% reduction in queries when slowed by 2.0 seconds, and Google experienced a 0.59% reduction in queries when slowed by 400 milliseconds. Bottom line, fast pages help marketers because users are far less likely to abandon fast-loading sites. Users expect pages to load quickly!

Page Speed was introduced as a ranking signal for English queries executed via Google.com nearly a month ago. It's defined as "the total time from the moment the user clicks on a link to your page until the time the entire page is loaded and displayed in a browser." In their first major update to the Webmaster Guidelines in over a year, Google recommends webmasters monitor site performance and optimize page load times on a regular basis.
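
For regular monitoring, you can at least approximate the network side of that definition from a script. A crude sketch follows (the URL is a placeholder, and note this misses rendering time entirely):

```python
import time
import urllib.request

def rough_fetch_timing(url):
    """Time-to-first-byte and full HTML download for one URL.

    This only approximates the network portion of Google's definition;
    it can't see images, scripts, CSS or rendering the way a real
    browser (or Google's Toolbar data) can.
    """
    start = time.time()
    with urllib.request.urlopen(url) as resp:
        resp.read(1)                 # first byte has arrived
        ttfb = time.time() - start
        resp.read()                  # rest of the HTML document
        total = time.time() - start
    return ttfb, total

ttfb, total = rough_fetch_timing("http://www.example.com/")
print("TTFB %.3fs, full HTML %.3fs" % (ttfb, total))
```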

Because Page Speed data is missing from Analytics and other tools, it's best to use the Site Performance feature in Google Webmaster Tools for regular monitoring and optimization. To view Site Performance data, you'll need to add and verify your site first. In Google Webmaster Tools, Site Performance data is processed in aggregate and without personally identifiable information. "Example" URLs are derived from actual Google Toolbar user queries, and as a result query parameters are removed. "Suggestions" are based on URLs crawled by Google and are not truncated. "Average" site load times are weighted by traffic. Site Performance data accuracy depends on the total number of data points, which can range from 1-100 data points (low) up to 1,000+ data points (high). As mentioned earlier, this data comes from Google Toolbar users with the PageRank feature enabled. While Google hasn't provided demographic information about those users, the data seems fairly reliable. If anything, Toolbar PageRank bias would point to more savvy users, so Google Webmaster Tools Site Performance data might skew faster than what typical users actually experience.

During our "Speed" session at SMX, Vanessa Fox and Maile Ohye (Google) seemed to agree that less than 4.0 seconds was a good rule of thumb, but still slow. According to Google AdWords, "the threshold for a 'slow-loading' landing page is the regional average plus three seconds." No matter how you slice it, this concept is fairly complex. Page Speed involves code, imagery, assets and a lot more, not to mention user perceptions of time. For example, the user's Internet connection (Google Fiber), their browser (Chrome), device (Nexus One), its processor (Snapdragon), the site's host and other issues all impact user perceptions of speed. Don't worry though: according to most expert estimates, 80% to 90% of page response time is spent on the front-end. At the end of the day, relevance is critical, but speed follows quickly behind.
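
The AdWords rule, at least, is simple arithmetic; a quick sketch with made-up numbers:

```python
def is_slow_landing_page(load_time_s, regional_avg_s):
    """AdWords' stated threshold: slow means regional average + 3 seconds."""
    return load_time_s > regional_avg_s + 3.0

# Hypothetical numbers, purely for illustration:
print(is_slow_landing_page(6.5, 2.8))  # True: 6.5s exceeds the 5.8s threshold
print(is_slow_landing_page(4.0, 2.8))  # False
```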


Live blog from the AIMA Atlanta, GA SEO event with Joshua Palau (RazorFish.com) and Vanessa Fox (NineByBlue.com), moderated by Lee Blankenship (SearchDiscovery.com). We'll be starting in a few minutes… This is a live blog, so please excuse typos and spelling errors :)

- Search is Search.
The majority of US searchers don't understand the difference between paid and organic results, so we should treat SEARCH that way. You don't have a paid or organic strategy, but rather a search strategy.

How do I force change?
- Search has evolved from being confined to 10 blue links.
- As an organization you don't have to explain, just search.
- Everything you do needs a search strategy because it is what your customers demand.
- Stop treating these as silos.

Instead of forcing change, it's possible to wait and adapt to change. For example, folks in Atlanta will search for ADT no matter what, so what are the critical success factors?

Critical Success Factors for Search
- Analytics
- Belief and Patience
- Executive Leadership

Where should I start and how can we work together?
Planning
- shared KW research strategy
- message consistency
- strategy

Execution
- learn to optimize response in paid and organic
- competitive monitoring

Optimization
- group performance
- top paid vs natural by rank and conversion

Business objectives, measurement strategy and competitive analysis should be your focus. For reporting success, set up a scorecard that shows "everything". Understand and explain how search fits into the digital strategy as well as the broader strategy.
Thanks Joshua!

Up next is Vanessa Fox....

Step back and focus on strategy. Changing search results provide a range of opportunities: local, news, maps, video and even Rich Snippet results. Vanessa mentions negative suggestions in the suggestion box. As search increasingly becomes the primary navigation, it's important to think about search as navigation. Vanessa illustrates the search pattern map we all follow, even when we're not aware of it. This becomes important when advertising because of the words users actually use in search queries.

Instead of focusing only on user intent, Vanessa suggests using personas to determine calls to action and…

How does all this tie into life?

Edityourown.com was a social media campaign during the 2009 Super Bowl, but while users searched for "edit your own," the site in question didn't rank for that query. In addition, little was spent on PPC for this site, and as a result the ad drove traffic to sites other than the one desired.

This year's Bridgestone Super Bowl ad included a link to bridgestonetire.com/superbowl, but the company didn't optimize for those terms or buy PPC in keeping with the branding. As a result, an opportunity was missed.

The big example, though, was the Dockers pants ad during the Super Bowl. In this case users conducted navigational searches for the URL and not the keyword. The problem: the URL was a redirect and not an actual page. As a result, spam marketers began implementing URLs containing the domain name used by Dockers to intercept traffic.

Another example was the movie 2012. In this case, billboards seem to have been driving traffic to sites other than the advertiser's.

Bottom line, every campaign needs to include search strategy...

Up next, questions:

Vanessa Fox - Google has long wanted to make pages faster, and what they have found is that slow pages degrade user experience. It's not that speed will boost rankings, but it could be a negative if pages are slow.

Joshua - At the core, what's good for organic is good for users. If sites are slow users will leave, but he can't say for sure if this will impact rankings.

Lee asks if mobile impacts speed.

Vanessa - isn't sure if mobile is part of the speed signal.

First question from Delta:
How can brands prepare for a campaign in advance?

Vanessa - plan ahead; consider a microsite or other options depending on the situation.

JP agrees that planning is big, but for companies this shouldn't be difficult given crawl rates.

VF suggests checking server logs to establish crawl rates and making determinations based on that data in order to spot spikes.
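
Something as small as the sketch below would do for a first pass; the log path and combined log format are assumptions about your own setup:

```python
import re
from collections import Counter

# Count Googlebot hits per day in an Apache/nginx combined-format access
# log -- a quick way to establish a baseline crawl rate and spot spikes.
DATE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

def googlebot_hits_per_day(path):
    hits = Counter()
    with open(path) as log:
        for line in log:
            if "Googlebot" in line:      # crude user-agent match
                m = DATE.search(line)
                if m:
                    hits[m.group(1)] += 1
    return hits

for day, count in sorted(googlebot_hits_per_day("access.log").items()):
    print(day, count)
```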

Next question from weatherchannel.com - What impact do you see microformats having in the future?

VF- Google uses markup in snippets but doesn't use it as a ranking factor due to scale. Using microformats for ranking could harm smaller sites.

JP- 4 stars is better than 3 stars and that is good marketing.

VF- sites with structured markup can see a 15% increase in click-through from organic SERPs.

Next question, what is the best way to get up to speed on analytics?
VF - suggests kaushik.net/avinash/ and his two books.

DON'T LIKE NEXT QUESTION, so skip it.....

Talk about what engines know about the intent of a searcher…
VF- Engines have lots of data about intent; this is where there seems to be a divide. Google sees so many searches that they can assess searcher intent. By offering options to log in, Google can provide more relevant results.
JP- Engines have more data than agencies, which provides advantages in targeting intent based on user query. While it's great this info is available, is it something users want? That is the question, it seems...

Next question: an old professor got us thinking about where previous inquiries were posted. According to the user, Google couldn't provide the answer. The question is about Google Buzz and what impact it has on marketing.

JP - I don't see Buzz getting much better. He feels that Facebook has won.

JP- isn't confined by engine. He says users can't be forced into one spot.

Vanessa - (YouTube is the number two engine, and she would suggest posting video to it because of Google SERPs.) Google is trying to pull social results into main SERPs.

Next question about social impact and walled garden SEO.
VF- Social media is opening up more, and as a result that content lives on.
JP- feels the same.

What do you think about the Microsoft Bing deal?
JP- says he feels like innovation had slowed, which forced folks to move forward. At the same time, he doesn't believe the answers are better and doesn't believe Bing will push Google off the top of the hill.

VF- Technical concerns: Yahoo has lost folks and integration will be a complex task. She doesn't see how Yahoo can continue unless Microsoft rebuilds.

What does it take to change search habits?
VF - text search is nearly solved. Why would folks want to change? We use search so much that we start to apply the same techniques in other places, for example Google Goggles, a search you don't think of as a search.

JP - What Google can learn from Tiger Woods: you may be all things, but you'll eventually screw up. He thinks privacy is a big place for Google to screw up.

Why couldn't Microsoft beat Google?
JP - asks for a show of hands for who uses Bing. No hands! Even with money, it's hard to pull the numbers...

VF- points out only a 1/2 point market share increase, even with all Microsoft is doing.

How is Google indexing sites with Flash?

VF- explains anchor URL issues....