
Does Google's removal of the following paragraph mean that creating static copies of dynamic pages is no longer necessary?

"Consider creating static copies of dynamic pages. Although the Google index includes dynamic pages, they comprise a small portion of our index. If you suspect that your dynamically generated pages (such as URLs containing question marks) are causing problems for our crawler, you might create static copies of these pages. If you create static copies, don't forget to add your dynamic pages to your robots.txt file to prevent us from treating them as duplicates."

http://www.google.com/support.....py?answer=40349&ctx=sibling

Until today, Google suggested creating static versions of dynamic pages because Googlebot had difficulty crawling dynamic URLs, especially those containing question marks and other symbols. To prevent duplicate content issues on sites serving both static and dynamic versions, Google suggested "disallowing" the dynamic version via robots.txt. While this tactic helped the engines, some would argue that maintaining both versions thinned PageRank as well as the relevancy of anchor text in inbound links. Either way, it will be interesting to see how this move impacts Flash sites that offer a "static" version. Safe to say, Google is quickly advancing in its ability to crawl content!
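For reference, the kind of robots.txt rule Google's old advice implied would look something like this minimal sketch. The wildcard pattern is my own illustration, not Google's wording, and it would need to match your actual URL structure before you used it:

  User-agent: Googlebot
  Disallow: /*?

The "/*?" pattern blocks Googlebot from any URL containing a question mark, i.e. the dynamic versions, while leaving the static copies crawlable.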

By now you probably know Google indexes text content within Flash thanks to Google's new algorithm for Flash. In case you missed it, Google recently updated its original announcement to include additional details about how it handles Flash files.

SWFObject - Google confirms that, as of the July 1st launch of the new algorithm, Googlebot did not execute JavaScript such as that used with SWFObject.

SWFObject - Google confirms "now" rolling out an update that enables the execution of JavaScript in order to support sites using SWFObject and SWFObject 2.

According to Google, "If the Flash file is embedded in HTML (as many of the Flash files we find are), its content is associated with the parent URL and indexed as single entity." I found this isn't the case using a variation of the example Google cites. The following query finds the same content indexed at three URLs: two SWFs and one HTML page:
http://www.google.com/search?q=%22NASA%27s+Hubble,+...

http://www.jpl.nasa.gov/multimedia/deep-impact/index.swf
http://www.nasa.gov/externalflash/deepimpact_flash/index.swf
http://www.jpl.nasa.gov/multimedia/deep-impact/index-flash.html

Additional:

Deep Linking - Google doesn't support deep linking. "In the case of Flash, the ability to deep link will require additional functionality in Flash with which we integrate."

Non-Malicious Duplicate Content - Flash sites containing "alternative" content in HTML might be detected as having duplicate content.

Googlebot, it seems, still ignores #anchors but will soon crawl SWFObject. Given that Googlebot can, or soon will, crawl SWFObject sites, major reworks should be considered for "deep linking" sites where corresponding "alternative" HTML content pages contain the same Flash file and are accessible via multiple URLs.

ActionScript - Google confirms indexing ActionScript 1, ActionScript 2 and ActionScript 3, while stating that the ActionScript itself shouldn't be exposed to users.

External Text (XML) - Google confirms that content loaded dynamically into Flash from external resources (such as XML files) isn't associated with the parent URL.

While this is a great development for Flash developers moving forward, lots of education may be required.

Some sites claim to offer a new, innovative solution for "Flash SEO" called SWFAddress, aka "Deep Links", "Deep Linking" and/or other names. Unfortunately, these sites are promoting techniques based on SWFAddress, a method for Flash SEO that I've blogged about, taken the creator to task on, and that even he admits is sub-optimal in terms of SEO!

"The case is valid. Deep links with anchors published on other sites will tell Google to index the start page."
- Google Groups

Not to worry though, because identifying sites using SWFAddress is easy! If a Flash site uses #anchors (a pound sign) in its URLs, chances are it's using SWFAddress. The problem with SWFAddress is that it functions in only one direction, so to speak.

Google ignores the #anchor in SWFAddress URLs as well as the entire path following the #anchor in the URL. When users with Flash cut and paste a link from their address bar into their blog, Digg and/or LinkedIn, Google ignores everything starting with the #anchor and, as a result, misallocates keyword relevancy and PageRank to the "start page".
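To make that concrete with a hypothetical URL (not taken from any real site), compare what a visitor shares with what Google actually counts:

  Link the visitor shares:  http://www.example.com/#/portfolio/project-one
  Link Google credits:      http://www.example.com/

Everything from the # onward is a fragment that never reaches the server, so the inbound link's anchor text and PageRank land on the start page instead of the deep "page" inside the Flash site.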

Some credit where it's due would have been nice, but either way I commend the good folks at Asual for their efforts as well as the new "COPY LINK TO CLIPBOARD" link in the footer of their SEO sample pages.