Does Google's removal of the following paragraph mean that creating static copies of dynamic pages is no longer necessary?
"Consider creating static copies of dynamic pages. Although the Google index includes dynamic pages, they comprise a small portion of our index. If you suspect that your dynamically generated pages (such as URLs containing question marks) are causing problems for our crawler, you might create static copies of these pages. If you create static copies, don't forget to add your dynamic pages to your robots.txt file to prevent us from treating them as duplicates."
Until today, Google suggested creating static versions of dynamic pages because Googlebot had difficulty crawling dynamic URLs, especially URLs containing question marks or other symbols. To prevent duplicate content issues on sites serving both static and dynamic versions, Google suggested "disallowing" the dynamic version via robots.txt. While this tactic helped search engines, some would say maintaining both versions diluted PageRank as well as the relevance of anchor text in inbound links. Either way, it will be interesting to see how this move impacts Flash sites with a "static" version. Safe to say, Google is quickly advancing in its ability to crawl content!
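For sites that followed the old advice, the robots.txt entry blocking the dynamic version typically looked something like the sketch below. This is an illustrative example, not taken from Google's guidelines; the paths are hypothetical, and the `*?` wildcard pattern is one Googlebot-supported way to match URLs containing a question mark:

```
# Hypothetical robots.txt for a site serving static copies of dynamic pages.
# Block Googlebot from the dynamic (query-string) URLs so only the static
# copies get indexed, avoiding duplicate-content issues.
User-agent: Googlebot
Disallow: /*?
```

With the guideline paragraph now removed, entries like this may no longer be necessary, and leaving the dynamic URLs crawlable lets inbound links to either version count toward the same content.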