... where /page.cfm?id=123 shows that this information can change every second, and though it still ranks, it may miss out on higher rankings.
Sorry, but that simply isn't true, and hasn't ever been true since search engines first started indexing dynamic pages.
The search engines know that most dynamic pages don't change all that much unless the parameters change (in which case, they no longer consider it the same page). In the rare instance where a dynamic page does change frequently, such as a news site, the search engines simply visit more often and, indeed, frequently rank the page very highly.
Software that emulates static pages can arguably have indirect effects on SEO, but those come only through benefits accrued to the visitor. As such, there's no reason not to give the users of your software the option to use traditional dynamic URLs or seemingly static ones. But, in my opinion, it MUST be a choice, because you are right about at least one thing -- the search engines do react differently to dynamic pages than to static pages, not in ranking as you erroneously suggest, but rather in crawling.
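For anyone unfamiliar with how that "seemingly static" option usually works, here's a minimal sketch using Apache's mod_rewrite (purely illustrative -- the /page/123 path and the .cfm script name are just borrowed from the example above, not any particular product's scheme):

```apache
# Hypothetical rewrite: let visitors and spiders see /page/123
# while the server actually runs the dynamic /page.cfm?id=123.
RewriteEngine On

# Capture the numeric id from the "static-looking" path and pass
# it to the real script. QSA preserves any extra query parameters.
RewriteRule ^page/([0-9]+)$ /page.cfm?id=$1 [L,QSA]
```

The point is that this is a cosmetic mapping: the server does exactly the same dynamic work either way, which is precisely why hiding it from the spider matters.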
When a spider sees an obviously dynamic URL, it will intentionally slow down so as not to overwhelm the server with page requests. It knows the server is working much harder on those dynamic pages than it would on static pages, and it paces itself accordingly. Conversely, the spider can very quickly bring a server to its knees on even a moderately busy web site when you fool it into thinking all the pages are static.
As usual, there is no one-size-fits-all solution. There is no this-is-good or this-is-bad. Any site can rank highly with a well-designed dynamic URL. Not every site, however, can rank (or even survive) with a poorly designed static URL. Good software will offer choices, not ill-advised search engine myths.