A dicey proposition at best, Bill. I would suggest doing it at the page level myself, by having your code parse the query string and slap a meta robots tag in there when your case is matched.
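Roughly this kind of logic, sketched in Python since I don't know your ASP setup — the parameter name Pageaction and value VIEWCATS are just borrowed from your example, so swap in whatever case you actually need to match:

```python
from urllib.parse import urlparse, parse_qs

def robots_meta_for(url):
    """Return a meta robots tag when the query string matches a case
    we don't want indexed; otherwise return an empty string."""
    params = parse_qs(urlparse(url).query)
    # Hypothetical rule: noindex any page reached via Pageaction=VIEWCATS
    if "VIEWCATS" in params.get("Pageaction", []):
        return '<meta name="robots" content="noindex,follow">'
    return ""

# Emit the result (if any) inside the page's <head>
print(robots_meta_for("/index.asp?Pageaction=VIEWCATS&Category=123"))
```

Because the tag travels with the page itself, it doesn't matter what order the variables show up in the URL — every engine that honors meta robots sees the same instruction.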
Here's the problem with trying to do the same thing via robots.txt:
1. The actual robots.txt standard doesn't care about or look at query strings. Some of the search engines do, but this "add-on" to the actual specs differs from engine to engine. So what works on one engine may cause some serious issues with another engine.
2. Something is definitely mixed up somewhere. It would behoove you to track down the root cause of the mix-up before attempting to patch it via robots.txt. The spiders aren't normally going to flip-flop variables like you've outlined, and normally people won't link to something different than what they get in their address bar. So you may first want to run something like Xenu's Link Sleuth against the site to see if you have an internal link or two that's swapping around the variables.
3. By the robots.txt standards there is an implied wildcard at the end of each disallow line.
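To illustrate point 3, these two lines match any URL that merely *begins* with the path — no trailing wildcard needed:

```
User-agent: *
Disallow: /index.asp
```

That blocks /index.asp itself, but also /index.asp?Category=123 and anything else starting with /index.asp, which is exactly why a disallow line aimed at one query-string combination tends to sweep up more than you bargained for.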
Now if you were asking about Googlebot specifically, they do offer pattern matching and have stated that the $ character can be used to stop the automatic wildcard that's natively a part of robots.txt. So in your example you could --in theory-- do something like:
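Something along these lines — and treat this as a sketch using the parameter from your example, not a tested rule:

```
User-agent: Googlebot
Disallow: /*Pageaction=VIEWCATS$
```

The * lets the match start anywhere in the URL, and the $ tells Googlebot the URL has to *end* there instead of applying its usual implied wildcard.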
The above, however, is open to a lot of misunderstanding on the part of the spiders. For instance, it would disallow something like /index.asp?Category=123&AnotherVar=Something&Pageaction=VIEWCATS, so it may end up disallowing pages you don't want it to. And it's Google only. Goodness knows what Yahoo!, MSN or any of the rest might make of it.
If it were me I'd first want to identify where the variable flip flop is happening and why, then fix that first.
And if I really, really wanted to get rid of the duplicates (which isn't all that big of an issue to begin with 90% of the time), I'd do it at the page-code level. Using robots.txt seems too risky to me, especially when using non-standard instructions.
<edit to add>
Jill types shorter.
That's one I would worry about, Jill. The implied wildcard at the end could catch all sorts of pages that one wouldn't necessarily want to exclude.