To disallow specific directories or files, use the following code:
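(For illustration, a minimal robots.txt along those lines; the directory and file names here are hypothetical, but note the blocked /css/ directory, which is the sort of entry I'm getting at below:)

    User-agent: *
    # hypothetical entries - block two directories and one file
    Disallow: /private/
    Disallow: /css/
    Disallow: /old-page.html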
If I were a search engine spider, the above robots.txt would immediately make me suspicious. On finding it, the spider has three options:
(i) Obey the robots.txt file, and potentially index a page that uses the blocked CSS file for tricks such as white-on-white text or shifting a layer (eg with div tags) over the top of the real text (see the sketch after this list).
(ii) Disobey the robots.txt file in order to look at all the code on each page.
(iii) Give each of the pages the 'kiss of death' flag so that they won't appear in the SERPs.
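(For what it's worth, here is a rough sketch of the kind of hidden-text trick I mean in option (i); the file and class names are made up. If /css/ is disallowed in robots.txt, an obedient spider never fetches the stylesheet and so never learns that the text is invisible to visitors:)

    <!-- page HTML - the stylesheet lives in the blocked /css/ directory -->
    <link rel="stylesheet" href="/css/styles.css">
    <div class="stuffing">keyword keyword keyword</div>

    /* /css/styles.css - white text on a white background, or a layer shifted over the real text */
    .stuffing { color: #ffffff; background-color: #ffffff; }
    .overlay  { position: absolute; top: 0; left: 0; }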
I think that the third is the easiest option. Even if they haven't done that yet, it is probably only a matter of time before they do.
Do you all agree on this?