It is a known issue that a robots.txt file listing directories you don't want HUMANS to visit can be a security risk, simply because it amounts to a list of all the directories you are trying to keep people away from.
robots.txt is not security - it simply keeps robots away from certain areas of your site (at least, those robots that comply with the directive). To keep humans away, you need to implement actual file and directory level permissions.
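As a sketch of what "actual permissions" means, here is how a directory could be locked down at the web server level with Apache 2.4, regardless of what robots.txt says (the path is hypothetical):

```
<Directory "/var/www/site/private">
    Require all denied
</Directory>
```

With a rule like this in place, listing /private/ in robots.txt would disclose the name of the folder but nothing in it would be reachable.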
Having said that, claiming that the mere existence of a robots.txt file is a security risk goes too far. It's only a security risk if you list secret folders in it. If you have a robots.txt file that just says "allow robots" then there is no security risk at all. Likewise, if your robots.txt disallows areas like the cgi-bin and images folders, there are unlikely to be any issues, since you'd have to store your secret stuff in those folders for it to matter.
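For instance, a perfectly ordinary robots.txt along these lines (the folder names are just common defaults, not taken from any real site) discloses nothing sensitive:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
```

A scanner that flags this file as a vulnerability is reacting to its existence, not its content.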
In short, the security assessment is correct as far as it goes, but IMO unless they have actually looked at the robots.txt file and found directories that should not be in there, then the warning seems to me to be more of a marketing ploy than an honest assessment (kind of like a sleazy SEO warning you that your website can't be found in some directories, omitting the fact that the directories are full of spam or owned by the SEO themselves).
A good security check does check the robots.txt file. But the existence of a robots.txt file by itself is NOT a security threat. It's the actual content that might be an issue.
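If you want to check the content yourself rather than trust a scanner's blanket warning, Python's standard library can parse the file for you. This is a minimal sketch using urllib.robotparser; the robots.txt content and the paths tested are hypothetical:

```python
from urllib import robotparser

# A hypothetical robots.txt, inlined for illustration.
robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Harmless public path: not covered by any Disallow rule.
print(rp.can_fetch("*", "/images/logo.png"))  # True

# A disallowed path - and the kind of entry worth a closer look,
# since "/admin/" hints at something the site owner wants hidden.
print(rp.can_fetch("*", "/admin/users"))      # False
```

The point of an audit is exactly this second case: a Disallow line only matters if the path it names is otherwise unprotected.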
Finally, I could list every secret directory in my file right up front without any problems, as long as I had file level security on them. The warning is only valid for unprotected directories, at which point I'd say that you have more serious problems than a robots.txt file.
A classic case is the robots.txt file for the US White House: http://www.whitehouse.gov/robots.txt
You'll see that it's huge and lists all sorts of juicy, apparently hacker-happy information about the site structure. But if you look closer, it basically only disallows indexing of the text-only version of webpages (preventing duplicate content) and a few other minor directories. Those same directories are open to the public and are not a security risk, because nobody there is dumb enough to put top-secret documents on the public website. I imagine that this robots.txt file would be flagged with code-red urgent paranoid warnings by this same security company, but no doubt the warnings would get nothing but a laugh from White House security.
Yes, it should be checked during a security audit. No, it's not usually a problem. Certainly not if it's a "standard" robots.txt file. It may only be an issue if someone got confused and tried to implement security with a robots.txt file, which is very rare. Like I said, they are not wrong for checking, but they should check further before sounding a "sky is falling"-type warning.
I hope that helps.