It is great that the robots.txt file is used in sitemap generation; however, I would love to see it behave more like Googlebot in this case. According to the [ External links are visible to forum administrators only ], Googlebot supports basic pattern matching in robots.txt as well as the "Allow:" directive, both of which I find very useful. I would love to see these supported in the script, since I use both in my robots.txt file.
I realize that these are not in the web standards for valid robots.txt files, but many people still use them anyway.
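To illustrate what this could look like, here is a minimal sketch of Googlebot-style pattern matching and Allow/Disallow resolution. This is my own illustration, not the script's code: function names like `path_matches` and `is_allowed` are hypothetical, and the tie-breaking rules (longest matching pattern wins; Allow wins a length tie) follow my reading of Google's documented behavior.

```python
import re

def pattern_to_regex(pattern: str) -> "re.Pattern[str]":
    """Convert a Googlebot-style robots.txt path pattern to a regex.
    '*' matches any sequence of characters; a trailing '$' anchors
    the pattern to the end of the URL path."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as '.*'
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.compile(regex + ("$" if anchored else ""))

def path_matches(pattern: str, path: str) -> bool:
    """True if the pattern matches the start of the path (or the
    whole path, when the pattern ends with '$')."""
    return pattern_to_regex(pattern).match(path) is not None

def is_allowed(rules: list, path: str) -> bool:
    """Resolve a list of ("allow"|"disallow", pattern) rules for a path.
    Assumption: the most specific (longest) matching pattern wins, and
    on a length tie the less restrictive Allow rule wins."""
    best = None  # (pattern_length, directive)
    for directive, pattern in rules:
        if path_matches(pattern, path):
            key = len(pattern)
            if best is None or key > best[0] or (key == best[0] and directive == "allow"):
                best = (key, directive)
    # No matching rule means the path is allowed by default
    return best is None or best[1] == "allow"
```

For example, with `rules = [("disallow", "/private/"), ("allow", "/private/public/")]`, a path under `/private/public/` would be crawled while the rest of `/private/` would not, and `path_matches("/*.php$", "/index.php")` is true while the same pattern rejects `/index.php?x=1` because of the `$` anchor.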