Hi,
Thanks, but these are all dynamic pages with a header template. The site has over 1,000,000 pages, and I have already excluded 'example.php' in robots.txt, but then Google lists all these pages in the 'error section'. That is not the most terrible thing, but it would be nice if pages that are blocked for Google via robots.txt weren't in the sitemap; it would also save some KBs in the sitemap file.
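For reference, the robots.txt rule looks something like this (a minimal sketch, assuming the usual block-all-crawlers form; 'example.php' is just the placeholder from above, my actual file may differ):

    User-agent: *
    Disallow: /example.php

Google respects this and skips the page, but the generator still puts the URL in the sitemap, which is what causes the mismatch.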
Maybe for a future release: a 'do parse but do not include in sitemap' option?
Kind regards
PS: I am really satisfied with your software; it crawls these almost 1,000,000 pages without problems. Really cool.