Apart from putting in the contents of my robots.txt file, I set & as a character not to be parsed and added URLs to exclude. That stripped the result from roughly 5000 URLs down to about 1500, which at a guess should cover everything at least once at this point, and is about 50% higher than the number of pages Google currently has indexed for my site.
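For illustration only, here's a rough Python sketch of the kind of filtering that amounts to. It isn't the generator's actual config; the exclude list and sample URLs are made-up placeholders, and only the "drop anything containing &" rule and the robots.txt-style exclusions come from what I described above.

```python
# Rough sketch of the filtering described above, not the real generator config.
# EXCLUDED_PREFIXES and the sample URLs are hypothetical placeholders.
from urllib.parse import urlsplit

EXCLUDED_PREFIXES = (
    "/search.php",       # hypothetical robots.txt Disallow entries
    "/memberlist.php",
    "/calendar.php",
)

def keep_url(url: str) -> bool:
    """Return True if the URL should stay in the sitemap."""
    if "&" in url:       # & flagged as a character not to be parsed
        return False
    return not urlsplit(url).path.startswith(EXCLUDED_PREFIXES)

# A &page= thread URL and an excluded script get dropped; the plain thread
# URL survives, which is roughly how ~5000 raw URLs shrank to ~1500.
raw_urls = [
    "https://example.com/showthread.php?t=123",
    "https://example.com/showthread.php?t=123&page=2",
    "https://example.com/memberlist.php",
]
print([u for u in raw_urls if keep_url(u)])
# ['https://example.com/showthread.php?t=123']
```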
I'm counting on 2 factors in doing this:
Google is still going to crawl additional links from the pages found in the sitemap.xml, and
the showthread?p= links are going to cover the subsequent pages of multipage threads (see the sketch after this list).
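To make the second point concrete, here's a made-up sitemap fragment built in Python: one showthread?p= post permalink per page of a hypothetical three-page thread, which is how the later pages stay reachable even though the &page= URLs were filtered out. The post IDs and domain are invented for the example.

```python
# Illustration only: post IDs and domain are invented. One showthread?p=
# permalink per page keeps later pages of a thread reachable from the sitemap.
from xml.sax.saxutils import escape

page_permalinks = [
    "https://example.com/showthread.php?p=1001",  # lands on page 1
    "https://example.com/showthread.php?p=1021",  # lands on page 2
    "https://example.com/showthread.php?p=1041",  # lands on page 3
]

entries = "\n".join(
    "  <url><loc>%s</loc></url>" % escape(u) for u in page_permalinks
)
print('<?xml version="1.0" encoding="UTF-8"?>\n'
      '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
      + entries + "\n</urlset>")
```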
I take the view that Google will only index as much as they deem fit - the goal of the sitemap is to make sure they index the right stuff.
Hopefully I'm not too far off the mark.