Thanks! That worked. However, the sitemap crawl has now been running for two days, and I've had to resume it twice. It reports 38,680 pages scanned so far. I just don't believe we have that many pages, and I suspect duplicate content, since this has happened to us before with other software. How do we go about tweaking the settings so that the duplicate pages get filtered out?