I just bought your product yesterday. It's running on one of my sites now. It's been going for about 9 hours and has added 76,000 pages to the sitemap.
It's at link depth 10 (and depth 11 is queued). This is a little confusing, as I know for a fact that no page on this site requires 10 clicks to reach. I doubt any page takes more than 4-5 clicks to get to from the homepage.
Now there is probably some kind of loop going on, and I'm sure you have deduplication code in place to prevent duplicate URLs from being added to the finished sitemap (I've sketched what I mean after my questions below). A couple of questions:
1. If I stop the scan now, I don't think it will generate the sitemap from what it has already scanned. Is there a way to do this, i.e. stop the scan and force it to generate the sitemap now?
2. Even though it's at link depth 10, I'm watching the "Pages added to sitemap" stat closely, and it is still growing. Slowly, but pages are still being added. I'm not quite sure what to make of this when taking the link depth into consideration.
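To show what I mean by deduplication, here's a rough sketch of how I imagine a breadth-first crawler with a global "seen" set works. This is just my guess at the logic, not your actual code, and the link graph and URLs below are made up:

from collections import deque

# Toy link graph standing in for real HTTP fetches (hypothetical URLs).
LINKS = {
    "/": ["/a", "/b"],
    "/a": ["/b", "/a?sort=asc"],       # a parameter variant looks like a "new" URL
    "/b": ["/", "/a"],                 # a loop back to the homepage
    "/a?sort=asc": ["/a?sort=desc"],   # each variant links to yet another variant
    "/a?sort=desc": ["/a?sort=asc"],
}

def crawl(start):
    seen = {start}               # global dedup: every URL is added exactly once
    queue = deque([(start, 0)])  # (url, click depth from the homepage)
    while queue:
        url, depth = queue.popleft()
        print(f"depth {depth}: {url}")
        for link in LINKS.get(url, []):
            if link not in seen:  # plain loops get cut off right here
                seen.add(link)
                queue.append((link, depth + 1))

crawl("/")

If the logic is anything like that, each page's depth is its shortest click distance from the homepage, and a plain loop couldn't push the depth past 4-5 clicks. So my best guess for depth 10 is that the site keeps generating fresh URL variations at every level (session IDs, sort/filter parameters, calendar links), which would also explain why pages are still trickling into the sitemap.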
Here are the current stats:
Links depth: 10
Pages added to sitemap: 75973
Pages scanned: 108460 (2,935,449.2 KB)
Pages left: 58232 (+ 26711 queued for the next depth level)
Time passed: 540:21
Time left: 290:06
Memory usage: 99,276.4 KB
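(For what it's worth, those numbers look internally consistent: 108,460 pages in about 540 minutes is roughly 200 pages per minute, and 58,232 pages left at that rate works out to about 290 minutes, which matches the "Time left" estimate.)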
Any comments or suggestions?
CT