Hello,
did you ever get your problems resolved? I'm on a shared hosting server, and my generator stops crawling about every 1-2 minutes. The screen says the crawl is in progress, but the pages-scanned count never changes to show that it is actually continuing. Then if I click the Crawling tab again, it goes back to the page with the checkbox to continue crawling if it was interrupted and the option to continue the last job. The date and time shown there is about 2 minutes after when I last started the crawl, even if that was an hour ago, so I know it has truly stopped.
The help desk at XML stated “it looks like your server configuration doesn't allow to run the script long enough to create full sitemap. Please try to increase memory_limit and max_execution_time settings in php configuration at your host (php.ini file) or contact hosting support regarding this.” XML recommended 3000 for max_execution_time and 128M for memory_limit. But my host says “We can set 120 seconds for the execution time and 64 MB for the memory limit.” So it just keeps stopping. Also, my host won't allow cron jobs.
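In case it helps clarify what I'm attempting: my rough understanding (just a guess on my part, and shared hosts often block or ignore this) is that the limits could in principle be raised per-script with standard PHP calls near the top of the generator script, instead of in php.ini, along these lines:
Code:
<?php
// Hypothetical per-script override -- my shared host may ignore or forbid this.
ini_set('memory_limit', '128M'); // the value XML support suggested
set_time_limit(3000);            // allow up to 3000 seconds of execution
?>
Is that the kind of change the help desk meant, or does it have to be done by the host in php.ini?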
I did see where you were told the following and it helped:
Hello,
I'd recommend adding this to the "Do not parse" option first:
Code:
tag/
feed/
Exactly where do you put these entries? In my configuration there are two fields labeled "Do not parse extensions:" and "Do not parse URLs:", but neither seems like the right place. Did you put them in one of those two fields?
Any feedback you or an admin can offer me is greatly appreciated.
Thanks,
Rex