I have recently purchased and installed the xml-sitemaps standalone version.
I set all the file permissions correctly, began crawling, and things seemed to go well.
Then I received a fatal error message, so I created an .htaccess file and raised MAX_INPUT, memory_limit, and max_execution to large values. Things continued to work after that.
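For reference, the .htaccess directives I added looked roughly like this (assuming Apache with mod_php; I believe MAX_INPUT and max_execution correspond to PHP's max_input_time and max_execution_time settings, and the values below are just the large numbers I picked):

    php_value memory_limit 512M
    php_value max_execution_time 10000
    php_value max_input_time 10000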
Finally, I walked away and came back the next day.
Now accessing the crawl page takes forever. I checked the box to continue running in the background, as well as the box to resume the previous crawl. The page states that about 14,000 pages have been crawled and 9 pages remain.
I cannot get the crawler to resume. I have tried to start it over SSH using the command given for cron jobs, but this only prints a few lines that look like a basic PHP page header and then stops; nothing else happens. Please help; I need this done ASAP.
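The command I ran over SSH was something like this (the path is a placeholder for my actual install directory, and I am assuming runcrawl.php is the script the cron instructions point at):

    /usr/bin/php /path/to/generator/runcrawl.php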