Hi,
New to this today, and I have the same problem as this - I've noticed it's also reported in many other posts.
The answer always seems to be a private PM to the customer.
Is there a public fix for this that we can all share?
My problem is that it stops at only 784 pages, and when I kick it in the guts again it restarts about 30 pages below the number it stopped at, works its way back up to that number, and then stops again.
I worked around it by resetting the maximum number of pages to 784 and got it to write a sitemap, but this is no good as I need to get up to 50,000.
Also, whilst on this subject, is there a way (or could there be a way) of getting this to write the sitemap after every 50 or 100 pages, so that when the thing kicks the bucket we at least have a sitemap up to the last 50 pages? It seems a waste of time and resources to go through all this only to stop at a certain point and have nothing to show for it.
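Something along these lines is what I have in mind - just a rough sketch of the idea, not the script's actual code, and crawl_pages() here only stands in for whatever the real crawler loop is:

<?php
// Flush the collected URLs to sitemap.xml every N pages, so a crash
// still leaves a usable file behind (sketch only, not the generator's code).

function write_sitemap(array $urls, string $file): void
{
    $xml  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
    $xml .= "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
    foreach ($urls as $url) {
        $xml .= "  <url><loc>" . htmlspecialchars($url, ENT_XML1) . "</loc></url>\n";
    }
    $xml .= "</urlset>\n";
    file_put_contents($file, $xml);
}

$flushEvery = 100;   // write the file out every 100 crawled pages
$crawled    = [];

foreach (crawl_pages() as $url) {   // crawl_pages() is a placeholder for the real crawl loop
    $crawled[] = $url;
    if (count($crawled) % $flushEvery === 0) {
        write_sitemap($crawled, 'sitemap.xml');   // partial sitemap survives a crash
    }
}
write_sitemap($crawled, 'sitemap.xml');   // final write when the crawl finishes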
regards
ET
I'm also having a similar problem. The crawler adds around 42,000 pages to the sitemap with over 200,000 pending in the queue, and then it stops at around that point. When I go to restart it, it starts from the beginning. Yes, I selected all the options when I restarted. I've done this a number of times in the past and never had this problem before.
php.ini file settings:
max_execution_time = 9000 ; Maximum execution time of each script, in seconds
max_input_time = -1 ; Maximum amount of time each script may spend parsing request data
memory_limit = 1024M ; Maximum amount of memory a script may consume (8MB)
progress state storage type: var_export
Save the script state, every X seconds: 180
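For what it's worth, the same limits can also be forced from inside a PHP script at runtime (assuming the host allows ini_set overrides), which is a quick way to check whether this php.ini is actually the one the web server is reading:

<?php
// Same limits as my php.ini, but set at runtime, in case the host
// ignores or overrides the ini file.
ini_set('memory_limit', '1024M');       // allow up to 1 GB for the crawl
ini_set('max_execution_time', '9000');  // 9000 seconds per request
set_time_limit(9000);                   // resets the timeout counter as well
ignore_user_abort(true);                // keep crawling if the browser disconnects

// Quick check that the overrides actually took effect
echo ini_get('memory_limit') . "\n";
echo ini_get('max_execution_time') . "\n";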