I have the script set up and it crawls my site correctly, and a cron job executes it automatically each night, but it never completes in one go; it stops after about half an hour each time:
Updated on 2019-06-03 04:29:39, Time elapsed: 0:29:33,
Pages crawled: 1700 (1583 added in sitemap), Queued: 328, Depth level: 6
Current page: ...
I am fairly certain this has something to do with PHP and its maximum execution time, but the configuration screen shows that I can set those limits for this script:
Maximum pages:
This limits the number of pages crawled. Enter "0" for unlimited crawling.

Maximum depth level: 0
"0" for unlimited.

Maximum execution time, seconds: 0
"0" for unlimited.

Maximum memory usage, MB: 768
"0" for default. Note: might not work depending on the server configuration.

Save the script state every X seconds: 30
This option allows the crawl to be resumed if it was interrupted. "0" for no saves.
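My understanding is that the "0 = unlimited" settings would translate to something like the following PHP calls (this is my assumption of what the script does internally; I have not read its source, and the real option handling may differ):

```php
<?php
// Assumed mapping of the config screen values to PHP runtime settings.
set_time_limit(0);               // "Maximum execution time: 0" = no PHP time limit
ini_set('memory_limit', '768M'); // "Maximum memory usage: 768 MB"

// Sanity check: what limit is actually in effect right now?
// (When run from the CLI, max_execution_time is usually 0 by default.)
echo ini_get('max_execution_time'), "\n";
```

As I understand it, even with these set, limits imposed outside PHP (a hosting provider's process watchdog, a cron wrapper timeout, or a web-server timeout if the script runs through the web) could still kill the process, which is why I am confused about what is cutting it off at roughly 30 minutes.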
Given that I set the maximum execution time to 0, I would not expect it to stop after 30 minutes each time. Can you clarify why this is happening? Thank you.