Hmm – it turns out that was the max-time setting in the script's config. Apparently at least part of the script still obeys it when executing from the command line.
New problem, though:
Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 52635936 bytes) in /home/webadmin/targetvacations.ca/html/generator/pages/class.utils.inc.php(2) : eval()'d code on line 29
This is with the script's maximum memory amount configured, in this case to 256MB. Shouldn't the script be able to handle this situation and chunk the data appropriately? Since I've seen this message after crawling completes regardless of how much memory I allow it to use (first 32MB, then 100MB, now 256MB), my suspicion is that the part of the script that checks how much memory it can use, and then tries to behave accordingly, is failing.
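For illustration, this is roughly the check-and-flush behaviour I would expect when a memory cap is configured (purely a sketch in PHP; the helper names, the $pages array, and the chunk file are hypothetical and not taken from the generator's actual code):

<?php
// Sketch: compare current usage against memory_limit and flush the
// accumulated pages to disk before the limit is reached.

function memory_limit_bytes() {
    $limit = ini_get('memory_limit');             // e.g. "256M", or "-1" for unlimited
    if ($limit === false || $limit === '-1') {
        return PHP_INT_MAX;
    }
    $value = (int) $limit;
    switch (strtoupper(substr($limit, -1))) {
        case 'G': return $value * 1024 * 1024 * 1024;
        case 'M': return $value * 1024 * 1024;
        case 'K': return $value * 1024;
        default:  return $value;                  // already expressed in bytes
    }
}

function maybe_flush(array &$pages, $chunkFile) {
    // Flush once we're within roughly 80% of the configured limit.
    if (memory_get_usage(true) > 0.8 * memory_limit_bytes()) {
        $fh = fopen($chunkFile, 'a');
        foreach ($pages as $url) {
            fwrite($fh, $url . "\n");             // one URL per line, easy to re-read later
        }
        fclose($fh);
        $pages = array();                         // release the in-memory copy
    }
}

With something along those lines called on every batch of discovered pages, peak memory usage would stay bounded no matter how large the site is, which is what I'd assume a configurable memory cap is supposed to mean.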
Keep in mind I'm running this from the command line, *not* the web interface. The last status line the crawl script printed before it dumped out its array of pages was:
111080 | 4896 | 3,351,548.8 | 8:28:30 | 0:22:24 | 5 | 128,379.9 Kb | 108911 | 80527 | 69
Any help you can offer would be greatly appreciated – in particular, any thoughts on how to get this to finally work and, perhaps more importantly, how to resume after it has failed in this way (I'm getting somewhat tired of waiting 8+ hours to find out whether the script will work).
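For what it's worth, the resume behaviour I'm hoping for would look something like the following (again just a sketch; the state file and the crawl_one_page() helper are made up, and I don't know whether the generator stores its state in any comparable way):

<?php
// Sketch: checkpoint the crawl state periodically so a crashed run can be
// resumed instead of starting the whole 8-hour crawl from scratch.

$stateFile = 'crawl-state.ser';                   // hypothetical checkpoint file

if (file_exists($stateFile)) {
    // Resume from the last checkpoint.
    $state = unserialize(file_get_contents($stateFile));
} else {
    // Fresh crawl.
    $state = array(
        'pending' => array('http://www.example.com/'),
        'visited' => array(),
    );
}

while (!empty($state['pending'])) {
    $url = array_shift($state['pending']);
    if (isset($state['visited'][$url])) {
        continue;
    }
    $newLinks = crawl_one_page($url);             // hypothetical helper
    $state['visited'][$url] = true;
    foreach ($newLinks as $link) {
        if (!isset($state['visited'][$link])) {
            $state['pending'][] = $link;
        }
    }
    // Checkpoint every 500 pages so a crash only loses a few minutes of work.
    if (count($state['visited']) % 500 === 0) {
        file_put_contents($stateFile, serialize($state));
    }
}

Even a simple checkpoint like that would save me from re-crawling 100,000+ pages every time the script dies near the end.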
Thanks.