Hi everybody (and the admin). I love the product, but my site has grown and I am now running into out-of-memory issues. I have tried toggling the var_export and other options, and I have set the memory limit in php.ini to 128M. I have filters in the crawl config, so the total number of URLs is around 25,000.
I have read a few posts about this same issue, apparently without any resolution.
I have also tried running it from the command line:
php runcrawl.php
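For anyone suggesting fixes: would overriding the limit just for the CLI run help? Something like the following is what I had in mind (assuming the crawler actually honors memory_limit; the 512M value is just an example):

```shell
# Raise memory_limit for this one run only, without touching php.ini
# (assumes the crawl script respects memory_limit at all).
php -d memory_limit=512M runcrawl.php

# Also worth checking which limit the CLI really uses, since the CLI
# often reads a different php.ini than the web server does:
php -r 'echo ini_get("memory_limit"), PHP_EOL;'
```

If the CLI reports a different memory_limit than the web server, that might explain why my php.ini change seemed to have no effect.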
This is how I am working around it for now: I have made copies of the generator and feed sub-URLs to each one, so each copy has less to crawl. But that leaves me with multiple sitemaps, which is too much of a compromise.
I need HELP!
Sam
P.S. Please excuse my login ID; I am not an admin in any way.