Hello,
We have received your email. We started the procedure a long time ago, and the process is still running. We got the same fatal error again, saying the memory limit was reached. Almost 160,000 URLs have been fetched at depth level 16. When I got the error, I changed the crawler settings and set the depth level to 16, but the crawler still started depth level 17. We now want to stop the crawler and generate a sitemap for all the URLs fetched so far. How should I do this? We do not want to start the procedure again. Is there any workaround?
Thanks,