OK, I am running the SG on a new dynamic site which isn't released yet, so it should be no more than 20,000-odd pages at this time. I ran SG a few times with cautious config settings to gauge how long it might take to map the whole site. I hit memory issues a couple of times, but changed the settings and fixed them. It was handling 5,000 pages in a reasonable time, so I then lifted the restriction on the number of pages to crawl and let it loose as a background process.
It's now 3 days later and the crawl button is still not visible on the crawl page - I just see "[checkbox] Do not interrupt the script...", but no button. So I guess it's still running, and may have gone out of control crawling the whole internet, or got stuck in a loop somewhere, or something.
I stopped using SM a couple of years back because I had a lot of similar issues running it on two very large sites, and it started overloading the server on occasions.
I don't have access to reboot the server this site is on, so how can I abort the crawl so that I can re-adjust the settings and run it again? I tried renaming the script directory for a while, hoping it would crash the process and reset, but that didn't work. Is there a CLI I can use over SSH to monitor, control or abort the process? If so, where can I find out about usage?
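To illustrate what I'm after, this is the kind of SSH workflow I'm hoping exists - find the runaway process and signal it to stop. The snippet below just demos the pattern on a throwaway `sleep` process standing in for the crawler; I'm assuming the real crawler would show up as a `php` process under my user in `ps aux`, but I don't know that for sure:

```shell
# Stand-in for the runaway crawler process
sleep 300 &
PID=$!

# Check it's alive (for the real thing I'd hunt with: ps aux | grep '[p]hp')
ps -p "$PID" > /dev/null && echo "running"    # → running

# Ask it to stop gracefully first; 'kill -9' only as a last resort
kill "$PID"
wait "$PID" 2>/dev/null

# Confirm it's gone
ps -p "$PID" > /dev/null || echo "stopped"    # → stopped
```

Of course that only works if the process is owned by my SSH user and actually responds to signals - which is part of what I'm asking.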