Hello motu_828,
I had over 70,000 pages on my site and it ran great once I configured it properly. I now have over 44,000 and it is still running great. I run mine through cron, crawling about 750 pages at a time with a multi-second gap. The full crawl takes 12+ hours, and I run it every other week. Before I tweaked my settings, I did have issues getting the crawl to finish, but that was years ago and it has run properly (on the server) ever since. It just ran today with the following results:
Created on: 14 March 2017, 16:39
Processing time: 12:18:50s
Pages indexed: 44266
Images sitemap: 32544 images
Video sitemap: 11 videos
News sitemap: 212 pages
RSS feed: 271 pages
Mobile sitemap: 44266 pages
Pages processed: 44781
Pages fetched: 44778
Sitemap files: 3
Crawled pages size: 3,178.731Mb
Network transfer time: 11:13:20s
Top memory usage: 0.00Mb
Depending on your control panel (cPanel, for example), you might be able to set up a cron job that runs the crawl once a week without shell/SSH access. That is how I set mine up initially, since running the crawl through the web interface seemed a little problematic.
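As a rough sketch, a weekly crontab entry for this kind of scheduled crawl could look like the line below. The PHP binary and script path are assumptions about your setup, not something taken from the generator's documentation; cPanel's Cron Jobs screen asks for the same schedule fields plus the command.

```
# Hypothetical example: start a crawl at 03:00 every Sunday.
# Replace the path and command with your generator's actual command-line runner.
0 3 * * 0 /usr/bin/php /home/youruser/public_html/generator/runcrawl.php > /dev/null 2>&1
```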
You need to tweak the settings to work best on your server; depending on how they are set, this script can place a high load on it. Think about how much memory you can allocate, how much load can be placed on the server at any given time, how you want to save sessions, what you want to log, and so on. You mentioned it stopping at the same spot: it is most likely hitting a memory limit at the same number of pages each time. Try generating less, like just the main XML file, and try not saving things like referring URLs, since that adds time and memory. If your server is running out of resources, it is because of the size of your site, your settings, and your server... NOT the software.
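To make the trade-off concrete, here is a minimal sketch in Python (not the generator's own code) of why a per-request delay and a memory cap make a crawl slower but keep the load on the server predictable. Every name in it (REQUEST_DELAY, MEMORY_LIMIT_MB, crawl, etc.) is a placeholder for illustration, not a setting from the actual script.

```python
import time
import resource  # Unix-only; used to check this process's memory footprint
from collections import deque
from urllib.parse import urljoin
from urllib.request import urlopen
from html.parser import HTMLParser

REQUEST_DELAY = 3       # seconds between fetches -- the "multi-second gap"
MEMORY_LIMIT_MB = 256   # stop the batch before the host kills the process
MAX_PAGES = 750         # pages per batch; a real run would resume from a saved queue


class LinkParser(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def memory_used_mb():
    """Peak resident memory of this process in megabytes (Linux reports KB)."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024


def crawl(start_url):
    queue = deque([start_url])
    seen = {start_url}
    fetched = 0
    while queue and fetched < MAX_PAGES:
        if memory_used_mb() > MEMORY_LIMIT_MB:
            print("Memory limit reached; stopping this batch.")
            break
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue
        fetched += 1
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith(start_url) and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
        time.sleep(REQUEST_DELAY)  # throttle so the web server is not hammered
    return seen


if __name__ == "__main__":
    pages = crawl("https://www.example.com/")
    print(f"Crawled {len(pages)} URLs")
```

The same idea applies to the real generator: a longer delay and a lower page/memory cap mean a longer total crawl (mine takes 12+ hours) in exchange for a server that stays responsive.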