Ok, my problem is simple. I have a site with a lot of pages (1 million+), so I went ahead and created about 20 sitemaps: sitemap1, sitemap2, sitemap3, and so on.

The cron job itself runs just fine: it crawls about 66,000 pages and distributes them correctly across the first three sitemaps, respecting the per-file limit of 40,000 URLs or 9 MB. The problem is that when the cron job runs the next day, it replaces those first three files instead of resuming and adding new pages to the other sitemaps I created, such as sitemap3.xml, sitemap4.xml and so on. When I check Google, it shows that I only have 66,000 URLs, and this prevents my other pages from being indexed.

So my question is: what option do I use to resume the crawling on the last used file and stop the replacing of existing sitemaps? Thank you.
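For context, the resume behavior I'm after would look something like the sketch below. This is not my generator's actual code; the state file name and all function names are hypothetical, and for brevity it writes bare `<url>` entries without the surrounding `<urlset>` wrapper. The idea is just: persist the current sitemap index and URL count between cron runs, append to the last file, and rotate to the next file when the URL or size cap is hit, instead of starting over at sitemap1.

```python
import json
import os

MAX_URLS = 40_000        # per-file URL cap from my generator's settings
MAX_BYTES = 9 * 1024**2  # per-file size cap (9 MB)

STATE_FILE = "sitemap_state.json"  # hypothetical progress file kept between runs

def load_state():
    """Resume from the last sitemap index/count instead of starting over."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return json.load(f)
    return {"index": 1, "count": 0}

def save_state(state):
    with open(STATE_FILE, "w") as f:
        json.dump(state, f)

def append_urls(urls):
    """Append new URLs to sitemapN.xml files, rotating to the next file
    when a limit is hit, and never rewriting earlier files."""
    state = load_state()
    for url in urls:
        path = f"sitemap{state['index']}.xml"
        entry = f"  <url><loc>{url}</loc></url>\n"
        size = os.path.getsize(path) if os.path.exists(path) else 0
        # Rotate to the next sitemap file when either cap would be exceeded.
        if state["count"] >= MAX_URLS or size + len(entry) > MAX_BYTES:
            state["index"] += 1
            state["count"] = 0
            path = f"sitemap{state['index']}.xml"
        with open(path, "a") as f:
            f.write(entry)
        state["count"] += 1
    save_state(state)
```

With something like this, a second cron run would pick up where the first left off (e.g. continue filling sitemap3.xml) rather than clobbering sitemap1.xml.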