Hi,
I have 60 million pages on my site. As far as I know, each sitemap can hold a maximum of 50,000 links, so 60 million pages would require at least 1,200 sitemaps. An index file can list a maximum of 1,000 sitemaps, so I need at least 2 index files (rough arithmetic below).
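For reference, this is how I worked out those numbers. It is just a rough sketch of the arithmetic, not the generator's own code, and the 1,000-sitemaps-per-index limit is the one I described above:

```php
<?php
// Rough sizing arithmetic (my own sketch, not generator code).
$totalPages       = 60000000; // 60 million pages
$linksPerSitemap  = 50000;    // maximum links per sitemap file
$sitemapsPerIndex = 1000;     // index capacity as described above

$sitemaps = (int) ceil($totalPages / $linksPerSitemap);  // 1,200 sitemap files
$indexes  = (int) ceil($sitemaps / $sitemapsPerIndex);   // 2 index files

echo "$sitemaps sitemaps, $indexes index files\n";
```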
The problem started with the size of each sitemap. Google only allows 10 MB per sitemap file, so I set 'xs_sm_size' => '30000'; each sitemap then holds about 30,000 links and is only about 6.5 MB.
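For clarity, this is the kind of setting I changed. 'xs_sm_size' is the actual option name from my config; the surrounding array layout here is only illustrative:

```php
<?php
// Illustrative snippet of the generator configuration array.
$config = array(
    'xs_sm_size' => '30000', // cap each sitemap at ~30,000 URLs (~6.5 MB in my case)
);
```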
I thought everything was set up correctly, but after I clicked the "crawl" button manually, the server hung for about an hour, then crawled for about 2 hours, and after 30,000 links the generator started crawling the first sitemap again (a duplicate of the first one?). It never moved on to the next sitemaps. I expected it to produce 2,000 sitemaps, but it has been running for a week and I only got 1 sitemap with 30,000 links.
Please help. I can pay USD $100 to anyone who gets this fixed. Based on my sitemap size, I need 30,000 links per sitemap, 2,000 sitemaps in total, and 2 index files.
I would appreciate it if anyone could help me. $$$
Regards,
Lee