Hello,
1. The crawling time mainly depends on how fast your website generates its pages, since the generator crawls the site much like a search-engine bot does.
For instance, if it takes 1 second to retrieve each page, then 1,000 pages will be crawled in about 17 minutes.
Some real-world examples from large database-driven websites:
- about 35,000 URLs indexed: 1h 40min total generation time
- about 200,000 URLs indexed: 38 hours total generation time
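Since crawl time scales roughly linearly with page count, you can estimate it up front. A minimal sketch (the function name and the linear-scaling assumption are mine, not part of the generator):

```python
def estimated_crawl_minutes(pages: int, seconds_per_page: float) -> float:
    """Rough crawl-time estimate in minutes, assuming pages are
    fetched one after another at a steady rate, the same way a
    search-engine bot walks the site."""
    return pages * seconds_per_page / 60

# 1,000 pages at 1 second each:
print(round(estimated_crawl_minutes(1000, 1.0), 1))  # about 16.7 minutes
```

Real numbers will vary with server load and page complexity, so treat this as a lower bound.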
2. You should submit sitemap.xml in your Google webmaster account: [ External links are visible to logged in users only ]
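Besides submitting it in your webmaster account, you can also point crawlers at the sitemap from robots.txt using the standard Sitemap directive (replace the URL with your own):

```text
# robots.txt at the root of your site
Sitemap: http://www.example.com/sitemap.xml
```

This lets any crawler that reads robots.txt discover the sitemap automatically.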
3. You can regenerate the sitemap manually from time to time, or alternatively set up a scheduled task (cron job) in your hosting control panel to recreate it automatically.
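If your host gives you crontab access, a scheduled regeneration looks something like this (the script path and name are placeholders; use whatever command runs your generator):

```text
# Edit your crontab with: crontab -e
# Regenerate the sitemap every Sunday at 03:00
0 3 * * 0 php /path/to/generator/runcrawl.php
```

Pick a low-traffic hour, since the crawl adds load to your site while it runs.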