Hello,
1,2. According to the sitemap protocol, a sitemap may only contain URLs from the single host it is served on, so you will have to create two separate sitemaps, one per domain (see the sitemap sketch after this list).
3. Yes, you should allow it to be crawled.
4. Since the sitemap is only generated periodically, it will not affect overall server performance.
5. You can set up a scheduled task to generate the sitemap automatically, say weekly or daily (depending on how often the content changes); a sample cron entry is shown after this list.
6. It will create multiple sitemap files if the limit is exceeded. Only a single entry (sitemap.xml) is required in robots.txt though: the main index file will contain pointers to all the other sitemap files (see the index sketch below).
7. Yes, old URLs will be reindexed.
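For points 1 and 2, here is a minimal sketch of a per-domain sitemap; the URLs are hypothetical placeholders, and the key point is that every loc entry stays on the same host the sitemap is served from. The second domain would get an equivalent file at its own root:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- every URL below belongs to the host serving this file -->
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2024-01-01</lastmod>
      </url>
      <url>
        <loc>https://example.com/catalog</loc>
      </url>
    </urlset>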
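For point 5, assuming a Unix-like server and a hypothetical generator script at /var/www/scripts/generate_sitemap.sh, a weekly cron entry could look like this:

    # regenerate the sitemap every Sunday at 03:00 (script path is an assumption)
    0 3 * * 0 /var/www/scripts/generate_sitemap.sh

Switch the schedule to daily (e.g. 0 3 * * *) if the content changes more often.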
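For point 6, a sketch of what the main sitemap.xml index would contain once the generator has split the output (the child file names are assumptions):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://example.com/sitemap1.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://example.com/sitemap2.xml</loc>
      </sitemap>
    </sitemapindex>

And the single corresponding robots.txt line:

    Sitemap: https://example.com/sitemap.xml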