1.) The error is displayed under the Configuration tab:
An error occured
Sitemap file is not writable: /home/user/public_html/sitemap463.html
Sitemap file is not writable: /home/user/public_html/sitemap464.html
(and so on, for every file down to:)
Sitemap file is not writable: /home/user/public_html/sitemap559.html
As you can see from my sitemap, those pages seem to have been written OK:
[ External links are visible to forum administrators only ]
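For what it's worth, a quick way to see whether those files really are writable by the account the generator runs as is a plain `-w` test (a sketch; the paths are taken from the error messages above, and `check_writable` is just a helper name for illustration):

```shell
#!/bin/sh
# Sketch: report whether a file is writable by the current user --
# the same condition the "Sitemap file is not writable" error implies.
check_writable() {
    if [ -w "$1" ]; then
        echo "writable: $1"
    else
        echo "NOT writable: $1"
    fi
}

# Example path from the error messages above:
check_writable /home/user/public_html/sitemap463.html
```

Note that `-w` checks the *current* user's permissions, so to reproduce the generator's view this should be run as the web-server/PHP user, not as a shell login user.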
2.) That is what I'm talking about: I set 7200 seconds as the Max Execution Time under Configuration -> Crawler Limitations. I start the crawl via cron at 0100, but when I check on it at 1100 it is still running. I don't want it to run that long, because the whole time it runs it bogs the site down so badly that it's barely navigable for users, not to mention search engine bots.
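If the generator's own Max Execution Time setting isn't being honoured, one workaround (a sketch only; the script path is a placeholder for whatever the 0100 cron job actually runs) is to cap the wall-clock time from cron itself with coreutils `timeout`:

```shell
#!/bin/sh
# Sketch: enforce the 7200 s limit from cron rather than trusting the
# crawler's internal setting. The crontab entry would look like:
#
#   0 1 * * * timeout 7200 php /home/user/generate-sitemap.php
#
# ("generate-sitemap.php" is a placeholder, not the real script name.)
#
# `timeout` sends SIGTERM when the limit expires and exits with
# status 124, demonstrated here with a short limit on a long sleep:
timeout 1 sleep 5 || echo "timeout exit status: $?"
```

Checking for exit status 124 in the cron job's output is a cheap way to tell whether the crawl finished on its own or had to be killed.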