Hello,
Sitemap Generator script doesn't have a separate option to limit crawling depth in addition to the existing limit on the maximum number of URLs. If you want to narrow the list of pages further, you can use the "Do not parse URLs" and "Exclude URLs" options for that.
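For instance, to keep a printable-pages folder and a temporary folder out of the sitemap, you could enter substrings like these in the "Exclude URLs" field, one per line (the folder names here are just hypothetical examples; the exact matching rules depend on your generator version, so check its documentation):

    printable/
    tmp/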
Every time you execute the generator, it starts crawling from scratch, since it has no information about which parts of your site have changed; that information can only be obtained by a full scan. You can set up the crawler to run less often though, as sketched below - a weekly sitemap refresh is fine, for instance.
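As a rough sketch, assuming the generator is installed on your server and started via a PHP script (the path and script name below are assumptions; adjust them to match your installation), a crontab entry like this would refresh the sitemap every Sunday at 3:00 AM:

    # run the sitemap crawler weekly, Sundays at 03:00
    0 3 * * 0 php /path/to/generator/runcrawl.php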
"My limited understanding is that I should have all my links in the sitemap. Is this true?"
Yes, the idea of a sitemap is to include all your links in it, so that search engine bots can crawl your site more easily.
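For reference, a sitemap is just an XML file listing one <loc> entry per page, following the standard sitemaps.org protocol (the example.com URLs below are placeholders for your own pages):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/about/</loc>
      </url>
    </urlset>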