Okay, then if I understand it correctly: when a spider goes through the site, it doesn't look at the first page, collect all the links, then hit all those links, collect their links, then hit those, etc. etc.?
It just goes down random link paths?
So an XML sitemap would say to a spider, "hey, at least find these links and index them"?
Just asking.
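For what it's worth, here's roughly what a sitemap hands to a spider: just a flat list of URLs you want it to know about (the URLs and dates below are made-up placeholders, not from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <url> entry is a page you're telling the spider exists -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/some-deep-page.html</loc>
    <lastmod>2010-01-15</lastmod>
  </url>
</urlset>
```

So it's not a command, more like a hint: "these pages exist, please consider crawling them," which is especially useful for pages with few or no internal links pointing at them.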