Oleg-
Thanks for the reply. I think you are right that a 'partial site map' is better than 'no site map'.
I talked to the web host where we have our dedicated virtual server and asked how high I could set the max memory in php.ini. I think I could crank it to a seemingly ridiculous amount like a gig or more; then I might not run out of memory during the site mapping. (That is what finally happened in yesterday's attempt after my first post: a fatal error for exceeding the 500MB limit I had set in php.ini.)
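For reference, this is the kind of change I mean in php.ini (1G is just my guess at a "ridiculous" value — I still need to confirm with the host that they allow it):

```ini
; php.ini — raise the per-script memory ceiling for the site-map run
; (was 500M, which the mapper exhausted; 1G is an untested guess)
memory_limit = 1G
```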
As I watched the site map build, I was confused by what it was identifying as 'Links Depth'. At times it said '5' even though my link depth seemed to be only 4 (e.g. domain/quotes/22189/harry-s-truman/shall-able-remove-suspicion — that's a 4, right?).
Also, while it was at depth 4 or 5, it showed it was indexing a page like "domain/quotes/keyword/funny", or even went back to a link such as "domain/quotes/author/maurice-chevalier", when it had already worked itself deeper at earlier points.
Does the link depth start at the root, or at the first level down from it?
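Here's how I've been counting depth, for what it's worth — just counting path segments, with the root as depth 0. That convention gives 4 for the Truman URL above; I don't know if the mapper counts the same way (it may start at 1, which would explain the '5'):

```python
from urllib.parse import urlparse

def link_depth(url):
    # Count non-empty path segments; the root "/" counts as depth 0.
    # This is one convention; the mapping tool may start at 1 instead.
    return len([seg for seg in urlparse(url).path.split("/") if seg])

print(link_depth("https://example.com/quotes/22189/harry-s-truman/shall-able-remove-suspicion"))  # → 4
```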
I'm asking because, optimally, I would like to build a site map for specific sections rather than crawl a maximum URL count down to a specified depth.
The site is well organized, so I'd like to index just:
- domain/quotes/author/{author-name} (and go no lower)
- domain/authors/type/{author-type} (and go no lower)
- domain/quotes/topic/{topic-name} (and go no lower)
How would I achieve that?
Would I set the "Parse ONLY" URLs field to domain/author/ (like that, specifically? I am terrible with server paths), and then set "Maximum depth level" to what?
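In case it helps show what I'm after, here is the behavior I'm hoping those two settings combine to give. This is purely my illustration — the prefixes, the depth limit of 3, and the function itself are my guesses at the intent, not the tool's actual logic:

```python
from urllib.parse import urlparse

# My hoped-for rule (hypothetical, not the tool's real behavior):
# index a URL only if it starts with one of these section prefixes
# AND its path depth does not exceed the section's own depth.
ALLOWED_PREFIXES = ("/quotes/author/", "/authors/type/", "/quotes/topic/")
MAX_DEPTH = 3  # e.g. /quotes/author/{author-name} is 3 segments deep

def should_index(url, prefixes=ALLOWED_PREFIXES, max_depth=MAX_DEPTH):
    path = urlparse(url).path
    depth = len([seg for seg in path.split("/") if seg])
    return path.startswith(prefixes) and depth <= max_depth

print(should_index("https://example.com/quotes/author/maurice-chevalier"))               # True
print(should_index("https://example.com/quotes/22189/harry-s-truman/shall-able-remove"))  # False
```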
I appreciate your help and advice!
Darren