Gee, I can't believe I didn't include the error. The script errors out because it runs out of memory. I have raised the limit to my maximum allowed, which is 128MB. I am running the script right now: it's at 110MB with only 24,000 of the roughly 300,000 pages in the queue written to the sitemap so far, so I don't think it will finish the crawl before memory is exhausted again. On this run I have the script set to pause for 90 seconds every 1,000 pages, and that did help get it this far.
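For anyone reading later, here is roughly what I mean by the pause setting. This is just a minimal sketch in Python, not the actual script's code, and it assumes the sitemap entries can be streamed straight to the file instead of being held in memory (the function and constant names are made up for illustration):

```python
import time

PAGES_PER_PAUSE = 1000   # pause after this many pages (my current setting)
PAUSE_SECONDS = 90       # length of each pause (my current setting)

def write_sitemap(urls, path="sitemap.xml"):
    """Append each crawled URL to the sitemap file as it arrives,
    pausing every PAGES_PER_PAUSE pages, so the full result set
    never has to sit in memory at once."""
    with open(path, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for count, url in enumerate(urls, start=1):
            f.write(f"  <url><loc>{url}</loc></url>\n")
            if count % PAGES_PER_PAUSE == 0:
                time.sleep(PAUSE_SECONDS)  # throttle between batches
        f.write('</urlset>\n')
```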
The reason for the post was that the last time I ran the script, it used 55MB of memory for only 9,000 pages. That seems like a lot: it works out to roughly 6KB per page, so the full 300,000 pages would need somewhere around 1.8GB. I have tried both crawl methods listed at the bottom of the config page.
Thanks