I changed the memory limit to 256M and the execution time to 9000 seconds, and the internal server error is gone. However, the crawler is now timing out after only about 5 minutes, with roughly 350 of the 3,000 pages crawled.
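For reference, the change amounts to something like the following (a rough sketch, assuming the limits are raised from PHP itself; the generator may instead expect them in php.ini or its own config file, so the exact placement may differ):

    <?php
    // Hypothetical placement: raise PHP's memory and execution-time limits
    // before the crawl starts. The generator's own config may do this elsewhere.
    ini_set('memory_limit', '256M');        // memory ceiling for the crawl process
    ini_set('max_execution_time', '9000');  // script time limit, in seconds
    // set_time_limit(9000);                // equivalent alternative for the time limit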
The crawler stops at this point:
Links depth: 2
Current page: zip-code-search/85050-phoenix-az/
Pages added to sitemap: 156
Pages scanned: 160 (8,031.2 KB)
Pages left: 350 (+ 3611 queued for the next depth level)
Time passed: 0:04:31
Time left: 0:09:54
Memory usage: 3,227.4 Kb
Auto-restart monitoring: Tue Apr 26 2011 10:17:00 GMT-0700 (100 seconds since last update)
Tue Apr 26 2011 10:15:00 GMT-0700: resuming generator (120 seconds with no response)
Tue Apr 26 2011 10:13:00 GMT-0700: resuming generator (120 seconds with no response)
It waits 120 seconds with no response and then keeps repeating the same "resuming generator" message continually.
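In case it matters, here is a quick way to verify the new limits are actually in effect for the web server (a generic PHP check, not part of the generator itself):

    <?php
    // Temporary check: print the limits PHP is really running with.
    // Upload to the same server, open it in a browser, then delete it.
    echo 'memory_limit: ' . ini_get('memory_limit') . "\n";
    echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n";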