Hi,
I've been getting pressure from my ISP about high resource usage caused by crawling bots, so I changed my robots.txt file to this:
Sitemap: [ External links are visible to forum administrators only ]
Sitemap: [ External links are visible to forum administrators only ]
User-agent: *
Allow: /sitemap.xml
Allow: /sitemap.xml.gz
Allow: /index.php
Allow: /index.html
Allow: /index.htm
Disallow: /
(I'm also running another sitemap application)
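For reference, here is how I understand a standards-compliant parser would read these rules. This is just a rough sketch using Python's built-in urllib.robotparser; the last page path is a made-up example, and the Sitemap: lines are omitted.

from urllib.robotparser import RobotFileParser

# The rules from my new robots.txt (Sitemap: lines omitted here).
rules = """\
User-agent: *
Allow: /sitemap.xml
Allow: /sitemap.xml.gz
Allow: /index.php
Allow: /index.html
Allow: /index.htm
Disallow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The explicitly allowed paths are still fetchable for any user agent...
print(rp.can_fetch("*", "/sitemap.xml"))   # True
print(rp.can_fetch("*", "/index.php"))     # True

# ...but every other path falls under Disallow: / and is blocked.
print(rp.can_fetch("*", "/some/other/page.html"))  # False

If the XML-Sitemaps crawler interprets the file the same way, I imagine it would only be able to reach those few allowed pages.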
I am hoping this cuts down on the crawling of unneeded areas of my site. However, after making this change, I tried to run XML-Sitemaps again, and it produced an error and did not index any pages. I have attached a screen capture of the error.
Can changing the robots.txt file affect the Sitemap application? Or is it something else I am overlooking?
I also downloaded a fresh copy of XML-Sitemaps in case the problem was something I had configured wrong, but the error is still the same.
Thank you in advance for your help!