With robots.txt processing enabled, the sitemap generator exits immediately (in both 6.1 and 7.1).
My robots.txt uses a block of user-agents (maybe the parser doesn't like that), i.e.:
User-agent: *
Disallow: /
User-agent: googlebot
User-agent: bingbot
User-agent: AhrefsBot
User-agent: SemrushBot
User-agent: MJ12bot
Disallow:
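If it helps narrow things down: if the generator's crawler obeys the wildcard group (an assumption on my part, I don't know how its parser identifies itself), then any agent not named in the second block falls under "User-agent: * / Disallow: /" and is blocked from the whole site, which would line up with the 0 pages indexed in the report below. A minimal sketch of that reading, using Python's urllib.robotparser purely as an illustration of a standards-following parser (not the generator's own code; the bot names and example.com URL are just placeholders):

import urllib.robotparser

# The robots.txt exactly as posted above.
robots_txt = """\
User-agent: *
Disallow: /
User-agent: googlebot
User-agent: bingbot
User-agent: AhrefsBot
User-agent: SemrushBot
User-agent: MJ12bot
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Bots named in the second group get the empty Disallow, i.e. full access.
print(rp.can_fetch("googlebot", "http://example.com/"))       # True
# Any other agent (hypothetical name) falls back to the "*" group and is blocked.
print(rp.can_fetch("SomeSitemapBot", "http://example.com/"))  # False

So the grouped User-agent lines themselves are valid; the question is whether the generator matches one of the named agents or ends up in the "*" group.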
Here's the report:
============================================================
2015-07-15 09:08:01
(memory up: 1,567.2 Kb)
0 | 0 | 0.0 | 0:00:01 | 0:00:00 | 0 | 1,567.2 Kb | 0 | 0 | 1567
[ 1 - , 1]
NEXT LEVEL:1
({skipped - })
(memory: 1,509.9 Kb)
(saving dump)
Crawling completed
<h4>Completed</h4>Total pages indexed: 0
<br>Creating sitemaps...
and calculating changelog...
<div id="percprog"></div>
Creating HTML sitemap...<div id="percprog2"></div>sorting.. | | 0.0 | 0:00:00 | 0:00:00 | | | | | 0
| | 0.0 | 0:00:00 | 0:00:00 | | | | | 0
*** *** [ External links are visible to forum administrators only ]
*** time: 10.263481855392 ***
| | 0.0 | 0:00:00 | 0:00:00 | | | | | 0
| | 0.0 | 0:00:00 | 0:00:00 | | | | | 0
<br />Done, redirecting to sitemap view page. <script> top.location = 'index.php?op=view' </script>