It still doesn't stop crawling.
When the pass is completed, "crawling completed" is written to debug.log, and sitemap.xml and sitemap_images.xml are overwritten (the .gz versions too). Then debug.log gets truncated and crawling continues.
When I click the interrupt link in the web frontend, it displays 'The "stop" signal has been sent to a crawler.' The run button is shown again, as if the crawler had really stopped, but tail -f debug.log shows that it is still active.
If I upload an interrupt.log manually, it gets deleted, and debug.log is truncated and starts growing again.
When I click "run" with both "resume" and "Run in background" unchecked, the crawling tab shows progress somewhere around "Links depth: 3".
Am I the first to report this?