As Site Audit is already a great crawler, it would be great if it could also start analysing server logs from different sources, such as uploaded files or FTP.
It could figure out whether a page has been crawled by Googlebot and other bots, the ratio of human traffic to bot crawls, how often each page is crawled, and so on.
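As a rough illustration of what that log analysis could look like, here is a minimal sketch that parses combined-format access log lines, separates bot crawls from regular traffic by user-agent string, and reports per-page crawl counts. The sample log lines, the bot list, and the output format are all hypothetical; a production tool would also verify bot identity via reverse DNS rather than trusting the user-agent alone.

```python
import re
from collections import Counter

# Hypothetical sample of combined-format access log lines; a real tool
# would read these from an uploaded file or an FTP source instead.
SAMPLE_LOG = """\
66.249.66.1 - - [10/May/2024:06:25:13 +0000] "GET /pricing HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [11/May/2024:07:02:44 +0000] "GET /pricing HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.7 - - [11/May/2024:09:14:02 +0000] "GET /pricing HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (Windows NT 10.0) Chrome/124.0"
40.77.167.5 - - [11/May/2024:10:31:55 +0000] "GET /blog HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
"""

# Extract the request path and the user-agent from each log line.
LOG_RE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" \d+ \d+ "[^"]*" "(?P<ua>[^"]*)"'
)
# User-agent substrings treated as crawlers; note that UA strings can be
# spoofed, so real verification would use reverse DNS lookups.
BOT_UAS = ("Googlebot", "bingbot")

bot_hits = Counter()    # crawls per (path, bot name)
human_hits = Counter()  # non-bot requests per path

for line in SAMPLE_LOG.splitlines():
    m = LOG_RE.search(line)
    if not m:
        continue
    path, ua = m.group("path"), m.group("ua")
    bot = next((b for b in BOT_UAS if b in ua), None)
    if bot:
        bot_hits[(path, bot)] += 1
    else:
        human_hits[path] += 1

for (path, bot), n in sorted(bot_hits.items()):
    ratio = human_hits[path] / n
    print(f"{path}: crawled {n}x by {bot}, traffic/crawl ratio {ratio:.1f}")
```

Running this over the sample prints one line per (page, bot) pair, e.g. that `/pricing` was crawled twice by Googlebot against one human visit. The same aggregation, run over real logs and joined with crawl data Site Audit already has, would answer exactly the questions above.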
This would be a huge step toward a Swiss Army knife SEO tool.