I'm testing Sitebulb right now (trial version). The crawling is kinda slow (I'm on 100 Mbit fiber). Why did you choose to build everything from scratch instead of making an application that uses the results from other crawlers/spiders (e.g. Screaming Frog) and just produces the audit reports?
EDIT:
And after about 3 hours of crawling, this is what I got (and no way to resume it):
>Audit Stopped!
>The audit stopped early because: Maximum Crawl depth limit of 50 reached
>WARNING: Audit Paused ! The audit is incomplete and did not finish properly.