Everything in my ./node_modules is unmocked, though.
On a previous project we used karma+jasmine, and the mocking became extremely cumbersome very quickly: setup/teardown for a given module could run 5-10x longer than the actual tests for that module. That may have been an artifact of applying unit tests to a codebase that had never been tested properly before (and then not having time to refactor).
b. It 'just worked' for me. I'm happily generating lcov / clover reports with jest 0.4.17 via `jest --coverage`. I even have it in my continuous build setup, so my coverage reports update every time I save. There's some weirdness where it will report a branch or function as ignored and I can't tell which branch it means ... but I think that may be an ignored/missed function in the transpiled source (I work in ES2015).
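For reference, the invocation is just the stock CLI flag; the watch-mode re-run on save comes from my own build setup, not from Jest itself (a sketch, not an exact copy of my config):

```shell
# one-off run: collects coverage and writes the lcov/clover reports
# into ./coverage alongside the terminal summary
jest --coverage

# in the continuous build, the same command is simply re-triggered
# by a file watcher on every save (watcher choice is up to you)
```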
c. Yes. And I love them. The default setup runs these in JSDom (headless). We're considering doing full end-to-end tests with selenium in a CI environment such as Jenkins, but haven't come to a decision yet, as much of that testing would duplicate the already-written component unit tests.
It is slowing down a little bit. My test command takes 21 seconds of wall-clock time right now (including coverage analysis and the webpack build), which is pretty slow considering the number of components I have. Some of the async modules run even slower; it hasn't been a sticking point so far, but it will be fairly soon. Test suite runtime doesn't grow linearly with the number of tests, though, because Jest runs the suites in parallel: the average component test takes 5.34s (the longest around 10s), the average non-component test takes 0.8s, and the tests sum to 63s, but the perceived time for the entire command is still that 21 seconds.