I don't have the reference handy, but in the first year Windows Vista was out, >50% of crashes were due to buggy nVidia drivers. Microsoft assumed (incorrectly) that its ecosystem would get its act together automatically, that nVidia would make solid drivers for the new Windows WDDM driver model.
The year before Windows 7 came out, I was working at a company (DivX ;-) making Windows software. We were getting contacted constantly by different testing groups at Microsoft. Some weeks three different people might contact me. Somehow they found my phone number? It didn't seem very efficient, but it was clear they were allocating huge resources to this.
They found unbelievably nuanced bugs (their QA was better than ours...). They wanted to know when we would have a fix. They wanted to make sure they didn't have a repeat of Vista. Vista SP1 was actually quite stable, but by then it was too late for the Vista brand.
With Windows 7 it seemed clear that the thing they cared about was that the software their users actually use would still work after the update. Right or wrong, it was very user-centric, because what user likes having their software break? Nobody cares why.