I have, but the point is that I don't have to, because Nix gives me transparent distributed builds, which I use whenever I want to compile something big ("real-world", as you say) quickly. I split software into many small self-contained packages, delegate the compilation to remote machines configured for fast builds, and then just pull the finished binaries onto my laptop (https://nixos.org/manual/nix/unstable/advanced-topics/distri...). Subsequent changes to the smaller packages are rebuilt incrementally.
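For the curious, the remote-builder setup really is just a few lines of configuration. A minimal sketch, based on the format documented in the Nix manual (the hostname, user, and key path below are placeholders):

```
# /etc/nix/machines -- one remote builder per line:
# URI                           platform      SSH key                 maxJobs  speedFactor  features
ssh-ng://builder@build.example  x86_64-linux  /root/.ssh/id_ed25519   8        2            big-parallel

# Or ad hoc, for a single invocation:
$ nix build --builders 'ssh://builder@build.example x86_64-linux'
```

With that in place, `nix build` forwards derivations to the remote machine and copies the resulting store paths back, so the laptop never has to do the heavy compilation itself.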
My point, and the answer to your question ("And why is this state of affairs tolerated for decades?"), is that GHC doesn't have to implement its own version of dev-environment bootstrapping, distributed builds, and other niceties as part of its core tooling, because these problems are already solved at another level of the dev-infra stack. The GHC developers can just link to working solutions and explain them, for instance haskell.nix (https://github.com/input-output-hk/haskell.nix).