As for baking variance into the runtime, I think it is simply a bad idea, one so far adopted only by C++ and .NET, two languages/runtimes with notoriously poor interop (it's not just Scala; Python and Clojure have a similarly hard time on the CLR, as would any language not built specifically around .NET's variance model). Once a particular variance model is baked into the platform, it becomes impossible to share large amounts of library code and data across languages with different variance models. That is too high a price for a minor convenience.
Specialization for value types (which are invariant) is another matter, and, indeed, it is planned for Java. Perhaps some opt-in reification for variant types has its place, but not across the board. I am not aware of any other platform that has followed in .NET's misplaced footsteps in that regard. The platforms known for good interop -- Java, JS, and LLVM -- don't have reified generics.
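To make the contrast concrete, here is a minimal Java sketch of how variance stays a compile-time, use-site concern on the JVM: after erasure, every `List<T>` is the same runtime class, so each frontend language is free to impose its own variance rules on the same runtime types (the class and method names below are my own, just for illustration).

```java
import java.util.ArrayList;
import java.util.List;

public class VarianceDemo {
    // Use-site variance: the caller declares "? extends Number" here.
    // After erasure this parameter is just List, so other JVM languages
    // (Scala, Kotlin, Clojure) can layer their own variance models on
    // top of the very same runtime type.
    static double sum(List<? extends Number> xs) {
        double total = 0;
        for (Number n : xs) total += n.doubleValue();
        return total;
    }

    public static void main(String[] args) {
        // List<Integer> is accepted as List<? extends Number> purely at
        // compile time; the JVM never sees the type argument.
        System.out.println(sum(List.of(1, 2, 3))); // 6.0

        // Erasure in action: a List<Integer> and a List<String> share
        // one runtime class, so no variance model is baked in below
        // the language level.
        System.out.println(new ArrayList<Integer>().getClass()
                == new ArrayList<String>().getClass()); // true
    }
}
```

On the CLR, by contrast, `List<int>` and `List<string>` are distinct runtime types and `out`/`in` variance is checked by the runtime itself, which is exactly the property that frontend languages with different variance models cannot paper over.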
What's worse is that it's a mistake that cannot be unmade, nor worked around at the frontend-language level. Even Java's big mistakes (such as finalizers, the way serialization is implemented, nullability, and native monitors) are much more easily fixed.