All joking aside, that's not a bad solution to the underlying problem. Fundamentally, much of the issue is unstructured data in shell pipelines, and JSON can be used to provide that structure. I'm seeing more and more tools emit or accept JSON. If you can hold your nose and ignore the performance overhead of repeatedly serializing and parsing JSON at every pipeline stage, it's a workable solution.
Years ago, a project idea I was really interested in for a while was writing a shell in Rust that works more like PowerShell.
Where I got stuck was the fundamentals: PowerShell leans heavily on the managed virtual machine, and on the shared memory space and typed objects it enables.
Languages like C, C++, and Rust don't have direct equivalents of this and would have to emulate it. At that point you get none of the benefits of Rust and all of the downsides. May as well just use pwsh and be done with it!
Since then I've noticed JSON filling this role of "object exchange" between distinct processes that may not even be written in the same programming language.
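A minimal sketch of that object-exchange pattern, assuming Python on both ends purely for convenience (the child process, field names, and values here are invented for illustration; in practice the two sides could be written in entirely different languages, since the contract is the JSON shape rather than a shared memory layout):

```python
import json
import subprocess
import sys

# A hypothetical child process that emits a JSON "object" on stdout.
# Any language able to print JSON could stand in for this.
child = subprocess.run(
    [sys.executable, "-c",
     'import json; print(json.dumps({"name": "eth0", "up": True, "mtu": 1500}))'],
    capture_output=True, text=True, check=True,
)

# The parent parses the text back into typed values: booleans stay
# booleans and numbers stay numbers, instead of everything arriving
# as whitespace-delimited strings to be re-split by position.
iface = json.loads(child.stdout)
print(iface["name"], iface["mtu"] + 100)
```

The cost, as noted above, is that every hop through the pipeline pays for a serialize/parse round trip that a shared in-memory object graph like PowerShell's avoids.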
I feel like this is going to be a bit like UTF-8 in Linux. Back in the early 2000s, Windows had proper Unicode support with UTF-16, and Linux had only codepages on top of ASCII. Instead of catching up by changing over to UTF-16, Linux adopted UTF-8 which in some ways gave it better Unicode support than Windows. I suspect JSON in the shell will be the same. Eventually there will be a Linux shell where everything is always JSON and it will work just like PowerShell, except it'll support multiple processes in multiple languages and hence leapfrog Windows.