Which is my point: this is not about replacement, it's about reducing the need and increasing supply.
LLMs absolutely help me pick up new skills faster, but if you can't hold a discussion about Rust and Svelte, no, you didn't learn them. I'm making a lot of progress learning deep learning, and ChatGPT has been critical for me to do so. But I still have to read books, research papers, and my framework's documentation. And it's still taking a long time. If I hadn't read the books, I wouldn't know what questions to ask or how to evaluate whether ChatGPT is completely off base (which happens all the time).
I fully understand your point and even agree with it to an extent. LLMs are just another layer of abstraction, like C is an abstraction over asm, which is an abstraction over binary, which is an abstraction over transistors... we all stand on the shoulders of giants. We write code to accomplish a task, not the other way around.
Yes.
Can you not?