Unlike artwork, precision and correctness are absolutely critical in coding.
Accomplishing that would amount to achieving general AI.
In the meantime, there are plenty of boilerplate ORMs and simplistic API template tools that make production of bog standard CRUD apps dead simple. Of course, they all have their drawbacks and trade-offs, and aren't always suitable. But I don't see the amount of software engineering work reducing as a result of these no-code, low-code tools, do you?
Bytecode -> Assembly -> C -> higher level languages -> AI-assisted higher-level languages
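As a toy illustration of two adjacent rungs on that ladder, the same one-line function exists simultaneously as high-level source and as the interpreter's lower-level bytecode. This is a sketch using Python's standard `dis` module; the function name is made up, and the exact opcodes vary by Python version:

```python
import dis

def add(a, b):
    # High-level rung: one line of Python source.
    return a + b

# Lower rung: the stack-machine bytecode the interpreter actually runs.
bytecode = [ins.opname for ins in dis.get_instructions(add)]
print(bytecode)  # e.g. a BINARY_ADD / BINARY_OP instruction, depending on version
```

Each rung trades away detail (registers, stack slots) for expressiveness; the claim above is that AI assistance is the next such trade.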
Above a certain threshold of ability, yes.
The same will hold true for designers. DALL-E-alikes will be integrated with the Adobe suite.
The most cutting edge designers will speak 50 variations of their ideas into images, then use their hard-earned granular skills to fine-tune the results.
They'll (with no code) train models in completely new, unique-to-them styles--in 2D, 3D, and motion.
Organizations will pay top dollar for designers who can rapidly infuse their brands with eye-catching material in unprecedented volume. Imitators will create and follow YouTube tutorials.
Mom & pop shops will have higher-fidelity marketing materials in half the time and at half the cost.
All will be ever as it was.
The space for "AI-assisted higher-level languages" sufficiently distinct from natural language is vanishingly small. Eventually you're just speaking natural language to the computer, which just about anyone can do (perhaps with some training).
AI that can write code from a natural language description doesn't help as much as you seem to think, given that writing such a description is often too much trouble to bother with even when humans (who obviously benefit from having one) are the ones writing the code.
Now, if the AI can actually interview stakeholders and come up with what the code needs to do...
But I am not convinced that is doable short of AGI. (AI assistants that improve the productivity of humans at that task, sure, but those expand the scope of economically viable automation projects rather than eliminating automators.)
At some point AI will become as powerful as companies.
And then AI will be able to sustain a positive feedback loop: creating more powerful company-like ecosystems that in turn create even more powerful ones. This process will be fundamentally limited by available power, and the sun can provide a lot of power. Eventually AI will be able to support a space economy, and then the only limit will be the universe.
We will be united with the AI; we already rely on it so much that it has become part of our extended minds.
So today, good AI applications include face detection, fingerprint recognition, and art generation: cases where you need to capture or generate the general gist of something without pixel-level precision.
Of course, programming might be under greater threat than we imagine. Nor can I claim that anyone holding that position is just plain _wrong_. But I do believe it would take an AI breakthrough that has yet to happen. That breakthrough would also have absolutely crazy consequences beyond programming, because then we would have "exact AI", and the thought of that boggles my mind for sure.
Claiming AIs are going to take over or destroy the world has been a basis of "AI safety" research since the 90s, but that isn't real research, it's a new religion run by Berkeley rationalists who read too many SF novels.
Evolution doesn't stop for anyone, don't think like a dinosaur.
You thought climate change was hard to hold back? Try holding back the invention of AI. The whole world is going to have to change, and some form of socialism/UBI will have to be accepted, however unpalatable.
But the reverse is not true: they won't be able to properly vet a piece of code generated by an AI, since that requires technical expertise. (You could argue that if the code produced the requisite set of outputs they would gain some marginal level of confidence, but they would never really know for sure without being able to understand the actual code.)