* Being excited to be able to write the pieces of code they want, and not others. When you sit down to write code, you don't do everything from scratch; you lean on libraries, compilers, etc. Take the most annoying boilerplate bit of code you have to write now - would you be happy if a new language/framework popped up that eliminated it?
* Being excited to be able to solve more problems, because the code is at times a means to an end. I don't find writing CSS particularly fun, but I threw together a tool for making checklists for my kids in very little time using LLMs, and it handled all of the CSS for printing vs on-screen. I'm interested in solving an optimisation issue with testing right now, but not that interested in writing code to analyse test case perf changes, so I got the latter written for me in very little time and it's great. It wasn't really a choice of me or machine; I don't really have the time to focus on those tasks.
* Being excited that others can get the outcomes I've been able to get for at least some problems, without having to learn how to code.
As is tradition, to torture a car analogy, I could be excited for a car that autonomously drives me to the shops despite loving racing rally cars.
I personally don't like it when others who don't know how to code are able to get results using AI. I spent many years of my life and a small fortune learning scarce skills that everyone swore would be the last to ever be automated. Now, in a cruel twist of fate, those skills are being automated and there is seemingly no worthwhile job that can't be automated given enough investment. I am hopeful because the AI still has a long way to go, but even with the improvements it currently has, it might ultimately destroy the tech industry. I'm hoping that Say's Law proves true in this case, but even before the AI I was skeptical that we would find work for all the people trying to get into the software industry.
Those jobs still exist, but by and large they are either very niche or involve working with that tech in some way.
It is not wrong to feel down about the risk of so much time, training, etc rapidly losing value. But it's also true that change isn't inherently bad, and sometimes that includes adjusting how we use our skills and/or developing new ones. Nobody gets to be elite forever; their skills eventually become common or unneeded. So it's probably more helpful for yourself, and for those who may want to rely on you, to be forward-thinking rather than complaining. That doesn't mean you have to become pro-AI, but it may be helpful to be pragmatic and work where it can't.
As to work supply... I figure that will always be a problem as long as money is the main point of work. If people could just work where they specialize without so much concern for issues like not starving, maybe it would be different. I dunno.
Sounds like for many programmers AI is the new Visual Basic 6 :-P
AI is addressing that problem extremely well, but by putting up with it rather than actually solving it.
I don't want the boilerplate to be necessary in the first place.
There might have been people who were happy to write assembly and got bummed out about compilers. This AI stuff just feels like a new way to write code.
Inevitably AI will write things in ways you don't intend. So now you have to prompt it to change and hope it gets it right. Oh, it didn't. Prompt it again and maybe this time it will work. Will it get it right this time? And so on.
It's so good at a lot of things, but writing out whole features or apps, in my experience, seems good at first and then turns out to be a time sink of praying it will figure it out on the next prompt.
Maybe it's a skill issue for me, but I've gotten the most efficiency out of having it review code, pair with it on ideas and problems, etc. rather than actually writing the majority of code.
It is really like micro-managing a very junior, very forgetful dev who can read really fast (and who mostly remembers what they read, for a few minutes at least; they actually know more about something than you do if they have a manual about it on hand). Of course, if it's just writing the code once, you don't bother with the junior dev and write the code yourself. But if you want long-term efficiency, you put the time into your team (and the team here is the AI).
Not everyone needs to be excited about LLMs, in the same way that C++ developers don't need to be excited about Python.
In my mind, writing the prompt that generates the code is somewhat analogous to writing the code that generates the assembly. (Albeit more stochastically, the way psychology research might be analogous to biochemistry research.)
Different experts are still required at different layers of abstraction, though. I don't find it depressing when people show a preference for working at different levels of complexity/tooling, nor when they show excitement about the emergence of new tools that can enable their creativity to build, automate, and research. I think scorn in any direction is vapid.