I analyze malware for a living, and I don't consider myself a strong developer, though I can obviously write programs. As an example, this last week I had a task that I would normally have handed to one of our developers, because they do it better and faster. Instead, I used GPT. I effectively took a couple days of work from someone. This is all fine now because it's new, but after a few years managers may start to evaluate people's output and question the utility of having duplicative skills on payroll.
To be clear, I'm not actually sure this will happen. I think there was similar discourse back when Excel was making the rounds, because it brought "computing" to everyone, and it did. However, it wasn't catastrophic to the developer community in terms of employment.
As a developer, I feel like it's the opposite. If anything, it's going to raise the bar by replacing low-skilled coders who are only good at cheaply producing piles of future tech debt; GPT4 will do that sort of work for 'free'. On the other hand, it's going to empower non-programmers to write or debug low-complexity scripts on their own, which is great.
I don't believe that any sort of "AI" short of AGI is going to replace developers doing "real work" on even medium complexity codebases. The reason for that is simple - GPT lacks logical reasoning capabilities. It can 'fake' them on the surface level, it might even fake them really well for extremely common problems, but as soon as you prod deeper or tack on an extra requirement or two, it starts spinning in circles indefinitely.
"Real" long-term software development is about taking (often poorly defined) requirements written in natural language, evaluating them in the scope of an existing software system, finding solutions that preserve behavior of the system (without introducing new bugs), and lastly implementing those changes in a way that follows the boundaries and abstractions defined in your system.
A single feature often touches many files at once. During this process you might spot a chance to introduce new abstractions or remove unnecessary ones. Unless you're planning on completely replacing developers with AI, the codebase must remain maintainable by mere mortals after you complete the next 50 revisions using AI that's mindlessly hacking away at your current codebase.
All of this requires logical reasoning capabilities that are, in my opinion, many orders of magnitude beyond GPT4's reach, and you need those to maintain a software system for many years, decades even. That's before we get into the day-to-day communication and coordination effort required to sync with stakeholders, clarify requirements, and so on.
If I had to compare it to the security field, I believe that GPT will replace developers as much as automated vulnerability scanning tools have replaced pentesters. ;)
I've asked experienced developer friends, and they say they would likely have similar discussions among themselves. Except now ChatGPT is doing that.
That scarcity wasn't a good thing. People are drastically more empowered with computing today, the barriers were dropped in a rather extreme way (compared to where it started), and that is a good thing.
These are conceptual groupings of advancement. One layer of complexity is going to be removed (broadening access), and there will be further layers of advancement and complexity that the skilled group can navigate to in order to best unleash what they're capable of.
Writing the vast majority of code isn't special or super difficult (it's time-consuming and mildly difficult at worst; quite obviously there are exceptions). There are over a million software developers in the US alone. It's time to add new layers of complexity and advancement. Those million software developers will have to evolve, again, myself included.
[edit] After reading the article... why couldn't Priya have used GPT and done the job much faster?