On the fiction side, editors are getting flooded with submissions written or assisted by LLMs, so competition has increased and editors have less time to evaluate submissions. It's all the way up and down the chain: online magazines, small presses, agents, big publishers. Not life-changing for me. I think it has been for some of them.
At work, one of the non-technical managers is writing code with ChatGPT. It's a little like working with a junior developer who doesn't know that they are junior. I don't mind that he's writing code, but he's only taking the interesting, low-hanging fruit. The difficult and tedious work is still left to the engineers. I wish he would manage instead, because my team and the project we're on have little oversight or guidance. And we're not likely to get it, because the lead manager is now a prompt engineer. Not life-changing for me, again.
This is actually one of my bigger short-term concerns: someone who doesn't have any real understanding or good intuition is given the illusion of competence, while only creating more problems and work for those who have actual competence.
Referencing the four stages of competence[1], this set of people, being Unconsciously Incompetent, is likely to cause far more harm than good in the medium term, in the name of efficiency or such.
If AI is increasing the flow of stories coming in, then shouldn't it also increase the rate of categorizing and rating stories by human editors?
I realize that an AI can't make a business decision, but maybe it can whittle down the pile for human editors to review, from say 1000 submissions to maybe the 25 most promising ones.
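To make the triage idea concrete, here's a minimal sketch. The `score_submission` function is a placeholder heuristic of my own invention; in a real pipeline it would presumably be an LLM call returning a "promise" rating, with humans reviewing only the top of the ranked pile.

```python
def score_submission(text: str) -> float:
    # Placeholder heuristic (hypothetical): vocabulary variety as a stand-in
    # for whatever signal an LLM rater would actually produce.
    words = text.split()
    return len(set(words)) / max(len(words), 1)

def triage(submissions: list[str], k: int = 25) -> list[str]:
    # Rank every submission by score and keep only the top k for human review.
    ranked = sorted(submissions, key=score_submission, reverse=True)
    return ranked[:k]

# Example: a pile of 1000 dummy submissions whittled down to 25.
pile = [f"story {i} " + "word " * (i % 50) for i in range(1000)]
shortlist = triage(pile, k=25)
print(len(shortlist))  # 25
```

The business decision (accept/reject) stays with the editors; the model only orders the queue.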
My customer asked me about writing a smart contract, but we did not have the capacity to handle it.
A few days after that, my customer got back to me, sharing a basic smart contract he had managed to create using ChatGPT. Since then, he has kept iterating on it; it took him a few weeks to get the functionality he wanted.
My customer explained to me that ChatGPT often got the details wrong, but he would come back and explain the problems to it, and ChatGPT proposed solutions until one of them was the correct fix.
Overall, my customer is very happy with this experience. He sees ChatGPT as a personal teacher and is now very motivated to keep learning programming.
While I haven't been affected negatively, I'm sure that we'll hear more stories like this one.
The prompting took around 3 minutes, was in English, and the output contract was in Indonesian by request. Not a replacement for a lawyer, but it does a great job at boilerplate.
I imagine at least some of this is caused by real or imagined productivity gains from AI coding tools. The typical executive might have gone from thinking "staff up at all costs!!" to "AI will mean fewer devs doing higher-value work". As these decisions have come from the top, the talk of wanting to "increase talent density" implies there's less room for junior, lower-skilled workers. And I can't help but think there are AI tools in the back of these execs' minds.
How this plays out is a good question. You need a way for people new to the industry to get real world experience, and express their talents.
This might just fix itself. After all if company A has AI, and so does B, then AI is table stakes, and people are still needed for the edge, whatever that might be, at least while people are still the customers :-). An example of where people are not the customers: stock market. Mostly bots and algorithms buying that stuff.
Look into what happened to Japanese woodblock printers around the time of the industrial revolution over there. It was a many-layered apprenticeship system where everyone had a separate job: the bottom layer, mostly children, would make prints for candy wrappers and such, all the way up to the masters, who handled the most valuable and skilled work.
The lowest layers got cut, replaced by machines, and the entire structure eventually toppled, including the work that couldn't be done by a machine: it simply wasn't requested anymore, or there weren't enough people coming up from below to fill the ranks.
It's somewhat doubtful whether software engineers will face the same fate, both because of AI's actual impact and because the job itself IS automation by nature. There could be political factors, however, that skew things otherwise.
So, great time to throw this out there: I love GRC, have experience in SOC 2, ISO 27001, and NIST 800-171, and I am always looking for my next position. I also helped set up our cost accounting system and can perform both project management and implementation of system integrations. Contact me through HN.