You're talking about net gains in "coding tasks" productivity; I'm talking about productivity gains across the board.
My company deals with an insane number of customers who use ChatGPT to pre-debug their problems before coming to our support. Once they contact our support, they regurgitate LLM-generated BS to our support engineers thinking they're going to speed up the process. The only thing they're actually doing is generating noise that slows everyone down, because ChatGPT has absolutely no clue about our product and keeps sending them on wild goose chases. Sometimes they even lie, claiming "a colleague" steered them in this or that direction, when it's 100% obvious the whole thing was hallucinated and even written by an LLM.
I can't tell you how frustrating it is to read a 10-minute-long customer email only to realise it's an LLM hallucinating probable causes for a bug that takes two sentences to describe.