I find ChatGPT is surprisingly bad for rubber ducking: you already have to know what's right and wrong in order to verify its answers. It's striking how often it's wrong when I ask it things, maybe 90% of the time. That's probably because everything I ask is very specific, and since it's pattern matching on steroids rather than actual reasoning, it just can't handle it.
Agreed, you have to be very careful. The worst case I run into most often is hallucination: it suggests a JS library that doesn't exist, or methods on a real library that are completely fictional. My initial reaction of "wow, that's a perfect solution" quickly turns into "wow, what a waste of my time."