I just don’t see what point you are trying to make here. Yes, ChatGPT can give a wrong answer even when the input is perfectly reasonable, but that doesn’t mean it’s not a useful tool.
Think about how GPS can give a bad route, especially if there is construction or snow on the road.
Or how keyboard autocorrect sometimes changes what you wrote into something silly and wrong, even if you originally spelled the word correctly.
Or how OCR and speech-to-text software sometimes make mistakes.
Or how Google Translate sometimes uses unnatural or incorrect word choices.
Are these not useful tools even though they get things wrong?