In this case, the user deliberately changed the text of a fairly well-known riddle in order to "fool" the LLM. The original riddle reads: "A man and a son get into a car accident. They are rushed to the hospital and the boy requires surgery. The surgeon looks at the boy and says, 'I cannot perform surgery on him, he's my son!' Why is this the case?" Note that in this original version, there is no statement that the man is a surgeon. If you read the riddle as originally worded, ChatGPT's answer makes sense. The LLM was tricked because most of the text of the user's prompt (the modified riddle) closely matches the original version, which is prevalent across the internet (search the web for "I can't operate on him, he's my son" and you'll find many hits containing the full text of the original riddle). I admit that when I first read the modified version, I assumed it was the original, having seen it so many times before.
As an aside, this riddle is commonly used to demonstrate that people stereotypically assume all doctors are male. However, I think the reason many people stumble on the answer has more to do with the exceedingly low probability that a boy in a car accident would be rushed to the very hospital where his own mother (or father) happens to be the ER surgeon. The riddle also includes the seemingly unnecessary statement by the doctor that she can't operate on the boy because he is her son. Why not? Perhaps she would not be in the right frame of mind due to the emotional attachment, but my expectation is that her fight-or-flight response would kick in and she would perform the surgery to save her son's life. It's a red herring, in my opinion.