Are you asking why a side effect that is actually an entire health problem on its own, is a problem? Especially when there is a replacement that doesn’t cause it?
A lazy doctor combined with a patient who lacks a clear understanding of how ChatGPT works and how to use it effectively could have disastrous results. A lazy doctor following the established advice for a condition by prescribing a medication that causes high blood sugar is orders of magnitude less dangerous than a lazy doctor who gives in to a crackpot medical plan that the patient has come up with using ChatGPT, without the rigour described by the comment we are discussing.
Spend any amount of time around people with chronic health conditions (online or offline) and you'll realise just how much damage could be done by encouraging them to use ChatGPT. Not because they are idiots but because they are desperate.
Beta blockers can be used for isolated treatment of high blood pressure, but they are also used for dual treatment of blood pressure and various heart issues (heart failure, stable angina, arrhythmias). If you have heart failure, beta blockers can reduce your relative annual mortality risk by about 25%.
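To make the 25% figure concrete: a relative reduction shrinks whatever risk you already have, so the absolute benefit depends on the baseline. A minimal sketch of that arithmetic, where the 10% baseline annual mortality is an assumed illustrative number, not a clinical figure:

```python
# Illustrative arithmetic only: relative vs. absolute risk reduction.
# The ~25% relative reduction is from the comment above; the 10%
# baseline annual mortality is an assumed number for demonstration.
baseline_annual_mortality = 0.10   # assumed baseline, not a clinical figure
relative_risk_reduction = 0.25     # ~25% relative reduction

# A relative reduction multiplies the baseline risk.
treated_mortality = baseline_annual_mortality * (1 - relative_risk_reduction)

# The absolute reduction is the difference between the two risks.
absolute_risk_reduction = baseline_annual_mortality - treated_mortality

print(round(treated_mortality, 3))        # 0.075
print(round(absolute_risk_reduction, 3))  # 0.025
```

So under this assumed baseline, the same 25% relative figure amounts to a 2.5 percentage-point absolute reduction; with a lower baseline risk the absolute benefit would be proportionally smaller.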
I would not trust an LLM to weigh the pros and cons appropriately, knowing their sycophantic tendencies. I suspect they are going to be biased toward agreeing with whatever concerns the user initially expresses to them.