Totally get the point -- in the ER it's a mix of narrative and tests, so hopefully not so many. But in general practice, probably more BS than any of us would like.
Part of the difficulty with healthcare (one I don't have an easy answer to myself) is that the error costs are very asymmetric. Take chest pain: even if a heart attack is low probability, in many scenarios you should be risk averse and still do the due diligence to rule it out.
Not saying smart systems are impossible, just still skeptical that that level of complexity is achievable on the backbone of LLMs. My concern is that the smart-sounding narratives are mostly red herrings with respect to the outcomes we actually care about: accuracy, making the right decisions, etc.