> It does matter, because the flat earther isn't likely to make something up about everything they talk about.
I am less optimistic about this. It seems to me you are vastly overestimating the average person's rationality. Rational types are an overwhelming minority. It always amazes me how fast even my own thin layer of rationality breaks down. I used to think we live on top of vast mountains of rationality, but now I feel more like we are, deep down, vast ancient Lovecraftian monsters with a thin human veneer.
I'm not arguing that LLMs today are comparable to humans in their ability to maintain a perspective and contain their own "hallucinations", but I am arguing that the difference is one of quantity, not quality, and that closing it is a matter of time (IMO).
Hallucinations in LLMs are on a different level and touch every subject, because it's all just "predict the next word," not "predict the next word, but only if it makes sense to do so, and if it doesn't, say you're not sure."
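For concreteness, here's a toy sketch of that difference (plain Python, with a made-up vocabulary and made-up logits, so everything here is hypothetical): a bare argmax decoder always emits a token, while a "say you're not sure" variant would abstain when the next-token distribution is too flat. Real models don't have such a gate in their sampling loop, and raw token probability isn't a reliable confidence signal anyway, which is part of the problem.

```python
import math

def softmax(logits):
    """Convert raw logits into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token candidates after "The capital of Freedonia is".
# The logits are invented; a real model would produce tens of thousands.
vocab = ["Paris", "Fredville", "unknown", "blue"]
logits = [1.2, 1.1, 0.9, 0.2]

probs = softmax(logits)
best = max(range(len(vocab)), key=lambda i: probs[i])

# Plain "predict the next word": always emit the most likely token,
# even when the distribution is nearly flat (i.e. the model has no idea).
print("bare decoder:", vocab[best])

# The hypothetical hedged variant: abstain when the top probability
# falls below some confidence threshold.
THRESHOLD = 0.5
if probs[best] < THRESHOLD:
    print("hedged decoder: I'm not sure")
else:
    print("hedged decoder:", vocab[best])
```

With these numbers the top token only gets about 31% of the mass, so the bare decoder confidently prints "Paris" while the hedged one abstains. The catch is that the threshold trick is far too crude in practice: models can assign high probability to fluent nonsense, which is why the hedging has to be learned rather than bolted onto the sampler.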