If you ask a flat earther where they recommend eating, they’re not going to interweave restaurants that exist with made-up restaurants that merely have plausible-sounding names. And if you ask for the web addresses of those restaurants, the flat earther will say “I don’t know, google it.” They won’t just make up plausible-sounding URLs that don’t actually exist.
Hallucinations in LLMs are on a different level, and they touch every subject matter. Because it’s all just “predict the next word,” not “predict the next word, but only if it makes sense to do so, and if it doesn’t, say you’re not sure.”
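To make the “predict the next word” point concrete, here’s a minimal sketch in Python. The vocabulary, logits, and confidence threshold are all made up for illustration; the point is that standard decoding always emits *some* token, even when the distribution over candidates is nearly flat, whereas a hypothetical “say you’re not sure” gate would have to be bolted on explicitly.

```python
import math

# Toy next-token step: hypothetical logits over a tiny vocabulary.
# (Illustrative numbers only; a real model scores tens of thousands of tokens.)
vocab = ["pizzeria", "trattoria", "bistro", "diner"]
logits = [2.1, 2.0, 1.9, 1.8]  # nearly flat: the model has no strong favorite

# Softmax turns logits into a probability distribution.
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Standard decoding: always emit the most likely token,
# no matter how little the top choice dominates.
best = max(range(len(vocab)), key=lambda i: probs[i])
print(vocab[best], round(probs[best], 2))  # picked anyway, at only ~0.29

# A hypothetical "say you're not sure" gate would abstain
# when no token clears a confidence threshold:
THRESHOLD = 0.5  # assumption: arbitrary cutoff, purely for illustration
answer = vocab[best] if probs[best] >= THRESHOLD else "[not sure]"
print(answer)  # "[not sure]" in this toy case
```

Nothing in the vanilla loop ever takes the “[not sure]” branch; the model confidently prints “pizzeria” at 29% confidence, the same way it confidently prints a restaurant name, or a URL, that doesn’t exist.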