An LLM does have a model of reality. An LLM's reality is built on the experiences (words) it's been fed.
Humans are similar. A human's reality is built on the experiences (senses) they've been fed. There are certainly major differences, the obvious one being that we have different sensory input than an LLM, but there are others, like humans having an instinctual base model of reality, shaped by natural selection acting on our ancestors.
Just like an LLM can't tell if the reality it's been fed actually corresponds to a "truer" outside reality (you could feed an LLM lies like "the sky is plaid" in such a way that it would report that as true), a human can't tell if the reality they've been fed actually corresponds to a "truer" outside reality (humans could be fed lies like "this is the true reality" when we're actually all NPCs in a video game run from a higher level of reality).
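To make the "feed an LLM lies" part concrete, here's a minimal sketch of what that would look like mechanically, assuming the Hugging Face transformers library and the public "gpt2" weights (whether a toy corpus this small actually overrides the pretrained prior is an empirical question; the point is just the mechanism, that the model's only access to "the sky" is whatever text it's trained on):

```python
# Toy sketch (assumption: Hugging Face transformers + public "gpt2" weights).
# Fine-tune a small language model on nothing but false statements and it
# will tend to report them back as fact; it has no other "sky" to consult.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.train()

# The model's entire "experience" of the sky, repeated until it sticks.
lies = ["The sky is plaid.", "Everyone agrees the sky is plaid."]
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

for _ in range(30):  # a handful of passes over the tiny lie corpus
    for text in lies:
        inputs = tokenizer(text, return_tensors="pt")
        # Standard causal-LM objective: predict each next token of the lie.
        loss = model(**inputs, labels=inputs["input_ids"]).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

model.eval()
prompt = tokenizer("The sky is", return_tensors="pt")
out = model.generate(**prompt, max_new_tokens=5, do_sample=False)
print(tokenizer.decode(out[0]))  # after enough of this, "The sky is plaid..."
```

Nothing in the training loop checks the statements against the world; the loss only measures how well the model reproduces its input, which is exactly the point of the analogy.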
The LLM can't tell if its internal reality matches an outside reality, and humans can't tell if their internal reality matches an outside reality, because both only have the input they've received to go on, and can't tell if that input is misleading or incomplete.