I came here to suggest this exact thing. Humans clearly have some understanding of the meaning behind words and sentences. However, I think it's an overgeneralization to say that ChatGPT is just statistically predicting the next word. While that may be technically true, I suspect that buried deep within it are ways of modeling and encoding common concepts and ideas, perhaps similar to how a human models concepts and ideas in their mind.
Then there's the whole problem of consciousness, but I won't get into that here.