I see what you're saying; ChatGPT doesn't have a physical relationship with the world, doesn't have agency (it's essentially paused until given input), doesn't have reward/punishment stimuli, etc.
I do think that a large portion of what seems to be missing here would be trivial to add, relative to the effort of creating ChatGPT in the first place.
Side note: I'm not sure 'semantic relationship' is the right term here. I'm pretty sure it's specific to relationships between linguistic constructs. That wording very much triggered my "Bah, dualism!" response, since I thought you were insinuating some metaphysical bond between the mind and the world. Maybe "meaningful relationship" would serve better?