Once you attach the nearly limitless loads of meaning available to event perception (compare cognitive mapping in neuroscience, where behavior has no meaning, only tasks, and task demands vary so wildly that semantic loads are factors rather than simple numbers), LLMs look like puppets of folk psychology, using tokens predictably in embedding space. These tokens have nothing to do with the reality of knowledge or events. Of course engineers can't grasp this; you've been severely limited to a folk-psychology-infected cog sci as the base your code is developed from, when in reality it's almost totally illusory. CS has no future game in probability; it's now a bureaucracy. The millions or billions of parameters have zero access to problems like these, which sit beyond cog sci. I'll let Kelso zing it:
https://drive.google.com/file/d/1oK0E4siLUv9MFCYuOoG0Jir_65T...