The brain is constantly trying to 'predict' the next sensory input, and that prediction is our awareness, what we would call our 'conscious self'.
It makes a point of calling what we experience as our self a 'controlled hallucination'. 'Hallucination' here means the experience we have as our brain predicts and controls for the sensory input. All inputs come together in one 'hallucination', but it is averaged in a Bayesian way with the actions we are taking at the same time. So Action + Prediction = Self.
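The Bayesian averaging idea can be sketched in a few lines of Python. This is purely an illustrative toy, not anything from the book: it fuses a top-down prediction with a bottom-up sensory signal as a precision-weighted average, which is the standard Bayesian combination of two Gaussian estimates. All the numbers and names here are made up for the example.

```python
# Toy sketch of 'controlled hallucination' as Bayesian fusion:
# perception = precision-weighted average of prediction and sensation.
# Precision = 1 / variance; the more 'trusted' signal dominates.

def fuse(prediction, pred_precision, sensory, sens_precision):
    """Combine a Gaussian prediction with a Gaussian sensory input.

    Returns the posterior mean and posterior precision.
    """
    total = pred_precision + sens_precision
    mean = (pred_precision * prediction + sens_precision * sensory) / total
    return mean, total

# The brain strongly predicts a value of 10; the senses report 14.
mean, precision = fuse(prediction=10.0, pred_precision=4.0,
                       sensory=14.0, sens_precision=1.0)
print(mean)  # 10.8 -- pulled only slightly toward the raw input
```

Because the prediction here has four times the precision of the sensory input, the result stays close to what was predicted; crank up the sensory precision and the 'hallucination' gets corrected toward the world instead.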
It is funny that the word 'hallucinate' has become so common in AI while it is also used for humans, and yet so few people seem to make the connection that the two are actually very similar. Far from being an argument against AI consciousness, it is an argument for how similar they are.