> With that said, of course we don't know whether human-like sentience is a prerequisite for (super)human-level intelligence. But recent advancements in AI tend to diminish the argument that sentience is a side effect of intelligence.
We already know the answer to this, and it is no: it is not a prerequisite, unless "sentience" is itself an emergent property of intelligence. Hutter's mathematical formulation of an optimally intelligent agent (AIXI) is not computable, but approximations of it are. That is to say, super-human intelligence IS just a computable function (since human intelligence is resource-bound and suboptimal), with no extra "sentience" required. The only limiting factor at this point is the computational resources needed to compute that function; with the resources we have now it is still at the "toy" stage: playing noughts and crosses, Pac-Man, etc.
People used to think that "creativity" was required for playing chess... clearly those people had not heard of Minimax.
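To make the point concrete, here is a minimal Minimax sketch for noughts and crosses in Python: a mechanical exhaustive search of the game tree plays perfectly, with no "creativity" involved. (This is purely illustrative; the board encoding and function names are my own, not from any particular library.)

```python
# Minimax for noughts and crosses. The game tree is small enough to
# search exhaustively: perfect play falls out of brute-force recursion.
# Board is a 9-character string ('X', 'O', or ' '), indices 0..8 row-major.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if that player has three in a line, else None."""
    for a, b, c in WIN_LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, move) with `player` to move.

    Score is +1 for an X win, -1 for an O win, 0 for a draw;
    X maximises, O minimises."""
    w = winner(board)
    if w == 'X':
        return 1, None
    if w == 'O':
        return -1, None
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0, None  # board full: draw
    best = None
    for m in moves:
        child = board[:m] + player + board[m + 1:]
        score, _ = minimax(child, 'O' if player == 'X' else 'X')
        if best is None or (player == 'X' and score > best[0]) \
                        or (player == 'O' and score < best[0]):
            best = (score, m)
    return best

# With perfect play from the empty board, the game is a draw:
score, move = minimax(' ' * 9, 'X')
print(score)  # 0
```

No heuristics, no evaluation function beyond win/lose/draw at the leaves, and yet the program never loses, which is exactly why the "chess requires creativity" argument collapsed once search was understood.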