The amount of parallelism in the human brain is enormous. Not just each neuron but each synapse has computational capacity -- roughly 10^14 (100 trillion) processing units, running on about 20 watts.
And that doesn't even touch bandwidth. Moving sensory input in and out of the brain, plus carrying the processing signals between all those neurons, takes at least another petabit per second. On bandwidth capacity alone we are 25+ years away (assuming the last 25 years of growth continue). And a human comes with 18 years of training at that massive bandwidth and computational power.
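To make the "25+ years on bandwidth alone" estimate concrete, here is a back-of-envelope sketch. All three input numbers are illustrative assumptions I am supplying (a ~1 petabit/s brain-internal signaling rate as the post states, a ~1 terabit/s machine aggregate today, and a doubling period of about 2.5 years); only the arithmetic is fixed.

```python
import math

# Illustrative assumptions, not measurements:
brain_bandwidth_bps = 1e15     # ~1 petabit/s internal signaling (per the post)
machine_bandwidth_bps = 1e12   # assumed ~1 terabit/s aggregate today
doubling_period_years = 2.5    # assumed pace if historical growth continues

gap = brain_bandwidth_bps / machine_bandwidth_bps        # ~1000x shortfall
years_to_close = math.log2(gap) * doubling_period_years  # doublings needed * period

print(f"Gap: {gap:,.0f}x -> roughly {years_to_close:.0f} years to close")
```

With these assumed inputs, closing a 1000x gap takes about ten doublings, which at 2.5 years per doubling lands right around the 25-year figure in the post.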
Also, we have no idea what a general intelligence algorithm looks like. We are just now getting multimodal LLMs.
From the computational/bandwidth perspective, we are still ~30 years from a computer being able to process the information a single human brain does -- and even then it would consume 29+ megawatts of energy. If you had to feed a worker 29 megawatts' worth of power, no business would be profitable. A human wouldn't even survive it.
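The efficiency gap above is worth spelling out. Using the post's own two figures (20 watts for the brain, 29 megawatts for the hypothetical brain-equivalent machine), the ratio works out to:

```python
# Both figures are the post's own estimates, not measurements.
brain_watts = 20          # human brain power draw
machine_watts = 29e6      # hypothetical machine matching brain throughput

ratio = machine_watts / brain_watts
print(f"The machine would need ~{ratio:,.0f}x the brain's power budget")
```

That is roughly a 1.45-million-fold efficiency gap, which is the core of the profitability argument.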
Sorry, but the notion that we are close to AGI because we have good word predictors is fantasy. That said, we will see some amazing natural-language human-computer interface improvements over the next 10 years!