Looking at LLMs and thinking they will lead to AGI is like looking at a guy wearing a chicken suit and making clucking noises and thinking you’re witnessing the invention of the airplane.
It's more like looking at gridded paper and thinking that defining a few rules for when a square turns black or white could give rise to complex structures that move and reproduce on their own.
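That second analogy is presumably Conway's Game of Life; assuming that's the reference, here is a minimal sketch of just how small the rule set is that still produces self-propagating structures like the glider:

```python
def step(grid):
    """Apply one generation of Conway's Game of Life to a 2D grid of 0s and 1s."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count live neighbours among the surrounding 8 cells (edges stay dead).
            live = sum(
                grid[r + dr][c + dc]
                for dr in (-1, 0, 1)
                for dc in (-1, 0, 1)
                if (dr or dc) and 0 <= r + dr < rows and 0 <= c + dc < cols
            )
            # Rule 1: a live cell survives with exactly 2 or 3 live neighbours.
            # Rule 2: a dead cell comes alive with exactly 3 live neighbours.
            nxt[r][c] = 1 if (live == 3 or (grid[r][c] and live == 2)) else 0
    return nxt


if __name__ == "__main__":
    # A 10x10 grid seeded with a glider: it keeps moving diagonally on its own.
    grid = [[0] * 10 for _ in range(10)]
    for r, c in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:
        grid[r][c] = 1
    for gen in range(4):
        print(f"generation {gen}")
        print("\n".join("".join("#" if v else "." for v in row) for row in grid))
        print()
        grid = step(grid)
```

Two rules, a grid of black-or-white squares, and you get gliders that travel forever and patterns that copy themselves.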