I think that as more and more people offload their thinking onto LLMs, we are going to hit a plateau. Innovation will stall, and maybe even stop, because LLMs need a constant stream of new input to improve, and we will no longer be producing humans who create the high-quality work that LLMs need as high-quality training input.
Do you think continued growth is more or less likely than the scenario I outline above?