Ask HN: What's going to happen when they fail to deliver AGI in 5 years time?
Why is every CEO predicting AGI in the next 2 to 5 years? I get the need to market yourself to the world, but this seems a bit much in terms of embellishments.
Don't get me wrong. I fully believe in the potential of current-gen AI. I am myself employed in the field. But to me it seems pretty obvious that these models are, for the most part, just memorizing and interpolating at huge scale, with limited generalizability. I just don't see how we get from current-gen models to AGI without a huge paradigm shift. Yet the statements from these CEOs imply that no such shift is necessary.
I just don't get what sort of strategy they are following. Wouldn't these embellishments eventually come back to haunt them when their promises are not realized? This seems like Tesla FSD all over again. Then too it was claimed that only scale was necessary to achieve the objective.