There's also plenty of argument to be made that it's already here. AI can hold forth on pretty much any topic, and it's occasionally even correct. Of course to many (not saying you), the only acceptable bar is perfect factual accuracy, a deep understanding of meanings, and probably even a soul. Which keeps breathing life into the old joke "AI is whatever computers still can't do".
I think the main problem with AGI as a goal (other than that I don't think it's possible with current hardware; maybe it's possible with hypothetical optical transistors) is that I'm not sure AGI would be more useful. AGI would argue with you more. People are not tools for you; they are tools for themselves. LLMs are tools for you. They're just very imperfect because they are extremely stupid. They're a method of forcing a body of training material to conform to your description.
But to add to the general topic: I see a lot of user interfaces to creative tools being replaced not too long from now by realtime stream-of-consciousness babbling by creatives. Give those creatives a clicker with a green button for happy and a red button for sad, and you might be able to train an LLM to be an excellent assistant and crew on any mushy project.
How many people are creative, though, as compared to people who passively consume? It all goes back to the online ratio of forum posters to forum readers. People who post probably think 3/5 people post, when it's probably more like 1/25 or 1/100, and the vast majority of posts are bad, lazy and hated. Poasting is free.
Are there enough posters to soak up all that compute? How many people can really make a movie, even given a no-limit credit card? Have you noticed that there are a lot of Z-grade movies that are horrible, make no money, and have budgets higher than really magnificent films, budgets that in this day and age give them access to technology that stretches those dollars farther than they ever could, say, 50 years ago? Is there a glut of unsung screenwriters?
The first iteration produced decent code, but there was an issue: some street numbers had alpha characters in them, and it didn't treat those as street numbers. So I asked it to adjust the logic so that the first word is considered a valid street number whether it's alphabetic or numeric. It updated the code and gave me both the sample code and sample output.
Sample output was correct, but the code wasn't producing correct output.
It spent more than 5 minutes on each of the iterations (significantly less than a normal developer would, but a normal developer would not come back with broken code).
I can't rely on this kind of behavior, and this was a completely greenfield task with straightforward input and straightforward output. This is not AGI in my book.
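For what it's worth, the relaxed rule described above (accept the first word of an address as a street number whether it's alphabetic, numeric, or mixed) is a few lines of Python. This is my own minimal sketch under that assumption; the function name and exact acceptance rule are illustrative, not the actual code from the exchange:

```python
import re

def split_street_number(address: str):
    """Split an address into (street_number, rest).

    Treats the first whitespace-delimited token as the street number
    whether it is purely numeric ("123"), alphanumeric ("123A"),
    or purely alphabetic -- the relaxed rule described above.
    """
    parts = address.strip().split(maxsplit=1)
    if not parts:
        return None, ""
    first = parts[0]
    rest = parts[1] if len(parts) > 1 else ""
    # Accept any token made of letters and/or digits as a street number.
    if re.fullmatch(r"[A-Za-z0-9]+", first):
        return first, rest
    return None, address.strip()

print(split_street_number("123A Main St"))  # ('123A', 'Main St')
print(split_street_number("42 Elm Ave"))    # ('42', 'Elm Ave')
```

The point of the anecdote stands either way: the fix is small enough that handing back code whose output doesn't match its own claimed sample output is hard to excuse.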
You consider occasionally being correct AGI?
Given you start with that I would say yes the /s is needed.
A 4-year-old isn't statistically predicting the next word to say; its intelligence is very different from an LLM's. Calling an LLM "intelligent" seems more marketing than fact.
I don't think I'll ever stop finding this funny.
It was written by Opus 4 too.