Then everyone can order groceries and plan trips by voice conversation with a smart assistant that actually knows us.
And it will become productized, with tiers and specialties, but AGI is not going to be any more of an "OMG my life is changed forever" watershed than, say, the iPhone.
Don't. Believe. The. Hype.
That's my take. What do you think?
However, don't overlook that the iPhone did change things forever. The difference in lifestyle between 2000 and 2020 is greater than between 1950 and 1990. Your life might not be different, but it makes a difference to the factory worker who had no savings but is now an Uber driver who can freely work 16 hours one day and sleep the next.
We'll likely move a "class" upwards, in that humans will no longer be doing brute labor and driving trucks, but will still be scheduling truck routes and telling the AI what to do. Humans will probably be more involved in educating and training, maybe even disciplining, stray AI.
I also don't believe we can achieve proper AGI without inventing some kind of emotional state. Maybe AI will develop crushes on their trainers, not so much for sex, but out of gratitude and for survival. Or there could be anger, ambition, envy, which might simply stem from the desire to learn and experiment. Maybe AI will become sweet and manipulative as it predicts patterns in how humans respond. There will also likely be some equivalent of dopamine, and AI could be addicted to staying above 80% charge and so on.
AGI probably won’t get really interesting until 50-100-150 years from now. Ditto for stuff like space travel, self-driving cars or social media.
I’d also question your assertion that the iPhone wasn’t a revolutionary change; we simply haven’t understood the full implications of mobile computing and mobile access to a global network. Remember that the Internet has only been around in a consumer form for roughly 20-25 years. This is an incredibly short period of time, compared to say, the printing press or the steam engine, and we’ve already seen society completely transformed.
The next logical argument is that technology doesn't stand still. If we start with a human-level AI, we can gain increased runtime simply by throwing more compute at the problem. There is no particular reason we have to run it in real time; why not 10x, or 100x, or 1000x?
What would you produce in your field if we gave you a century or a millennium between now and next year?
This leaves the last and perhaps obvious point that an AI that can improve itself can use these centuries of run time to work on such improvements.
If you have human-level AI tomorrow, you don't have merely human-level AI for long, and once something surpasses you, you are poorly positioned to predict or control it. This is not to say it would inherently be malicious, but does your pet cat get much of a vote in the running of your day-to-day life?
The idea is that we develop AGI as a program we can run on a regular device, and then make it effectively 10x smarter by running it on 10 devices, or 10x as fast. I think it's more likely that AGI will first be achieved on the biggest teraflop supercomputers we have, and that for a long time it will be the app that takes enormous resources to run. And the first AGI probably won't be quite as smart as a human, but we'll have no other reference point for what it is as smart as, so we'll call it that.
Also, I think there will be some sort of non-linearity effects that mean you can't just "scale up" intelligence by adding more processors. It will work to a point, but then the curve flattens. Consider that the global total IQ is already approximately 800 billion (8 billion people times an average IQ of 100), but our planet is still pretty dumb. I mean this to also apply to scaling a "single" AGI up in speed. Linear speed gains will have diminishing returns, I think.
Also, I'm speaking about the productization and allocation of it. It will not be a "come one, come all," "gather round," everyone-can-partake sort of thing. It will be a product, like night vision or GPS, and the secret government and military uses will get the best quality, while the rest of us get smarter shopping.
Further, if it really is linearly scalable, then it certainly will be controlled. In that case it will be more controlled than enriched uranium, and even if it's not so scalable, it is still going to be very controlled if it is at all transformative.
I think the various technological, political and commercial realities will distinctly flatten/soften/smooth the predicted "singularity" discontinuity blast wave into a humdrum speed bump that appears to most of humanity as a better iPhone (basically).
This is pure speculation. We shall see.
In regards to scaling, why can't we use more compute to run a single AI faster and faster instead of simulating more AIs, especially if it's running on a supercomputer with high-bandwidth communication between nodes?
If between point A and point B you have 100 times the compute, why not effectively simulate a century of thought for your single AI instead of simulating 100 AIs?
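The arithmetic behind this trade-off is simple, and worth making explicit. A minimal sketch, assuming (and it is a big assumption, given the non-linearity point raised elsewhere in this thread) that speedup scales linearly with compute:

```python
# Back-of-the-envelope: subjective thinking time for one fast AI
# vs. many parallel AIs, under an assumed linear compute-to-speed mapping.

def subjective_years(wall_clock_years: float, speedup: float) -> float:
    """Years of 'thought' experienced by a single AI run at `speedup`x real time."""
    return wall_clock_years * speedup

# Spending 100x compute on one AI instead of 100 parallel copies:
one_fast = subjective_years(1, 100)   # 1 wall-clock year -> 100 subjective years
many_slow = 100 * subjective_years(1, 1)  # 100 AIs x 1 subjective year each

print(one_fast)    # 100
print(many_slow)   # 100
```

The total subjective years are the same either way; the difference is that the single fast AI gets them serially, as one continuous century of thought, which is what matters for the "what would you produce given a century?" argument.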
That's sort of like saying a Dyson sphere will make nuclear power obsolete.
AGI is not the same as human intelligence. It has the generality of human intelligence, but since it isn't restricted by biology it can scale up much more easily and can achieve superhuman performance in pretty much any individual task, group of tasks, or entire scientific or technological field. That's pretty exciting.
</wild speculation>
<reality>
It's questionable whether the above is possible at all. In all likelihood none of us will see anything even remotely close to this in our lifetimes. We're currently so far away from it that we don't even know how to get started on solving such a problem. Nobody is currently working on this, despite how they're advertising their work.
</reality>
I guess what I'm saying isn't that AGI will be underwhelming, it's that it won't exist at all, at least as far as we are concerned.
But a lot of people dismiss it as just a smartphone.
I guess it depends on your perspective.
Saying AGI will be like the iPhone, and therefore underwhelming, is not to demean the iPhone, but to put in correct perspective the overinflated (I think) sense of self-importance and impact of so-called AGI.
It's not underwhelming wrt an average week in TechCrunch, but it's underwhelming wrt the mythos and delusion that surround AGI.
Underwhelming does depend on perspective. But my perspective is not to demean the iPhone, just to bring AGI big heads back down to earth.