I personally think that the really hard part of creating an AGI is going to be training. As an adult human, one can look at an object and instantly know what its texture will feel like on one's fingers or even lips, how it will bounce, and whether it will shatter. How did we gain that knowledge? As babies we put absolutely everything we saw into our mouths and sucked or chewed on it. As kids we played for hours on end with all kinds of toys and household objects. Coming up with a way of training an AGI to be human-level will be hard, because the scope of "common sense" knowledge is so vast that it will be exceedingly difficult to implant into an AGI unless the AGI can interact with the real world.
On the other hand, once that body of knowledge becomes an available training data set, evolution can take off at speeds otherwise impossible in the real world.
A headline is not the place for acronyms unless it's very certain the target audience is innately familiar with them. Guess I'm not that audience.
Especially when the headline poses a counterfactual/impossible question, it should help the reader as much as possible.
How could you forget "Industry Standard Architecture", the bane of my youth?
I once had an old 266 MHz Pentium with an ISA network card that was just an absolute nightmare. This was when PCI was dominant and AGP graphics cards had not yet become mainstream, and I guess the ISA drivers just weren't being maintained for Windows 98.
I had to go into Control Panel and disable/re-enable the Ethernet adapter about three times an hour, because the connection would simply drop, and that was the only fix I was savvy enough to attempt.
Relatedly, I'm working on natural language understanding, which I believe is key to AGI. https://lxagi.com