I find that hard to believe. Ever watched Terminator?
But even if that's true, the science-fictional plot is so pervasive that it would be easy to pick up from the millions of people who share the software engineer's blurry line between fantasy and reality.
> It's just logical really.
OK, then. You're a GI, go off and build an army of better yous and take over the world.
> I find that hard to believe. Ever watched Terminator?
Terminator has fuck all to do with recursive self-improvement. Don't confuse people who grew up on sci-fi with people who casually went to see Terminator or some other pop-culture artifact featuring some kind of "AI".
> OK, then. You're a GI, go off and build an army of better yous and take over the world.
What do you think the drama with eugenics, genetic engineering and designer babies is about? It's literally humans trying to make better humans in the only way that is available - reproduction.
AI made in silico would be more malleable, easier and cheaper to replicate. Self-improving software isn't even a fantasy; it exists in many forms - though it's far from open-ended the way a self-improving GI would be.
Judging reality by how it appears is a bad strategy; this should be common knowledge by now.
What's concerning to me is that I suspect LLMs will be able to learn and remember thousands of basic facts like this, and ~reason on top of them. Perhaps they won't figure this out on their own, but what if all it takes is one individual to point them in this direction? I bet there are numerous people who know much more about this than I do working for our various three-letter agencies.
>> I find that hard to believe. Ever watched Terminator?
> Terminator has fuck all to do with recursive self-improvement. Don't confuse people who grew up on sci-fi with people who casually went to see Terminator or some other pop-culture artifact featuring some kind of "AI".
You're not following the thread. The future timeline in Terminator does involve something like an AI making "a billion more robots [to] take over the world." The popularity of that and similar sci-fi makes the claim that someone has never encountered it hard to believe.
> What do you think the drama with eugenics, genetic engineering and designer babies is about?
So how has that been going? Those things should also probably be labeled "science fiction."
> AI made in silico would be more malleable, easier and cheaper to replicate. Self-improving software isn't even a fantasy; it exists in many forms - though it's far from open-ended the way a self-improving GI would be.
Fantasies based on squishy assumptions. How do you know it would have an easier job optimizing itself than humans have? How do you know there isn't some fundamental contradiction in the concept of "superintelligence" that these fantasies are based on? Or even just some practical resource limits that make the fantasy impossible?
Actually, thinking about it, I wouldn't rule out Musk/Tesla going for the world-takeover thing ;)
Why the fuck not? You literally have all the code to manufacture a person.