> I think that part is a leap. I don't think it's a given that a super intelligent AI will "want" things.
But if it has no goal then it can't act rationally or intelligently. Something like an LLM might not appear to "want" anything, but it "wants" to predict the next token correctly, which is still a goal (though since that goal only concerns its internal state, it might be a little safer).
There's another good video about why this would be the case, if you're interested: https://youtu.be/8AvIErXFoH8
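To make the "wants to predict the next token" part concrete, here's a minimal, purely illustrative sketch (PyTorch, with made-up sizes and a toy "model") of the only objective a base LLM is trained on: assign high probability to the next token, and nothing else.

```python
import torch
import torch.nn.functional as F

# Toy "LLM": given the previous token's embedding, predict a distribution
# over the vocabulary for the next token. Sizes are invented for illustration.
vocab_size, hidden = 100, 16
embed = torch.nn.Embedding(vocab_size, hidden)
head = torch.nn.Linear(hidden, vocab_size)

tokens = torch.randint(0, vocab_size, (1, 8))          # a fake token sequence
logits = head(embed(tokens[:, :-1]))                    # predict token t+1 from token t
loss = F.cross_entropy(logits.reshape(-1, vocab_size),
                       tokens[:, 1:].reshape(-1))       # "wanting" = making this number small

loss.backward()  # every parameter update exists only to serve this one objective
```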
> This feels like we're projecting aspects of humanity that evolution specifically selected for in our species with something that is coming about through a completely different process.
That's because evolution is a process that optimises for a goal. The only reason altruism exists is that it indirectly benefits that goal, which is for our genes to survive and be passed on; fellow humans tend to share our genes, especially relatives (whom we tend to be kinder to). AI training is also a process that optimises for a goal, but unless having humans around helps that goal, it wouldn't display any human empathy. In this case "selfishness" is just efficiency, which a training process definitely selects for.
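As a rough illustration of that last point (toy code, not anyone's actual training setup): gradient descent only moves parameters in directions that improve the objective, so anything not in the objective, like a hypothetical "be nice to humans" term weighted at zero, never influences the result.

```python
import torch

params = torch.randn(3, requires_grad=True)

def task_loss(p):    # what training actually measures and rewards
    return (p ** 2).sum()

def niceness(p):     # a hypothetical "human empathy" term...
    return torch.sigmoid(p).mean()

opt = torch.optim.SGD([params], lr=0.1)
for _ in range(100):
    opt.zero_grad()
    loss = task_loss(params) + 0.0 * niceness(params)   # ...weighted at zero
    loss.backward()
    opt.step()

# params end up wherever task_loss is lowest; niceness never mattered,
# because the optimiser only ever "sees" what the objective measures.
```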
> I agree, but I feel like that's what these concerns about AI are doing, because that's what people do.
I feel like they're doing a pretty good job of modelling AI as a theoretical agent, which does share some similarities with humans because humans are also agents. The main mistake people make is assuming its goals will be similar to ours, as if human values were somehow a universal truth.
> It also seems to me there is a huge gap between a super intelligent AI and the ability to have a perfect model of reality along with the ability to evaluate within that model the effect of every possible sequence of packets sent out to the internet.
That's very true, it's an unrealistic thought experiment, but it's a good introduction to the idea that something significantly more intelligent than us can be dangerous and pursue a goal with no regard for what we actually wanted.