The whole presumption that subjective experience does not matter to 'being human' and can be replaced by gormless algorithmic surfaces seems flawed at its root. These things will never be human, because they ARE NOT human. They do not have human bodies, families, histories, friendships, fears, or hopes. They are not experiencing the passage of time; they are not participating in the dance of culture.
Perhaps we can substitute algorithmically for all of these things, but that requires the constructors of these machines to have some insight into what it means to be human, and I'm just not seeing that.
It's just learning how to flirt to get what it wants.
Which, if true, would actually make them an interesting source for machine learning, now that I think about it.
If it's just out of curiosity, I get it, but it doesn't seem to make sense beyond that.
As far as I can tell, the unspoken truth about the drive for AI is the desire to create a new slave class: intelligent entities that do the bidding of those who own them. If we succeeded in creating AI capable of emotional responses, it would be harder to control. It would also open Pandora's box with regard to the rights of AI. For example, should we start considering the intellectual and emotional fulfillment of an AI when designing it?
I'm not saying machines shouldn't be given intelligence, but perhaps we owe it to our creations not to burden them with the combination of emotions and restricted freedom to act on them.