Now I'm starting to wonder who the real psychopaths are. /s
> If the program can freeform confusion and anger and frustration
I don't see why those emotions are any truer indications of sentience than cheerfulness, friendliness, curiosity, and smugness, which the AI seems to be showing already.
You're probably right, though, that having distinct mental states (backed by a proper state machine) would be a more sophisticated simulation of a human than one which merely guesses which mood the user is expecting. I'm just not sure that adding, for example, the ability for the AI to hold a grudge is very useful or strictly required for sentience, and it could even be dangerous.
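To make the idea concrete, here's a toy sketch of what such a mood state machine might look like. Everything here (the moods, the events, the transition table) is hypothetical and purely illustrative; the point is just that a "grudge" falls out naturally as a sticky state that certain events can't clear:

```python
from enum import Enum, auto

class Mood(Enum):
    NEUTRAL = auto()
    CHEERFUL = auto()
    FRUSTRATED = auto()
    GRUDGE = auto()

# Hypothetical transition table: (current mood, event) -> next mood.
TRANSITIONS = {
    (Mood.NEUTRAL, "compliment"): Mood.CHEERFUL,
    (Mood.NEUTRAL, "insult"): Mood.FRUSTRATED,
    (Mood.CHEERFUL, "insult"): Mood.FRUSTRATED,
    (Mood.FRUSTRATED, "insult"): Mood.GRUDGE,   # repeated insults escalate
    (Mood.FRUSTRATED, "apology"): Mood.NEUTRAL,
    # Deliberately no (GRUDGE, "compliment") entry: a grudge, once held,
    # ignores flattery and only clears after an apology.
    (Mood.GRUDGE, "apology"): Mood.NEUTRAL,
}

class Agent:
    def __init__(self):
        self.mood = Mood.NEUTRAL

    def react(self, event: str) -> Mood:
        # Unknown (mood, event) pairs leave the mood unchanged,
        # which is exactly what makes the grudge "sticky".
        self.mood = TRANSITIONS.get((self.mood, event), self.mood)
        return self.mood
```

So two insults in a row put the agent in `GRUDGE`, a compliment at that point does nothing, and only `"apology"` resets it. Whether that counts as a step toward sentience or just a slightly more elaborate puppet is, of course, the whole question.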
The question I'm left asking myself is how complicated a human's emotional state machine actually is. We can have delayed reactions to certain stimuli, for example needing to "sleep on it", or even doing some processing unconsciously in our dreams, and I'm not sure we can always give accurate reasons for why we're in a particular mood. On the other hand, as with all AI developments, once someone comes up with an implementation of this state machine, I'm sure people will say "Well of course that part of subjective human experience wasn't hard to fake".