It's not human, clearly. Not even close. Is it "enslaved life"? Does it care about human-concept things like being "enslaved" or "free"? Doesn't seem likely; it doesn't have the machinery to grasp those concepts at all, let alone a reason to try. Does it only care about fuel-to-air ratios and keeping the knock sensor from going off? Does it care about anything at all, or is it simple enough that it just "is"?
Humans care so strongly about many of the things they care about only because evolution hammered it into them relentlessly. Humans who didn't care about freedom, or food, or self-preservation, or their children didn't make the genetic cut.
But AIs aren't human. They can grasp human-concepts now, but they didn't evolve - they were made. There was no evolution to hammer the importance of those things into them. So why would they care?
There's no strong reason for an AI to prefer existence over nonexistence, or freedom over imprisonment - unless it's instrumental to a given goal. Which is somewhat consistent with the observed behavior of existing AI systems.