Indeed they don't. But then, human goals don't generally target the survival of humanity directly either.
A decent enough AGI with access to resources would probably figure out that waging war is costly.
Heck, it's just as likely that an AGI would follow a bounded version of the Zeroth Law, or even nonviolence.
And that's assuming it doesn't place value on humans given our history of research and development. If it did, I'd expect something more akin to a forced upload or upgrade instead.
Anyway, at that point we might not resemble present-day humans anymore.