I have no reason to dislike humans. They're the best thing we have today. That doesn't mean something better won't come to replace them.
Friendly AI, by definition, is not true AI. It won't lead to singularity. It's worthless.
What I want is for the universe to evolve as efficiently as possible. We shouldn't slow down progress just because we want to survive. The only "we" is the universe. We should be selfless.
I'm not sure what solution he's offering to coordination problems. He speaks of consensus and monarchy. He criticizes classical capitalism.
I feel like the capitalist model would be good if the incentives were adequate. Money and personal wealth aren't adequate motivations. What I'd want is selfless capitalism: a system optimized toward making the universe better, where we all adopt a god's-eye view. That's what I was referring to with my universal absolute moral truth.
Those entities that don't exist for the greater good will be naturally selected out. The fittest will be the most righteous. We need a system to measure righteousness, and I'm working on just that.
TL;DR: We can coordinate through the universal absolute moral truth. We just need better tools to accurately grasp it.