This is a weirdly aggressive reply. I don't "read Google's blog posts"; I use TPUs daily. As for MLPerf benchmarks, you can see for yourself here:
https://mlperf.org/training-results-0-6 TPUs are far ahead of competitors. All of these training results are openly available, and you can run them yourself. (I did.)
For MLPerf 0.7, it's true that Google's software isn't available to the public yet. That's because they're in the middle of transitioning to JAX (and, through the same XLA compiler, PyTorch via PyTorch/XLA). Once that transition is complete and available to the public, you'll probably be learning TPU programming one way or another, since there's no other practical way for most people to, say, train a GAN on millions of photos.
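For what it's worth, "learning TPU programming" via JAX mostly looks like ordinary NumPy-style code. Here's a minimal sketch (the function and variable names are just my own illustration): the same program compiles through XLA to whatever backend is attached, whether TPU, GPU, or CPU.

```python
import jax
import jax.numpy as jnp

# jax.devices() lists the attached accelerators: TPU cores on a TPU VM,
# otherwise GPU or CPU. No code changes needed between backends.
print(jax.devices())

# jit-compile a function; XLA generates code for the attached backend.
@jax.jit
def predict(w, x):
    return jnp.tanh(x @ w)

x = jnp.ones((8, 4))
w = jnp.ones((4, 2))
print(predict(w, x).shape)  # (8, 2)
```

That's the whole pitch: you don't write TPU-specific kernels, you write array code and let XLA target the hardware.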
You'd think people would be happy that there are realistic alternatives to Nvidia's near-monopoly on AI training hardware, rather than rushing to defend it...