I don't expect renting computers in the cloud to get more expensive. Do you? The end of Moore's law means slower improvements, not that things will actually get worse. If anything, it means machines in data centers need to be replaced less often, so their cost gets amortized over more hours, which keeps per-hour prices down.
For a scalable application, better performance basically means lower expenses. If your cloud computing bills aren't high to begin with, the rewrite may not be worth it. Of course, there may be new companies or new projects that can put TPUs to use for machine learning and the like.
But Moore's law is not the only way to scale better. Today's machine learning algorithms are ridiculously inefficient, and that seems unlikely to remain true forever given the amount of research being done. A series of algorithmic improvements might result in a 10-100x reduction in cost, or perhaps remove the need for TPUs altogether.
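As a rough back-of-envelope sketch (every number below is made up purely for illustration), the point is that for a scalable workload the bill falls roughly in proportion to any algorithmic speedup:

    # Back-of-envelope: how algorithmic efficiency translates into a cloud bill.
    # All inputs are hypothetical; the takeaway is that for a horizontally
    # scaled workload, cost scales roughly linearly with compute per request.

    def monthly_bill(requests_per_month, compute_hours_per_request, price_per_hour):
        """Estimated monthly bill for a scalable workload."""
        return requests_per_month * compute_hours_per_request * price_per_hour

    baseline = monthly_bill(
        requests_per_month=50_000_000,
        compute_hours_per_request=0.0002,  # ~0.72 s of accelerator time per request (assumed)
        price_per_hour=2.50,               # assumed accelerator price in $/hour
    )

    # A 10x algorithmic improvement cuts compute per request by 10x,
    # and the bill falls by the same factor.
    improved = monthly_bill(
        requests_per_month=50_000_000,
        compute_hours_per_request=0.0002 / 10,
        price_per_hour=2.50,
    )

    print(f"baseline: ${baseline:,.0f}/month, after 10x speedup: ${improved:,.0f}/month")

With these invented inputs, a 10x speedup takes the bill from $25,000/month to $2,500/month, which is why algorithmic gains can matter as much as cheaper hardware.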
Who knows what the future will bring, but making straight-line predictions in a fast-moving field like machine learning seems unlikely to work out.