FWIW you can get great redundancy with far less than a 2x storage factor. e.g. Facebook uses a 10:14 erasure coding scheme[1] (10 data blocks plus 4 parity), so they can lose up to 4 disks without losing data, and that only incurs a 1.4x storage factor. If the data is cold enough, one can go wider still, e.g. a 50:55 scheme gets the factor down to 1.1x.
Not that this fundamentally changes your analysis and other totally valid points, but the 2x bit can probably be reduced a lot.
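The arithmetic behind those factors is simple enough to sketch. Here's an illustrative snippet (the function names are mine, not any real library's API) for a k:n scheme, i.e. k data blocks expanded to n total blocks:

```python
def storage_factor(data_blocks: int, total_blocks: int) -> float:
    """Raw bytes stored per byte of user data."""
    return total_blocks / data_blocks

def max_disk_losses(data_blocks: int, total_blocks: int) -> int:
    """An MDS code (e.g. Reed-Solomon) survives the loss of any
    n - k blocks, since any k of the n suffice to reconstruct."""
    return total_blocks - data_blocks

# Facebook's 10:14 scheme: 1.4x storage, tolerates 4 lost disks.
print(storage_factor(10, 14), max_disk_losses(10, 14))

# A wider 50:55 scheme for colder data: 1.1x, tolerates 5 losses.
print(storage_factor(50, 55), max_disk_losses(50, 55))
```

The trade-off is that wider schemes touch more disks on reconstruction, which is why they suit cold data.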
[1] https://engineering.fb.com/2015/05/04/core-data/under-the-ho...