First, there exist software environments where errors cost significantly more than a hardware run. Obviously those environments involve hardware as well, but the cost of a runtime error is clearly not the only thing that matters here.
Second, my only point was that the example given was a piss-poor illustration of the difference between hardware and software. Obviously a bad example doesn't disprove the claim it's meant to support.