You write: 'Putting a zero into an int certainly does not signify "no value". It signifies a value, and that value is zero.'
Again, the way you write this makes it sound as if this has always been the case and is the only way it could be, when in fact the idea of zero being "just another value" is a fairly recent one. And the way our machines handle scalar values (which is where what you write is largely true) is even more recent, and more of a special case.
'What number is greater than three but less than one? Nothing. There is no such number. "0" is not a correct answer.'
'Nothing' is also not the correct answer. What you have is a contradiction, which is not nothing. Or you could talk about the possible results as sets, in which case you have the empty set, which is also not nothing. It does have the cardinality 0, though.
"The empty set is not the same thing as nothing; rather, it is a set with nothing inside it and a set is always something. "
Of course, the empty set can be used to signify "nothing", just as the number zero can.
"The number zero is sometimes used to denote nothing. The empty set contains no elements."
Again, not arguing that you can't have contexts in which "nothing" and "zero" are distinct. Just pointing out that those contexts are hardly universal enough to justify a statement saying that nothing really isn't zero.
Things are a bit more complicated and less clear-cut than that.
Which is why I always liked the somewhat loose, but (for me!) intuitively workable, way C and Objective-C handle NULL, 0, nil, false, etc.