I never realized it was controversial. I think I've always included 0 in the natural numbers since learning to count.
But some programming books I've read (The Little Typer, I want to say, or something similar) say "natural number" or "zero", which actually confuses me.
Just like negative numbers, it's a higher-level abstraction or a model, not a direct observation from nature.
Likewise, the digit "0", originating from the Hindu-Arabic numeral system[1], is merely a notation, not a number.
---
1. https://en.wikipedia.org/wiki/Hindu%E2%80%93Arabic_numeral_s...
From one point of view, zero never appearing in nature is exactly an example of it appearing in nature!
From another point of view, do you not think a prairie dog has ever asked another prairie dog, "how many foxes are out there now?" with the other looking and replying "None! All clear!"? Crows can count to at least 5, and will count down until there are zero humans in a silo before returning to it. Zero influences animal behavior!
From a third point of view, humans are natural, so everything we do appears in nature.
From a fourth point of view, all models are wrong, but some models are useful. Is it more useful to put zero in the natural numbers or not? That is: if we exclude zero from the natural numbers, do we just force 90% of occurrences of the term to be "non-negative integers" instead?
type PrairieDogFoxCount = NoFoxesAllClear | SomeFoxes 1..5 | TooManyFoxes
type CrowCount = Some 1..5 | UpsideDown 5..1
type HumanProgrammerCount = 0..MAXINT
type HumanMathematicianCount = 0..∞
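The informal types above could be sketched as runnable TypeScript discriminated unions (the names and ranges are taken from the pseudocode; the encoding itself is just one illustrative choice):

```typescript
// Hypothetical encoding of the prairie-dog count: "All clear!" is its own
// case, deliberately distinct from the number 0.
type PrairieDogFoxCount =
  | { kind: "noFoxesAllClear" }
  | { kind: "someFoxes"; n: 1 | 2 | 3 | 4 | 5 }
  | { kind: "tooManyFoxes" };

// A human programmer's count is just a non-negative integer.
type HumanProgrammerCount = number; // invariant: Number.isInteger(n) && n >= 0

function describe(c: PrairieDogFoxCount): string {
  switch (c.kind) {
    case "noFoxesAllClear": return "All clear!";
    case "someFoxes": return `${c.n} foxes`;
    case "tooManyFoxes": return "too many foxes";
  }
}

console.log(describe({ kind: "noFoxesAllClear" })); // "All clear!"
console.log(describe({ kind: "someFoxes", n: 3 })); // "3 foxes"
```

Note that the union has no numeric zero at all: "no foxes" is a tag, not a quantity, which is the distinction the next comment makes.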
My point is: "No Foxes - All Clear" is not the same thing (the same level of abstraction) as 0.
> From a third point of view, humans are natural, so everything we do appears in nature.
Using this definition everything is natural, including for example complex numbers, which is obviously incorrect, and thus invalidates your argument.
> From a fourth point of view, all models are wrong, but some models are useful. Is it more useful to put zero in the natural numbers or not? That is: if we exclude zero from the natural numbers, do we just force 90% of occurrences of the term to be "non-negative integers" instead?
all models are wrong, but some are really wrong
If all you care about is the length of the terms, i.e. "natural" vs "non-negative integers", then what's wrong with one-letter set names like N, W, Z?
I think the usefulness of including 0 in the set of natural numbers is that it closes holes in various mathematical theories like [1,2]
1. https://en.wikipedia.org/wiki/Peano_axioms
2. https://en.wikipedia.org/wiki/Set-theoretic_definition_of_na...
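To sketch the point about closing holes: the Peano construction starts counting from zero, and without it addition has no identity element. A minimal TypeScript rendering of Peano naturals (illustrative only, not from the thread):

```typescript
// Peano naturals: a Nat is zero, or the successor of a Nat.
type Nat = { tag: "zero" } | { tag: "succ"; pred: Nat };

const zero: Nat = { tag: "zero" };
const succ = (n: Nat): Nat => ({ tag: "succ", pred: n });

// Addition defined by recursion on the first argument, Peano-style:
// 0 + b = b, and succ(a) + b = succ(a + b).
function add(a: Nat, b: Nat): Nat {
  return a.tag === "zero" ? b : succ(add(a.pred, b));
}

// Convert back to an ordinary number for display.
function toNumber(n: Nat): number {
  return n.tag === "zero" ? 0 : 1 + toNumber(n.pred);
}

const two = succ(succ(zero));
// Zero is the additive identity: 0 + n = n.
console.log(toNumber(add(zero, two))); // 2
console.log(toNumber(add(two, two))); // 4
```

Dropping zero from this definition would leave `add` with no base case and the naturals with no identity for addition, which is the kind of hole the comment is pointing at.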
I observe zero.
I don't think zero is an absence of quantity. I don't think zero is the null set.
You can write types in a programming language, but there are other type theory books that do include zero in the natural numbers. And type theory comes from number/set theory. So it's ok if you decide to exclude it, but this is just as arbitrary.
In fact I'd be happy to write `>=0` or `>0` or `=0` any day instead of mangling the idea of zero representing 0 and zero representing something like `None`, `null`, or any other tag of that sort. I don't think the natural world has anything like "nothing"; it just has logical fallacies.
zero is the cardinality of the empty set
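For what it's worth, that identification is directly checkable in code; a trivial sketch:

```typescript
// The cardinality of the empty set is zero.
const empty = new Set<number>();
console.log(empty.size); // 0

// Adding one element bumps the cardinality by one, successor-style.
console.log(new Set([42]).size); // 1
```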
> I observe zero.
It cannot be observed directly at any static point in time, but it can be observed as a dynamic process, when some quantity goes down to empty and back up over time.
> In fact I'd be happy to write `>=0` or `>0` or `=0` any day instead of mangling the idea of zero representing 0 and zero representing something like `None`, `null` or any other tag of that sort. I don't think the natural world has anything like "nothing" it just has logical fallacies.
N, W, R, etc. are just well-known names for sets of numbers; nothing stops us from defining better or additional (self-describing) names for them.
We can discuss the Empty type[1] vs the Unit type[2], but I think that goes off-topic.