No. It's a language to scare - and scar - young programmers for life.
true == 1 → true
true == "1" → true
false == 0 → true
false == "0" → true
false == undefined → false
false == null → false
null == undefined → true
"\t\r\n" == 0 → true
"\t\r\n 16 \t\r\n" == 16 → true
"\t\r\n 16 \t\r\n" == "16" → false
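All of those results are real; here's a runnable sketch of the same coercions, plus how strict equality (`===`) sidesteps the whole mess by never coercing:

```javascript
// Loose equality (==) coerces operands before comparing.
console.log(true == 1);                    // true:  true → 1
console.log(true == "1");                  // true:  "1" → 1, true → 1
console.log(false == "0");                 // true:  "0" → 0, false → 0
console.log(false == undefined);           // false: undefined only == null
console.log(null == undefined);            // true:  special-cased in the spec
console.log("\t\r\n" == 0);                // true:  whitespace string → 0
console.log("\t\r\n 16 \t\r\n" == 16);     // true:  trimmed, then → 16
console.log("\t\r\n 16 \t\r\n" == "16");   // false: string vs string, no coercion

// Strict equality (===) compares type first, then value. No surprises.
console.log(true === 1);                   // false
console.log("\t\r\n 16 \t\r\n" === 16);    // false
```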
var a = "foobar"
var b = typeof a → 'string'
var c = a instanceof String → false
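The `instanceof` result above comes from JavaScript distinguishing primitive strings from boxed `String` objects; a short sketch of both sides:

```javascript
// A string literal is a primitive, not an object...
const a = "foobar";
console.log(typeof a);                 // 'string'
console.log(a instanceof String);      // false: primitives aren't instances of anything

// ...but a String constructed with `new` is an object.
const boxed = new String("foobar");
console.log(typeof boxed);             // 'object'
console.log(boxed instanceof String);  // true
```

Method calls like `a.length` still work on the primitive because the engine temporarily boxes it, which is why the distinction is so easy to miss.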
I think this is something you need to blame UNIX for. Or, who knows, maybe something even earlier.
That said, there is a logic behind it: days we usually represent with numbers, while months we often represent with names, so the month is returned as an index into a sequence of names. (Not saying I like this logic, but it sort of makes sense.)
Carrying bad designs forward doesn't absolve you of them, though. Plenty of languages avoid this: look at how .NET handles dates and times; it bears no real relation to the underlying OS on either Windows or *nix.
myArray.forEach( (item) => {
// process item
});
doesn't seem ugly to me. It's flexible and concise.

for (const key in myObject) {
  // process key
}
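A runnable sketch of both loops (`myArray` and `myObject` are placeholder names from the comment), plus `for...of`, which iterates values directly rather than keys:

```javascript
const myArray = ["a", "b", "c"];
const myObject = { x: 1, y: 2 };

// forEach: callback per element.
myArray.forEach((item) => {
  console.log(item);             // a, b, c
});

// for...in: iterates an object's enumerable property names.
for (const key in myObject) {
  console.log(key, myObject[key]); // x 1, y 2
}

// for...of: iterates the values of any iterable (arrays, strings, Maps...).
for (const item of myArray) {
  console.log(item);             // a, b, c
}
```

Note that `for...in` on an array gives you index strings, not elements, which is why `for...of` is usually the better fit for arrays.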