Using epoch for dates makes simple math & before/after comparison easier but requires explicit conversion during serialization.
Unfortunately, from what I can tell, I-JSON doesn't appear to solve this problem (or does it??) One nice thing about BSON is that it makes Date types first-class citizens of the format.
It's not within the scope of what I-JSON is intended to address. This is a more formally specified and slightly more constrained variation of the existing JSON spec. Adding new datatypes would mean that it would no longer be JSON.
Maybe something like "String values that match (some computationally inexpensive ISO-8601-matching regex) shall be converted to Date instances by the JSON parser" could be possible without huge compatibility issues.
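A sketch of what that could look like, using JSON.parse's reviver hook. The regex here is my own cheap approximation of an ISO 8601 timestamp, not a full RFC 3339 validator, and the function name is made up for illustration:

```javascript
// Deliberately inexpensive ISO-8601-ish matcher (an assumption, not a
// complete RFC 3339 grammar): date, "T", time, then "Z" or an offset.
const ISO_8601 = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(?:\.\d+)?(?:Z|[+-]\d{2}:\d{2})$/;

// Hypothetical wrapper: parse JSON, upgrading matching strings to Dates.
function parseWithDates(text) {
  return JSON.parse(text, (key, value) =>
    typeof value === "string" && ISO_8601.test(value) ? new Date(value) : value
  );
}

const doc = parseWithDates('{"created":"2014-03-05T12:30:00Z","name":"x"}');
// doc.created is a Date instance; doc.name stays an ordinary string
```

The obvious compatibility hazard is false positives: any string that merely looks like a timestamp gets converted, which is presumably why no JSON spec mandates this.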
I guess I was mostly using this topic to voice what I'd imagine is a common point of frustration in an otherwise great data interchange format.
Faster, smaller, pretty sure it will parse on the other end.
4.3. Time and Date Handling
Protocols often contain data items that are designed to contain
timestamps or time durations. It is RECOMMENDED that all such data
items be expressed as string values in ISO 8601 format, as specified
in [RFC3339], with the additional restrictions that uppercase rather
than lowercase letters be used, that the timezone be included not
defaulted, and that optional trailing seconds be included even when
their value is "00". It is also RECOMMENDED that all data items
containing time durations conform to the "duration" production in
Appendix A of RFC 3339, with the same additional restrictions.
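For what it's worth, JavaScript's built-in Date.prototype.toISOString already satisfies the restrictions quoted above: uppercase letters, an explicit timezone (always UTC, as "Z"), and the seconds field always present:

```javascript
// toISOString emits uppercase "T"/"Z", an explicit UTC designator, and
// never drops the seconds, matching the I-JSON recommendations above.
const ts = new Date(Date.UTC(2015, 0, 2, 3, 4, 0)).toISOString();
// → "2015-01-02T03:04:00.000Z"
```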
The flip side of this is that if you're not using ISO 8601 in your JSON, you're doing worse than Salesforce. That's about as good an incentive as I can give this community to try to standardize their organizations around it!
but the plaintext version does:
http://www.rfc-editor.org/rfc/rfc7493.txt
Maybe the document is still in flux and the text version represents a later addition?
EDIT: Oops, my first link is to the original JSON spec. Sorry for the confusion!
An I-JSON implementation supporting a "Must-Ignore" policy SHOULD pass any such new protocol elements on, untouched, to any downstream consumers of the message, because those downstream consumers may understand the new elements.
It is another one of those things that are obvious to many people, but I could imagine somebody reading the last sentence of the section, "members whose names are unrecognized MUST be ignored", and thinking that they should omit the unrecognized elements before passing the message on.

For instance in my native Sweden, it's very common when e.g. scheduling things at work to talk about which week something will happen.
Sites like http://vecka.nu make it easy to check the current week number (the domain name translates as "week.now") whenever you have internet access, paper calendars include it, and so on. It's really very common.
[edit] Actually, it looks like the guts of this tool live here: https://github.com/zaach/jsonlint
Also, if you're going to stash binary data in to quoted JSON strings then why base64url encoding and not Z85[0]? It's more efficient and easier to decode.
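For reference, base64url (the encoding I-JSON recommends for binary-in-strings) is trivial in Node, which may be part of why it won out. A small sketch, assuming Node 15.7+ where Buffer supports the "base64url" encoding name:

```javascript
// base64url (RFC 4648 §5): URL-safe alphabet ("-" and "_" instead of
// "+" and "/"), no "=" padding. Supported natively by Node's Buffer.
const bytes = Buffer.from([0xde, 0xad, 0xbe, 0xef]);
const encoded = bytes.toString("base64url");
// → "3q2-7w"
const decoded = Buffer.from(encoded, "base64url");
```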
Using schema-less formats generally sucks.
In JavaScript Object Notation, everything's a float. There are no integers, just floats that happen to not have any fractional elements.
Numbers are the weakest part of the JSON spec in general because JavaScript has very weak numbers. A spec merely intending to tidy up certain questionable corners hasn't really got license to "fix" the numbers problem.
(And just to make it clear one more time, I fully agree that the numbers are really problematic in JSON... I just don't think that problem can be fixed here. It would take a JSON 2.)
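To make the problem concrete: in JavaScript, JSON numbers round-trip through IEEE 754 doubles, so integers above 2^53 silently lose precision on parse. I-JSON's advice, roughly, is to stay within double-precision range and carry bigger integers as strings:

```javascript
// 2^53 + 1 cannot be represented as a double, so it rounds on parse.
const parsed = JSON.parse('{"id": 9007199254740993}');
console.log(parsed.id);               // → 9007199254740992
console.log(Number.MAX_SAFE_INTEGER); // → 9007199254740991

// The interoperable workaround: encode large integers as strings.
const safe = JSON.parse('{"id": "9007199254740993"}');
// safe.id is the exact string "9007199254740993"
```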
It has been said that the ratio of (creating, grokking, and interacting with the hundreds to thousands of XML schemas/documents that invariably creep in in the average large-scale-XML-using enterprise environment) versus (doing actual work) increases without bound as t goes to infinity.
ECMA-262 even explicitly uses 7159's predecessor (4627) with two exceptions[1], one of which is the top-level compatibility headache 7159 fixed, and the other just requires the API to disregard the "MAY" in section 4.
[0] https://www.tbray.org/ongoing/When/201x/2014/03/05/RFC7159-J...
[1] http://www.ecma-international.org/ecma-262/5.1/#sec-15.12
object --> array --> object --> array
I must have spent a few hours trying to figure out why Angular wasn't iterating through the first array, only to discover that it was being parsed like so:
object --> object --> object --> object
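A minimal reproduction of that kind of mismatch (hypothetical payloads; the point is that the two shapes look nearly identical in a debugger but iterate completely differently):

```javascript
// The shape the template expected: a real array under "items".
const expected = JSON.parse('{"items": [{"id": 1}, {"id": 2}]}');
// The shape actually received: an object with numeric string keys.
const actual   = JSON.parse('{"items": {"0": {"id": 1}, "1": {"id": 2}}}');

console.log(Array.isArray(expected.items)); // → true
console.log(Array.isArray(actual.items));   // → false
```

An Array.isArray check at the API boundary would have surfaced this in seconds instead of hours.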