https://en.wikipedia.org/wiki/ISO_8601
Handed down by the ISO, The Great Compromise allows YYYY-MM-DD (or YYYYMMDD if you're in a hurry), but I'd find the version with slashes ambiguous and upsetting, especially early in the month.
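The ambiguity is real in code, too. A small sketch, assuming a modern engine such as V8 or SpiderMonkey (parsing of non-ISO strings is implementation-defined): the legacy `Date` parser reads slash dates month-first, US style, so the same string means different days to different readers.

```javascript
// '05/06/2025' is parsed month-first (US convention) by the
// legacy Date parser: May 6th, not June 5th.
const d = new Date('05/06/2025');
console.log(d.getMonth() + 1); // 5 (May); getMonth() is zero-based
console.log(d.getDate());      // 6
```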
The standard is good, and you can get it from `date -I`. Hell mend anyone who messes with the delimiters or writes the year in octal or any other heresy.
(Example: you want to know if a person is old enough to buy cigarettes, and you need to store a birthday that you can compare against the current day to see if they're legally 18. If you store an epoch at UTC, do you store the time of day they were born? That's not the way the law works. Do you store midnight UTC? If they're currently in NY, can they buy cigarettes at 7pm the day before their birthday because they're currently 18 in London?)
Sometimes you need a logical calendar date, not a point in time in the history of the universe.
ISO 8601 allows for 2 or 6 digit years. Truncating to 2 is just incorrect, and 6 digits is absurd. And you can read the RFC without paying ISO - and you can discuss the RFC with people who have also read it instead of relying on people using the Wikipedia page to interpret and explain ISO 8601.
I have a scheduling service at work and I keep getting requests to implement ISO 8601 timestamps, but I ignore them. RFC 3339 is the way forward.
Conversion to UTC is not injective, e.g. when clocks change or politics happen.
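A concrete case, sketched with `Intl.DateTimeFormat` (the zone and dates are just an example): when New York fell back on 2024-11-03, the wall-clock time 1:30 AM occurred twice, so a local timestamp without an offset cannot be mapped back to a unique UTC instant.

```javascript
// Two UTC instants one hour apart that both display as 1:30 AM
// in America/New_York, because clocks fell back at 2:00 AM EDT.
const fmt = new Intl.DateTimeFormat('en-US', {
  timeZone: 'America/New_York',
  hour: 'numeric', minute: '2-digit', hour12: true,
});
console.log(fmt.format(new Date('2024-11-03T05:30:00Z'))); // 1:30 AM (was EDT)
console.log(fmt.format(new Date('2024-11-03T06:30:00Z'))); // 1:30 AM (now EST)
```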
UTC only loses information.
Either use dedicated `from_iso8601` functions, or manually specify the format of the input string (`%Y%m%dT%H%M%SZ`).
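JavaScript has no built-in strptime, so a manual parse of that compact pattern might look like this (function name and error handling are my own):

```javascript
// Parse a compact ISO 8601 UTC timestamp such as "20250528T131500Z"
// (the "%Y%m%dT%H%M%SZ" pattern) without relying on the engine's
// lenient Date string parser.
function fromCompactIso(s) {
  const m = /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/.exec(s);
  if (!m) throw new Error('unrecognized timestamp: ' + s);
  const [, y, mo, d, h, mi, se] = m.map(Number);
  // Date.UTC takes a zero-based month, hence mo - 1.
  return new Date(Date.UTC(y, mo - 1, d, h, mi, se));
}

console.log(fromCompactIso('20250528T131500Z').toISOString());
// 2025-05-28T13:15:00.000Z
```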
> never ever use anything but ISO dates in UTC tz unless you're displaying it for a user in a UI.
Not good for storing future meeting times. DST switchover dates can change, and your tz-normalized date won't change with it.

An engineer in the US reviewing industrial measurements logged in a plant in Asia from a variety of sources is definitely going to encounter lots of events recorded in local time. It would be maddening for that engineer to have to review and resolve events from different time coordinates, especially if they are doing the review months or years later. It's best to accept that reality and adopt local time as the standard. Then you must record the TZ offset relative to UTC in any new system you create.
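To see why recording the offset matters, here is a sketch using `Intl.DateTimeFormat` (the zone names are just examples): one absolute instant lands on different local calendar days in different zones, so a local-time log entry with no offset attached is irrecoverable.

```javascript
// The same absolute instant reads as May 28 in Seoul but still
// May 27 in New York, which is why a local-time log needs its
// UTC offset (or zone) recorded alongside it.
const instant = new Date('2025-05-28T02:00:00Z');
const local = tz => new Intl.DateTimeFormat('en-CA', {
  timeZone: tz, dateStyle: 'short', timeStyle: 'short',
}).format(instant);
console.log(local('Asia/Seoul'));       // morning of 2025-05-28
console.log(local('America/New_York')); // evening of 2025-05-27
```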
I use ISO for everything and your software wrongly assuming I want a deranged lunatic date format based on some locale is not going to cut it.
Locale is OK as a first guess, but maybe allow users to make that choice?
Special shoutout to the author of node-postgres for saying that PG's `date` type is best avoided for dates in this case.[1] I love programming.
[1] https://node-postgres.com/features/types#date--timestamp--ti...
Is it sane? Is midnight at the start of a day, or the end of it? I'd think noon would be less ambiguous, and significantly less prone to these timezone issues (although this may not be a benefit).
Midnight at the start of the day: 00:00:00
Midnight at the end of the day: 24:00:00
Because the other languages with bigger runtimes and more comprehensive standard libraries (Java applets, Microsoft Silverlight, Macromedia Flash) that were promoted for browsers to create "rich fat clients" were ultimately rejected for various reasons: the plugins had performance problems, security problems, browser crashes, etc.
Java applets were positioned by Sun & Netscape as the "serious professional" language. Javascript was intended to be the "toy" language.
In 1999, Microsoft added XMLHttpRequest() to IE's Javascript engine to enable Outlook-for-web email that acted dynamically like Outlook-on-desktop without page refreshes. Other browsers copied that. (We described the early "web apps" with jargon such as "DHTML" DynamicHTML and "AJAX".) In 2004, Google further proved out Javascript capabilities for "rich interactive clients" with Gmail and Google Maps: smoothly dragging map tiles around and zooming in and out without Macromedia Flash. Even without any deliberate coordinated agenda, the industry collectively began to turn Javascript from a toy language into the dominant language for all serious web apps. Javascript now had huge momentum. A language runtime built into the browser without a standard library, like Javascript, was prioritized by the industry over options like plugins that had a bigger "batteries included" library. This overwhelming industry preference for Javascript happened before Node.js for server-side apps in 2009 and before Steve Jobs supposedly killed Flash in 2010.
The situation today of Node.js devs using npm to download "leftpad()" and a hundred other dependencies to "fill in the gaps" of basic functionality comes from the history of Javascript's adoption.
Java's Date standard lib was awful for 2 decades, so there's no guarantee that a big standard library is a good standard library.
JS was a compromise. It had to be sent out the door quick, it needed to look sufficiently like Java to not upset Sun who were trying to establish Java as the universal platform at the time while not being feature complete enough to be perceived as a competitor rather than a supplement. And it had to be standardized ASAP to pre-empt Microsoft's Embrace Extend Extinguish strategy (which was well on its way with JScript). That's also why it's an ECMA standard rather than ISO despite Netscape not having been based in Switzerland - ECMA simply offered the shortest timeline to publishing a standard.
I think what's more amazing isn't just how we managed to build the bulk of user interfaces in JavaScript, but how Node.js managed to succeed with ECMAScript 3. Node.js was born into a world without strict mode and without even built-in support for JSON: https://codelucky.com/javascript-es5/ - and yeah, ECMAScript 3 was succeeded by ECMAScript 5, not 4, because it took vendors 10 years to agree on how the language should evolve in the 21st century. Not only did we build the modern web on JavaScript, we built a lot of the modern web on the version of JavaScript as it was in 1999! Even AJAX wasn't standardized until 2006, when Web 2.0 was already in full swing.
That is not a better world.
Also, this could've been handled easily by a committee (ugh), or by a third-party open source organization akin to the Linux Foundation that maintains a JS standard library all browser vendors use. Or by just writing a specification for how JS handles these things.
You know, like a lot of other languages are handled, including their standard libraries.
I mean, even Microsoft gave up and just went with Chromium, and they have pretty much the definition of almost infinite resources at their disposal.
Effectively if your website doesn't run in Chrome and Safari, it won't be seen by 99% of the market.
What I'm looking for is "there has to be a library function for that; I would look it up".
JavaScript, even with all its thorns, is a very consistent ecosystem nowadays, across browsers and architectures. Being forever backward-compatible is a good thing: most code that ran during the 2000s can still run with minimal changes.
By that, I don't mean to dismiss the importance of backward compatibility, but this case is particularly funny because:
1. It had already been changed multiple times, each a breaking change, so it’s not like this form of compatibility was ever seriously respected;
2. Having it behave differently from other "legacy forms," like the slash-separated version, is itself arguably a break in backward compatibility;
3. As noted in the article, it never worked the same between Chrome and Firefox (at this point) anyway, so it’s doubtful how impactful this "breaking change" really was, considering you already had to write shim code either way.
console.log(new Date('2025/05/28').toDateString());
console.log(new Date('2025-05-28').toDateString());
console.log(new Date('2025-5-28').toDateString());
Output:

Wed May 28 2025
Wed May 28 2025
Wed May 28 2025
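The interesting part hides behind that identical-looking output: per the ECMAScript spec, a date-only ISO string is parsed as midnight UTC, while the slash form goes through the legacy parser as local midnight, so in any zone behind UTC the two land on different calendar days. A small demo (the New York example is hypothetical, depending on where you run it):

```javascript
const iso = new Date('2025-05-28');   // parsed as midnight UTC
const slash = new Date('2025/05/28'); // parsed as local midnight
console.log(iso.getUTCHours());   // 0
console.log(slash.getHours());    // 0
// In, say, New York (UTC-4 in May), iso.toDateString() would read
// "Tue May 27 2025" while slash.toDateString() reads "Wed May 28 2025".
```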
We Swedes use standardized ISO 8601 dates such as YYYY-MM-DD, as dictated by our excellent government, and you find them in use in our social security numbers, government correspondence, and mostly everywhere.
Same here in Germany! ...which is the reason why everyone ignores it in favour of the traditional format.
I love democracy, and also mountain-shaped temporal unit ordering: it's 28.05.2025 13:15.
Text-ordering by date is a nightmare because everything is first grouped by day-of-month, then month, then year! :)
Falsehoods programmers believe about time: https://gist.github.com/timvisee/fcda9bbdff88d45cc9061606b4b923ca
- Understand the semantic difference between a timestamp (absolute time) and clock/calendar time (relative time). Understand which one your use case uses. Don't use one to store the other.
- If the use case calls for a relative time, do not manually construct or edit the date. Use your platforms date-creation/modification APIs, no matter how unnecessary they seem.
- Understand what is inside your platform's date types at rest. Understand which of your platform's date APIs pull in environmental information (time/tz/locale), as opposed to only using the arguments you pass it. Understand that your platform's 'print/stringify' function may be one of those aforementioned functions. Misunderstanding this often leads people to say inaccurate things. E.g. say your platform has a Date object that stores an epoch-based timestamp. People may say "the Date object is always in UTC", when really the Date object has no time offset, which is not the same thing.
- Understand that if you pass a date around platforms, it might accidentally be reserialized into the same absolute time, but a different relative time.
- Understand that there is a hierarchy of use cases, where each one has more complex requirements:
1. "Create/modify" timestamps; egg timers. (absolute time)
2. Alarm clocks (same clock time always).
3. One-time calendar events (has an explicit, static tz; same clock time if the user changes its day or time zone; different clock time if the user's time offset changes)
4. Recurring calendar events (same as above, except don't change the clock time if the user's time offset changed due to DST, as opposed to a geographic change)
5. Recurring calendar event with multiple participants (same as above, just remember that the attached tz is based on the creator, so the clock time will shift during DST for participants in a place without matching DST rules).
Note that a lot of platforms nowadays have built-in or 3rd party packages that automatically handle a lot of the rules in the above use cases.
Finally, understand that all those little weird things about dates (weird time zones, weird formatting conventions, legislative time zone changes, retroactive legislative time zone changes, leap days, leap seconds, times that don't exist), are good to know, but they will mostly be accounted for by the above understandings. You can get into them when you want to handle the real edge cases.
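The point above about a `Date` object having no time offset can be sketched directly (assuming a JS platform, where `Date` is epoch-based):

```javascript
// A JS Date at rest is just an epoch millisecond count; any
// "UTC" or local flavour appears only when you stringify it.
const d = new Date(0);
console.log(d.getTime());      // 0, the only value actually stored
console.log(d.toISOString());  // 1970-01-01T00:00:00.000Z (a UTC rendering)
console.log(String(d));        // a local-time rendering of the same instant
```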
Being available everywhere (as far as browsers are concerned) trumps almost all other factors.
The real "bug" in the example is 2025/05/28 being May 28th because the implementation ignores timezones for that format.
The issue with `Date` is that it is based on the original `java.util.Date` class and inherits all of its problems: https://docs.oracle.com/en/java/javase/21/docs/api/java.base... - this is also where the wonky zero-indexed month value comes from. Note that Java deprecated all versions of the constructor other than the one taking a millisecond value or nothing, which JS can't do for backwards compatibility reasons.
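That inherited quirk is easy to demonstrate:

```javascript
// Months are zero-indexed in the multi-argument constructor and
// in getMonth(), a behavior carried over from java.util.Date.
const jan = new Date(2025, 0, 15); // January 15, 2025 (local time)
console.log(jan.getMonth());       // 0
console.log(jan.getFullYear());    // 2025
```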
Hopefully `Temporal` will solve these problems but how long that spec has been in the works should tell you how difficult this is to get right when anything you put on the web is forever.
Unless a timezone/offset is given, it's considered a plain date/time, not an epoch.