I want to write about timestamps. Not because it’s a subject I like, but on the contrary, because it’s a subject I somewhat hate. What is a simple day-to-day thing for humans is a real source of headaches for programmers.
The list of reasons is about as long as the number of permutations programming languages allow when parsing dates with a format string. Just to name a few: symbols can change (“-” vs “/”), order can change, years can be either 4 or 2 digits (those born after the year 2000 probably aren’t aware we had a major bug back then), 24-hour vs AM/PM, and even locales (English “February” vs French “Février”).
You don’t think it’s a problem? To this date, 18,596 questions bear the tag “datetime” on stackoverflow.com. That’s huge! Still to this date, there are 294,443 questions tagged “c++”. So a full language, namely c++, has only about 15.8 times as many questions as what should be a trivial subject: datetime.
Sometimes, you won’t have a choice but to write your custom parsing string made of %Y, %m, %d and so on, mostly when you read/write from/to a human. But for IPC (inter-process communication), especially if those IPC are yours, there is no reason to do that manual crap. Thinking about defining your own standard? Please, please don’t. Standards already exist. The one I prefer is ISO-8601. Simply stated, it goes from the largest unit on the left to the smallest on the right: year, month, day, hours, minutes and seconds. The date elements are separated by “-”, time elements by “:”, and a “T” separates the date from the time.
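As a minimal sketch in Python (using only the standard datetime module; the specific timestamp is arbitrary), no hand-rolled %Y-%m-%d string is needed to emit or read ISO-8601:

```python
from datetime import datetime, timezone

# An arbitrary instant, stamped explicitly as UTC
dt = datetime(2023, 5, 17, 9, 30, 0, tzinfo=timezone.utc)

# isoformat() emits ISO-8601: largest unit on the left,
# "-" between date parts, ":" between time parts, "T" in between
text = dt.isoformat()
print(text)  # 2023-05-17T09:30:00+00:00

# fromisoformat() reads it back, round-tripping the exact instant
assert datetime.fromisoformat(text) == dt
```

The round-trip property is the whole point: no ambiguity about symbol, order or digit count survives the trip.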
But, what’s that at the end? Either a “Z” or a “±0:00”. That’s the time zone. You don’t like time zones, or think you’ll never need them? Are you sure? Think about it this way. At 100°, is water boiling? Well, it depends. If you are in Celsius, yes. If you are in kelvins, not at all. Time zones compare to Celsius and kelvins: they work on the same scale but differ by an offset. In an era where everything connects through the internet, an application should never assume that “9h00” is local. Computer science is about being exact, not about assuming. Otherwise, bugs happen. Here is a simple example: StarCluster wrongly assumed that the time provided by an application, namely OGS, was UTC, which led to computation errors. Had OGS provided time zone info, that issue wouldn’t have occurred and we wouldn’t have wasted time over it.
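The ambiguity is easy to demonstrate in Python (a sketch with arbitrary timestamps, not the actual StarCluster/OGS code): a timestamp without an offset is what Python calls “naive”, and the language itself refuses to guess what it means:

```python
from datetime import datetime

# "9:00" with no zone info: a "naive" datetime
naive = datetime.fromisoformat("2023-05-17T09:00:00")
print(naive.tzinfo)  # None -> local? UTC? nobody knows

# The same wall-clock time stamped as UTC and as UTC+2
# are two different instants, two hours apart
as_utc = datetime.fromisoformat("2023-05-17T09:00:00+00:00")
as_paris = datetime.fromisoformat("2023-05-17T09:00:00+02:00")
print(as_utc - as_paris)  # 2:00:00

# Mixing naive and aware datetimes raises TypeError:
# Python won't invent the missing offset for you
try:
    naive < as_utc
except TypeError as e:
    print("refused:", e)
```

This is exactly the StarCluster/OGS failure mode: once the offset is dropped, the “correct” interpretation is whatever the consumer happens to assume.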
I doubt they will ever teach ISO-8601 at school to first graders in order to allow us, programmers, to drop those time format strings. They are very likely to always be required for human interaction. Still, sticking to ISO-8601 for IPC should reduce the amount of errors related to datetime usage and ease the pain.
Do you know why HTTP headers use dates in such a verbose format? E.g.:
Last-Modified: Wed, 15 Nov 1995 04:58:08 GMT
Aren’t they meant to be consumed by computers? Why are they so verbose? (“Wed”, “Nov”) Why aren’t they ISO-8601? Why would a computer *ever* need to know the day of the week? At least they got the timezone!
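For what it’s worth, the verbose format at least parses cleanly with the standard library. A sketch in Python, reusing the Last-Modified value above (the email.utils functions are stdlib; they handle this RFC 1123 shape because email headers share it):

```python
from email.utils import parsedate_to_datetime

# The Last-Modified value from the header above
http_date = "Wed, 15 Nov 1995 04:58:08 GMT"
dt = parsedate_to_datetime(http_date)

# "GMT" gave us a proper offset, so the result is unambiguous
print(dt.isoformat())  # 1995-11-15T04:58:08+00:00

# "Wed" carries no information: the calendar already fixes the weekday
print(dt.strftime("%a"))
```

Note that the parser simply ignores the weekday for computing the instant, which underlines how redundant it is.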
I don’t have the complete answer, but looking at RFC-2616 it seems to date back to at least 1999. Section 3.3.1 states:
HTTP applications have historically allowed three different formats for the representation of date/time stamps:
Sun, 06 Nov 1994 08:49:37 GMT ; RFC 822, updated by RFC 1123
Sunday, 06-Nov-94 08:49:37 GMT ; RFC 850, obsoleted by RFC 1036
Sun Nov 6 08:49:37 1994 ; ANSI C’s asctime() format
The first format is preferred as an Internet standard and represents a fixed-length subset of that defined by RFC 1123 (an update to RFC 822). The second format is in common use, but is based on the obsolete RFC 850 date format and lacks a four-digit year. HTTP/1.1 clients and servers that parse the date value MUST accept all three formats (for compatibility with HTTP/1.0), though they MUST only generate the RFC 1123 format for representing HTTP-date values in header fields.
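Going the other way, generating the mandated RFC 1123 form is also a one-liner in Python (a sketch; format_datetime is stdlib, the timestamp is the one from the example header):

```python
from datetime import datetime, timezone
from email.utils import format_datetime

dt = datetime(1995, 11, 15, 4, 58, 8, tzinfo=timezone.utc)

# usegmt=True writes "GMT" instead of "+0000", as HTTP headers require
print(format_datetime(dt, usegmt=True))  # Wed, 15 Nov 1995 04:58:08 GMT
```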
Looking at RFC 822, it dates back to August 13, 1982. The first edition of the ISO 8601 standard was published in 1988, about 6 years later. I guess one doesn’t change a standard so easily.