mail-ng

Re: Dates: the can of worms

2004-05-01 07:47:40

On Sat, 1 May 2004 22:47, Iljitsch van Beijnum wrote:
> That remains to be seen... I agree _today_ it doesn't matter much, but
> we're moving into a direction of more and more accurate time keeping. I
> can imagine a situation where the emission of packets to the network is
> done at very exact times to allow for better quality of service. A 0.9
> second time difference in various places would be deadly here.

This is why you would use MJD timestamps for relatively low-resolution tasks
(where an error margin of a minute or so doesn't matter). The "Date" header in
an email is a good example: its primary purpose is to indicate the day and
time of day at which a message was created. Any nitpicking at the second level
really misses the point.
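
For illustration, here is a rough Python sketch of producing such a Date value
from a Unix clock reading. The "MJD+" spelling follows the example quoted
further down, the four decimal places (roughly nine-second resolution) are an
arbitrary choice, and 40587 is simply the MJD of the Unix epoch (1970-01-01):

    import time

    UNIX_EPOCH_MJD = 40587          # 1970-01-01 00:00:00 UTC is MJD 40587.0

    def mjd_date_value(unix_seconds):
        """Format a Unix timestamp as an MJD-based Date value."""
        mjd = unix_seconds / 86400.0 + UNIX_EPOCH_MJD
        return "MJD+%.4f" % mjd

    print(mjd_date_value(time.time()))  # e.g. "MJD+53126.3290"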

For network synchronisation, the timestamp doesn't need to be converted into a 
calendar-based format at all, and a raw number of atomic seconds can be used. 
Following the format of the MJD proposal, you could have a TAI epoch and 
units of atomic seconds, like "TAI+123456.789". Dan Bernstein has already 
proposed a conceptually similar thing called "taiclock" (which is primarily 
targeted as a replacement for NTP).
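
Only as a sketch, since no epoch is pinned down for the atomic-second count:
the snippet below arbitrarily counts from the Unix epoch and fakes TAI with
the fixed TAI-UTC offset of 32 seconds that is in force in 2004. A real
implementation would need a leap-second table, or a clock genuinely
disciplined to TAI (which is the sort of thing taiclock is for):

    import time

    TAI_MINUS_UTC = 32              # seconds; the offset in force during 2004

    def tai_stamp(unix_seconds):
        """Seconds since the Unix epoch, counted on the TAI scale."""
        return "TAI+%.3f" % (unix_seconds + TAI_MINUS_UTC)

    print(tai_stamp(time.time()))   # e.g. "TAI+1083397660.123"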

These things can be quite independent of each other, and used for distinct
purposes. If desirable, you can have a function which converts (or at least
approximates) one representation to the other.
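
For instance, a conversion in that spirit, reusing the illustrative constants
from the sketches above (and approximate for exactly the reasons given there:
it ignores leap seconds and the UT1/UTC difference):

    UNIX_EPOCH_MJD = 40587          # MJD of the Unix epoch
    TAI_MINUS_UTC = 32              # illustrative 2004 value

    def mjd_to_tai(mjd_value):
        """Restate a day-resolution MJD stamp as a count of atomic seconds."""
        unix_seconds = (mjd_value - UNIX_EPOCH_MJD) * 86400.0
        return "TAI+%.3f" % (unix_seconds + TAI_MINUS_UTC)

    print(mjd_to_tai(53125.1666))   # "TAI+1083297626.240"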

The general epoch-offset format makes timestamps trivial to generate, trivial
to parse, trivial to compare (and therefore sort), and trivial to use in
time-interval computations (with results in the same unit as the format).
Using units of solar days (as in my MJD proposal) makes dates immune to
calendar changes, up to and including the introduction of Metric Time and the
Ten-Day Week. Using units of atomic seconds (as in the TAI suggestion)
optimises the format for real-time sub-second accuracy rather than for
conversion to civil dates.
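
To make the "trivial" claims concrete, a small sketch using the MJD spelling:
parsing is a string split, sorting is plain numeric comparison, and
subtraction gives an interval directly in days:

    def parse_mjd(stamp):
        """Turn an "MJD+<number>" stamp back into a plain number of days."""
        prefix, _, value = stamp.partition("+")
        if prefix != "MJD":
            raise ValueError("not an MJD stamp: %r" % stamp)
        return float(value)

    stamps = ["MJD+53125.1666", "MJD+53124.9000", "MJD+53125.0000"]
    print(sorted(stamps, key=parse_mjd))                # chronological order
    print(parse_mjd(stamps[0]) - parse_mjd(stamps[1]))  # interval: ~0.2666 days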

> This sounds like the way we simple humans like to use time.
> Unfortunately, I never see "UT1" anywhere, just "UTC" and:

Yeah, but every wall clock and wrist watch you ever see works on the basis of 
86400 seconds per day. That's not UTC. Given two extremely accurate clocks 
running UT1 and UTC respectively, you'd note that there was less than a 
second's difference between them, and you'd only be able to figure out which 
one was the UTC clock if you saw a leap second like 23:59:60.

Dates are a can of worms. You can go nuts arguing about them. But think of how
often people hard-code an assumption like "there are exactly 60 seconds in a
minute". Note that this assumption doesn't hold in UTC, where a minute
containing a leap second has 61 seconds.

>> A date can be expressed in the form "MJD+53125.1666", and this would have
>> meaning to historians who already use MJD.

> How many bits does it use?

As a string, the example I gave is 14 bytes, but the length is variable,
depending on how precise you want to be. If you parse it into a standard
double-precision floating point number, it fits in 8 bytes of memory (64 bits)
and keeps at least one-second precision for dates within roughly two hundred
million years either side of the epoch.
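
A quick sanity check on that figure, assuming IEEE 754 doubles (the 52
fraction bits set the spacing between adjacent representable values):

    import math

    one_second = 1 / 86400.0                 # one second, expressed in days

    def spacing(x):
        """Gap between adjacent doubles near x: 2**(n-52) for x in [2**n, 2**(n+1))."""
        return 2.0 ** (math.floor(math.log2(abs(x))) - 52)

    print(spacing(53125.1666) * 86400)       # ~6e-7 s near the example's date

    # Largest power-of-two range whose spacing stays at or below one second:
    n = 52 + math.floor(math.log2(one_second))       # n = 35
    print(2.0 ** (n + 1), 2.0 ** (n + 1) / 365.25)   # ~6.9e10 days, ~1.9e8 years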

