Decimal Time

As an American, I have always been a bit ambivalent when it comes to units of measurement. I learned units like inches, pints, and pounds first, but all through elementary and secondary school, the metric system (or S.I., Système International) was taught, along with dire warnings that we’d better get used to the new measurements because the United States was going to be giving up Imperial units Real Soon Now. That would have been fine with me, because I’m fluent in meters, liters, and grams too, and they all make more sense to me than their Imperial counterparts. (Temperature, strangely, is the exception: I can’t seem to switch my brain out of Fahrenheit.) The entire world—excluding us wacky Americans—has come to the sane conclusion that units of measurement based on outdated and arbitrary standards should be abandoned, and that everything should be based on easy-to-calculate units of ten.

Everything, that is, except time, the measurement of which requires dealing in inconvenient quantities such as 60, 12, 7, 365, 31, 30, 28, and every so often, 29 and 366. Why shouldn’t time be measured in units of 10, 100, and 1000? Seconds, hours, weeks, and months, after all, are simply arbitrary divisions of days, seasons, and years. Why not divide them up in a decimal-friendly way? And why not choose a system that is inherently free from stupid computer glitches on the one hand, and from religious biases on the other? It turns out that there have been numerous proposals to do exactly that.

Days (and Years) of Our Lives

Let’s back up a bit and consider a few basics. Everyone agrees that time measurements should be based on regular, observable phenomena such as the dependable fact that the sun rises and sets every day, and that the Earth’s position relative to the sun follows predictable, year-long cycles. One could argue that the notion of a “day” having a fixed duration is a bit of a fiction, since the hours of sunlight vary according to season and latitude, but I think most people are content taking an average (i.e., a mean solar day) as the rule. And of course there’s the whole leap year problem, but that need not hold up an entire timekeeping revolution. Though the ideas of a “day” and a “year” are here to stay, all the other units—seconds, minutes, hours, weeks, and months (and even seasons, depending on where you live)—are arbitrary divisions that are ripe for revision.

The first serious attempt to slice up the clock and calendar decimally happened in France as a consequence of the French Revolution. The new government instituted a republican calendar that consisted of 12 months of 30 days each, with months bearing names suggestive of the season in which they fell (but only, of course, in France). An extra five days of festivities were added at the end of each year (not part of any month) to make the solar cycle work out. Each month consisted of three “dekades,” or 10-day weeks. New clocks had to be designed and built, too. A day now had 10 hours; hours had 100 minutes, and minutes had 100 seconds. Because the months were not that much different from existing months (breaking the strict unit-of-10 rule), they were relatively easy to get used to. But having a “minute” that was almost a minute and a half long, and an “hour” that lasted almost two and a half hours, was too much. The republican government fought a losing battle to institute the new timekeeping system from 1793 until 1805, when it was finally abandoned.
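For the curious, here is a minimal sketch (in Python; the function name and example times are my own, not part of any historical standard) of how a conventional clock reading maps onto the republican scheme of 10 hours, 100 minutes, and 100 seconds per day.

```python
from datetime import time

def to_french_decimal(t: time) -> str:
    """Convert a standard clock time to French Revolutionary decimal time
    (10 hours per day, 100 minutes per hour, 100 seconds per minute)."""
    seconds_since_midnight = t.hour * 3600 + t.minute * 60 + t.second
    # Fraction of the mean solar day that has elapsed so far
    fraction = seconds_since_midnight / 86_400
    # 10 * 100 * 100 = 100,000 decimal seconds in a day
    total_decimal_seconds = round(fraction * 100_000)
    h, rem = divmod(total_decimal_seconds, 10_000)
    m, s = divmod(rem, 100)
    return f"{h}:{m:02d}:{s:02d}"

print(to_french_decimal(time(12, 0, 0)))   # conventional noon -> 5:00:00
print(to_french_decimal(time(18, 30, 0)))  # 6:30 p.m. -> 7:70:83 (roughly)
```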

Over the years, numerous other proposals have been advanced for dividing time into units of 10, with the common thread being that there’s always a basic unit of time (whether or not it’s called an “hour”) that lasts 1/10 of a day. To deal with the problem of that being a rather unwieldy period of time, smaller units have been proposed, such as the “centiday” or “decihour,” which would be 1/100 of a day, or about 14.4 minutes according to current measures. Multiples of 2 and 4 centidays are close enough to current half-hours and hours to give a reasonable means of making mental conversions.
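The arithmetic behind those approximations is easy to check; a few lines (plain Python, with constants named only for clarity) make the correspondence explicit.

```python
DAY_SECONDS = 86_400                  # seconds in a mean solar day
CENTIDAY = DAY_SECONDS / 100          # 864 seconds, i.e., 14.4 standard minutes

for n in (1, 2, 4):
    print(f"{n} centiday(s) = {n * CENTIDAY / 60:.1f} standard minutes")
# 1 centiday(s) = 14.4 standard minutes
# 2 centiday(s) = 28.8 standard minutes  (roughly a half hour)
# 4 centiday(s) = 57.6 standard minutes  (roughly an hour)
```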

.Beat It

One exception to the “centiday” solution was Internet Time, a standard briefly promoted by Swiss watchmaker Swatch. In Swatch’s system, the day was divided evenly into 1000 units called “.beats”; each .beat lasted 1 minute, 26.4 seconds. Internet Time was designed to be universal, rather than local—so if you said an event was going to occur at @435 .beats (which is how Internet Time was notated), that represented a fixed time that worked anywhere in the world. Beat 0 was defined as midnight in Biel, Switzerland, where the Swatch headquarters is located. The downside to the lack of time zones, of course, was that Internet Time had no consistent relationship to the cycle of the sun; you simply had to memorize what .beat range constituted periods such as “morning,” “afternoon,” and “evening” in your local area—and then recalculate when you traveled. A bunch of Swatch watches, and a few other devices, displayed Internet Time back in the late 1990s and early 2000s, but to say the least, it didn’t catch on.
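Computing the current .beat is straightforward, since a .beat is just 1/1000 of a day (86.4 seconds) measured from midnight at Biel’s fixed offset of UTC+1, with no daylight saving. A rough sketch, with the function name my own invention:

```python
from datetime import datetime, timedelta, timezone

# "Biel Mean Time": a fixed UTC+1 offset, never adjusted for daylight saving
BMT = timezone(timedelta(hours=1))

def swatch_beats(dt: datetime) -> int:
    """Return the Swatch .beat (0-999) for a timezone-aware datetime."""
    local = dt.astimezone(BMT)
    seconds = local.hour * 3600 + local.minute * 60 + local.second
    return int(seconds / 86.4)  # each .beat is 86.4 standard seconds

now = datetime.now(timezone.utc)
print(f"@{swatch_beats(now):03d}")  # e.g. "@435"
```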

Internet Time illustrates another problem with any decimalized time system: where and when do you start counting from? By international agreement, all time zones around the world are calculated based on Greenwich Mean Time, that is, the time on an arbitrary line of longitude running through Greenwich, England, that we’ve designated the Prime Meridian. But just as Internet Time counted from a different starting point, any decimal time measurement must declare some location as the “starting point,” and either calculate local time zones accordingly or bypass the whole notion of local time as Swatch did and let everyone fend for themselves.

Remembering Z Day

In decimal time schemes that also deal with weeks, months, and years, there’s an even trickier problem. What should count as day 0? In other words, let’s say decimal dates were represented as YYYY-MM-DD, with more Y’s added as needed. Which date is 0000-01-01? Some would say, sync it up with the Gregorian calendar—but that perpetuates its Christian bias. Others say, pick a date, any date (such as midnight on January 1, 2000 or July 13, 1903) and just deal with it; presumably, events that occurred before that date would have to be represented with a negative number.
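To make the “negative number” point concrete, here is a toy sketch that treats one of the dates mentioned above (January 1, 2000, chosen arbitrarily) as day 0; any earlier date simply comes out negative.

```python
from datetime import date

# Hypothetical epoch: January 1, 2000 as day 0
EPOCH = date(2000, 1, 1)

def day_number(d: date) -> int:
    """Days elapsed since the chosen epoch; negative for earlier dates."""
    return (d - EPOCH).days

print(day_number(date(2003, 9, 2)))   # 1340: after the epoch
print(day_number(date(1903, 7, 13)))  # -35236: before the epoch, so negative
```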

In the abstract, I find the notion of decimal time somewhat appealing, though it would appeal more if a year were, say, 100 or 500 or 1000 days long—that would make the math work out much more conveniently. (Some proposals, by the way, try to divide the year up into 100 “days” of about 88 standard hours each, which is better for calculations but considerably worse for human beings.) But if meters and liters are a tough sell in the United States, metric time is going to be tougher still. Maybe in a century or two…hmmm, century.

Note: This is an updated version of an article that originally appeared on Interesting Thing of the Day on September 2, 2003, and again in a slightly revised form on August 25, 2004.
