[time-nuts] Time syncing question

Tom Van Baak tvb at leapsecond.com
Mon Aug 28 03:42:23 EDT 2006


> I have a piece of equipment that has its time set either via a
> serial port or via HTTP. After initial setting, the device keeps
> its own time (although badly). The device has no NTP capability
> itself. Although I'm describing a particular piece of equipment,
> I have seen many other devices that work the same way.

This is so true. I have a growing number of serial and
LAN devices in my lab that keep their own idea of time
of day and it's a pain.


> I want to write a program that runs on a standard PC and
> communicates the time to the device in question. The
> communication between the PC and the target device won't
> be a problem. The PC would be syncing its time via NTP.
> The target device wouldn't need its clock synced better than
> 100-200 ms or so.

Kudos to you for even considering this. So many devices
seem to use the "set and forget" method of timekeeping.


> I'd like to know how to handle syncing the target device's time.
> I know that I want to avoid major time jumps and that once the
> target device is synced, I need to keep things in sync. I know
> that NTPD knows how to do these things, but I don't want to
> delve into the source code of that quite yet. At this point, I'm
> looking for more of a high-level pseudo-code description of how
> this is handled. Any pointers to documentation of this type
> would be appreciated.

The answer depends on whether you can (or want to)
write code for both the PC and the device, or only for
the PC side of the equation.

Let's start with this:

1) If the device has a fixed clock-tick-timer rate and
the PC has only the ability to get and set the device's
clock time, then what you can do is quite limited,
but simple. You set the clock once and then keep
setting it periodically. This sounds stupid, doesn't
it? But do the math before you all flame me.

How frequently you need to reset the device's clock
is based on the maximum clock rate error of the
device and the maximum time error you'll permit.

For example, suppose your device is xtal-based with
an accuracy spec of, say, 100 ppm (generous) and
you want to keep time to within 100 ms (conservative).
At 100 ppm your worst-case time drift is 100 us per
second, which is 6 ms per minute; at that rate it takes
almost 17 minutes to accumulate 100 ms, so you need to
reset the device's clock only 4 times an hour (every
15 minutes) to guarantee that you meet your goal.
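
Here's the same arithmetic as a few lines of Python, in
case you want to plug in your own numbers (the values are
just the ones from the example above):

    rate_error = 100e-6       # 100 ppm worst-case clock rate error
    error_budget = 0.100      # 100 ms allowed time error

    seconds_to_drift = error_budget / rate_error    # 1000 s
    print("reset at least every %.1f minutes"
          % (seconds_to_drift / 60))
    # -> 16.7 minutes, so 15 minute updates (4x per hour) are safe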

The PC program algorithm is:
    while (1) {
        get accurate pc time
        set device time
        sleep 15 min
    }

Now as an engineer you'll be tempted to write a much
more elaborate algorithm, but my point is, when you do
the math, you realize that even this brain-dead simple
algorithm might in fact be a perfectly good solution in
your specific case.
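
In real code that might look something like the Python
below. The two device functions are placeholders for
whatever serial or HTTP transaction your particular box
needs, and time.time() stands in for "accurate PC time"
on the assumption that the PC clock is already
NTP-disciplined:

    import time

    def get_device_time():
        return 0.0      # placeholder: read the device clock (serial or HTTP)

    def set_device_time(t):
        pass            # placeholder: write t to the device clock

    RESET_INTERVAL = 15 * 60    # seconds, from the 100 ppm / 100 ms math

    while True:
        set_device_time(time.time())    # PC time, assumed NTP-synced
        time.sleep(RESET_INTERVAL)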


1b) Same case as above, but instead of sleeping a
worst-case 15 minutes you sleep even longer between
updates. How much longer depends on how far off the
device's time was just before you set it.
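
A sketch of that idea, reusing the placeholder device
functions from the loop above: measure how far the device
drifted over the last interval, estimate its actual rate
error, and stretch the sleep accordingly, with a margin
and a cap so one lucky measurement doesn't talk you into
sleeping forever:

    import time

    ERROR_BUDGET = 0.100        # 100 ms
    MAX_INTERVAL = 4 * 3600     # never sleep more than 4 hours
    interval = 15 * 60          # start with the worst-case interval

    while True:
        # drift since last set (or the initial error, first time through)
        offset = get_device_time() - time.time()
        set_device_time(time.time())
        observed_rate = abs(offset) / interval      # seconds per second
        if observed_rate > 0:
            # device is better than worst case; sleep longer,
            # but keep a 2x margin against the observed rate
            interval = min(ERROR_BUDGET / observed_rate / 2, MAX_INTERVAL)
        time.sleep(interval)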


1c) Also the same as above, but if you need to set the
device time more gradually the *first* time, to avoid
"major time jumps", then you step it a little closer to
the correct time every N seconds. You pick what slew
rate you want.
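
Something like this, again with the same placeholder
device functions; the 50 ms step every 10 seconds is an
arbitrary choice and works out to a 5 ms-per-second slew
rate:

    import time

    SLEW_STEP = 0.050     # never move the device clock more than 50 ms at once
    STEP_PERIOD = 10      # seconds between nudges -> 5 ms/s slew rate

    while True:
        error = time.time() - get_device_time()    # +ve means device is slow
        if abs(error) <= SLEW_STEP:
            set_device_time(time.time())           # final small correction
            break                                  # then fall back to the 15 min loop
        step = SLEW_STEP if error > 0 else -SLEW_STEP
        set_device_time(get_device_time() + step)  # part way, not all the way
        time.sleep(STEP_PERIOD)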


2) If the device has a variable clock rate then you can do
something more clever (both time and drift removal). This
is a little more NTP-like. It does require code support
in both the device and the PC program. I'll write up this
case if you want.
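
Just to give the flavor: let the device free-run for a
while, see how far it drifted, and steer its rate to
cancel the drift. The set_device_rate_ppm() call below is
purely hypothetical; it stands in for whatever rate-trim
hook the device firmware would have to expose:

    import time

    MEASURE_INTERVAL = 3600     # let the device free-run for an hour
    trim_ppm = 0.0

    while True:
        set_device_time(time.time())
        time.sleep(MEASURE_INTERVAL)
        drift = get_device_time() - time.time()          # +ve: device runs fast
        trim_ppm -= (drift / MEASURE_INTERVAL) * 1e6     # cancel the observed rate
        set_device_rate_ppm(trim_ppm)                    # hypothetical trim hook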

/tvb

