[time-nuts] Thunderbolt GPS TimeKeeper

David J Taylor david-taylor at blueyonder.co.uk
Tue Jan 24 10:20:28 UTC 2012


> Ok, about half of the Windows servers have 0.5 millisecond offsets and
> some have 3 millisecond offsets.   I still call that "millisecond
> level".    That is very different from "microsecond level".  To me the
> terms "milli level", "nano level" and so on mean "round to the nearest
> three orders of magnitude."   It is a VERY coarse level of rounding,
> even more so than "rough order of magnitude" type rounding.       So I
> think "microsecond level" means not "within one uSec" but "you are
> using uSec as your units when you tell people the number".
>
> For most casual users even 100 ms is better than they need.   No one
> notices if you are 100 ms late for a meeting.
>
> Chris Albertson
> Redondo Beach, California

Thanks, Chris.

  http://www.satsignal.eu/mrtg/performance_ntp.php

The graphs may be confusing at first glance.  To work around MRTG's 
limitation of plotting only positive numbers, I needed to add a fixed 
amount to the reported offset so that it always becomes a positive value.  
I chose the offset and scaling values so that a server keeping perfect 
time would plot as a line right down the centre of the graph.  The offset 
reported by NTP is then the deviation from that centre-line, and the 
left-axis annotation states this offset.
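The shift-and-scale is just a linear mapping; a minimal sketch of the idea (the centre value and scale factor here are illustrative choices, not the ones actually used on the site):

```python
def mrtg_value(offset_us, centre=500, scale=1.0):
    """Map a signed NTP offset (microseconds) to a positive value
    that MRTG can plot, with zero offset on the centre-line."""
    return centre + scale * offset_us

# A server keeping perfect time plots on the centre-line:
assert mrtg_value(0) == 500
# Positive and negative deviations appear above and below it:
assert mrtg_value(+10) == 510
assert mrtg_value(-10) == 490
```

Reading a graph then means subtracting the centre value back out, which is what the left-axis annotation documents.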

FreeBSD PC Pixie shows deviations around 10 microseconds from nominal, 
mostly caused by the heating switching on.

Windows XP PC Feenix shows maximum deviations of around 200 microseconds.

Windows-7/32 PC Stamsund shows maximum deviations which are likely under 
50 microseconds.

The scale on the graphs for PCs Alta and Bacchus does not allow the 
deviations to be estimated accurately (as they were not originally 
intended as stratum-1 servers), but the deviation may well be under 0.3 
ms.

A couple of these PCs have speaking clocks (software) running, and 
multiple speakers speaking at differing times is rather distracting.  They 
are also monitoring a satellite data broadcast, where event times are 
compared across PCs, and across Europe, and having logs time stamped 
reasonably accurately can help.  Other folk are using PCs to measure the 
propagation delay of radio signals where, I understand, millisecond 
accuracy is the level talked about.

Colloquially, I might say: FreeBSD, tens of microseconds; Windows, 
hundreds of microseconds, or perhaps tenths of milliseconds.
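In the spirit of Chris's "round to the nearest three orders of magnitude", a toy sketch of that usage (the rounding rule and band names are my own interpretation, not anything NTP itself reports):

```python
import math

def offset_level(offset_seconds):
    """Name the colloquial SI band of a time offset by rounding its
    base-10 exponent to the nearest multiple of three."""
    bands = {0: "second", -3: "millisecond",
             -6: "microsecond", -9: "nanosecond"}
    exponent = math.log10(abs(offset_seconds))
    band_exp = 3 * round(exponent / 3)
    return bands.get(band_exp, "out of range")

# Both 0.5 ms and 3 ms land in the "millisecond" band:
assert offset_level(0.5e-3) == "millisecond"
assert offset_level(3e-3) == "millisecond"
# A 10 microsecond offset is "microsecond" level:
assert offset_level(10e-6) == "microsecond"
```

By this reading, "millisecond level" covers roughly a 30 us to 30 ms spread, which is why it is such a coarse description.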

Cheers,
David
-- 
SatSignal software - quality software written to your requirements
Web:  http://www.satsignal.eu
Email:  david-taylor at blueyonder.co.uk 
