[time-nuts] resolution measurements

Brian Kirby kirbybq at bellsouth.net
Sat Feb 12 20:07:35 EST 2005


One of you brought up the topic of resolution measurements.  I do not 
keep a lot of emails, trying to be neat and tidy...!

I'll pass on what I was taught in the Air Force.  Old School.  If you 
want to resolve 1 nanosecond in 1 second, the time base needs to be 
stable to 1x10^-9 over that one second.  If we go for that same nanosecond 
in 10 seconds, the requirement tightens to 1x10^-10; 100 seconds would be 
1x10^-11, 1000 seconds 1x10^-12, and so on.  These folks also taught us to 
derate the measurements by an order of magnitude.  If you measured to 1 
nanosecond, you reported to the 10 nanosecond level, etc.  The logic was 
to drop the last digit of the counter reading.  I worked for a satellite 
carrier at one time, and they specified that the circuit would have a BER 
of 1x10^-7 or better; since we were selling this to a customer, it had 
better read 1x10^-8, and to believe the test equipment, we did testing to 
1x10^-9 (back when a dedicated 56 kb circuit was fast and dial-up was 
300 bps!).
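
To put numbers on that rule of thumb, here is a minimal Python sketch (my 
own illustration, not anything from the Air Force course; the function 
names are just made up for the example): the required fractional stability 
is simply the resolution divided by the averaging time, and the derating 
rule reports one digit coarser than you measured.

def required_stability(resolution_s, tau_s):
    """Fractional stability needed to resolve resolution_s in tau_s seconds."""
    return resolution_s / tau_s

def derated_report(resolution_s):
    """Report one digit coarser than measured (drop the last counter digit)."""
    return resolution_s * 10.0

if __name__ == "__main__":
    for tau in (1, 10, 100, 1000):
        print(f"1 ns in {tau:>4} s -> stability {required_stability(1e-9, tau):.0e}, "
              f"report to the {derated_report(1e-9):.0e} s level")

Running it prints the same progression as above: 1e-09, 1e-10, 1e-11, 
1e-12, each reported at the 1e-08 second (10 ns) level.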

Nowadays they do an uncertainty analysis, which establishes the resolution 
and the noise floor.  New School.  There is a paper out there from Fluke 
that compares a GPSDO to a Sulzer oscillator and goes into the 
uncertainty analysis.  I believe this is the way to go, as you baseline 
your equipment.  There is a flaw in the measurement described in that 
paper, which is exactly what the author is pointing out: he shows it was 
the wrong application. 

I started out by running variance tests on my rubidiums, GPSDOs, and 
GPS receivers.  I ran about every combination as far as comparisons go, 
and ran them with the time bases referenced to the GPSDOs and the 
rubidiums.  I ran 2, 6, 20, 60, 200, 600, 2000, and 6000 second variances.  
I made sure that the minimum data sets were 150 samples, and at the 
shorter averaging times I ran thousands of samples.  The goal was 92 
percent confidence in the data.  This testing takes a long time, but it 
will reveal details you need.  You may also consider doing a zero-baseline 
test on your time interval counters at the same variances you ran before - 
just use equal-length cables and tee off of the same source.  It will 
reveal the system noise and the limits of your resolution.
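
For anyone who wants to reproduce that kind of run, here is a hedged 
Python sketch of an overlapping Allan deviation computed from 
time-interval (phase) readings.  The tau list and the 150-sample minimum 
come from the testing described above; the function itself is just the 
standard textbook formula (the name overlapping_adev is mine), not a 
reconstruction of my exact setup.

import math

def overlapping_adev(x, tau0, m):
    """Overlapping Allan deviation at tau = m * tau0 from phase data x (seconds)."""
    n = len(x)
    if n < 2 * m + 1:
        raise ValueError("not enough samples for this averaging factor")
    tau = m * tau0
    # Sum of second differences of the phase record, overlapped by one sample.
    s = sum((x[i + 2 * m] - 2 * x[i + m] + x[i]) ** 2 for i in range(n - 2 * m))
    return math.sqrt(s / (2.0 * tau ** 2 * (n - 2 * m)))

# Example: counter readings logged every 2 seconds, evaluated at the taus above.
# taus = [2, 6, 20, 60, 200, 600, 2000, 6000]
# for tau in taus:
#     print(tau, overlapping_adev(x, 2.0, tau // 2))

The same function run on the zero-baseline (teed) data gives you the 
counter's own noise floor to compare against.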

Basically, for long-term testing I usually use 600 second variances - that 
makes direct common-view comparisons against the NIST GPS data archive 
easy.  But it also depends on what you're measuring and with what.
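
The common-view idea itself is simple to sketch (this is not the NIST 
file format, just the principle, and the function name is made up): if two 
sites each log (local clock - GPS) at the same epochs, differencing the 
two logs cancels the GPS contribution and leaves clock A minus clock B.

def common_view_difference(site_a, site_b):
    """site_a, site_b: dicts mapping epoch -> (local clock - GPS) in seconds."""
    shared = sorted(set(site_a) & set(site_b))
    return [(t, site_a[t] - site_b[t]) for t in shared]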



