[time-nuts] Time Interval Algebra?
bill at iaxs.net
Mon Dec 20 00:01:10 EST 2004
Why is it important to get every measurement? If I turn the counter on
Monday afternoon, get a reading of 0.500000 seconds, turn it off and
next Monday afternoon get a reading of 0.500027, then I have an offset,
or error, of 27 +/- 1 microseconds over 7 days. Each reading is accurate
to 1 microsecond, which requires that the counter clock drift less than
0.1 microsecond per second.
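For what it's worth, the arithmetic in the example above works out like this (a quick Python sketch; the readings and the 7-day interval are the example numbers, not measured data):

```python
# Fractional frequency offset from two counter readings a week apart.
t1 = 0.500000              # seconds, first Monday's reading
t2 = 0.500027              # seconds, next Monday's reading
interval = 7 * 24 * 3600   # seconds in 7 days

offset = t2 - t1                 # ~27 microseconds accumulated
frac_freq = offset / interval    # dimensionless fractional frequency

print(f"accumulated offset: {offset * 1e6:.0f} us")
print(f"fractional frequency offset: {frac_freq:.2e}")
```

That comes out to roughly 4.5e-11 fractional frequency, which is why two widely spaced readings say something useful about the mean frequency even though they say nothing about short-term stability.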
I suppose the problem is that this doesn't allow an Allan Variance
calculation based on one measurement per second. Why must it be one per
second, rather than one every 10 or 100 seconds? Or why not 100 per second?
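As far as I understand it, nothing in the Allan variance itself demands a 1-second sample rate; the sample spacing just sets the smallest tau you can evaluate. A minimal sketch (non-overlapping Allan deviation from phase samples; the function name and data are made up for illustration):

```python
import math

def adev(phase, tau0, m):
    """Non-overlapping Allan deviation at tau = m * tau0, from phase
    samples x[i] taken tau0 seconds apart. Larger taus are evaluated by
    decimating the phase record, so data taken once every 10 seconds
    simply gives a curve that starts at tau = 10 s."""
    x = phase[::m]                 # keep every m-th sample: spacing m * tau0
    tau = m * tau0
    # Second differences of phase; a pure frequency offset cancels out.
    diffs = [x[i + 2] - 2 * x[i + 1] + x[i] for i in range(len(x) - 2)]
    return math.sqrt(sum(d * d for d in diffs) / (2 * len(diffs) * tau ** 2))

# A phase ramp from a constant 1e-9 fractional frequency offset gives
# (essentially) zero Allan deviation at any tau, as expected.
phase = [i * 1e-9 for i in range(1000)]
print(adev(phase, 1.0, 1), adev(phase, 1.0, 10))
```

The trade-off with sparser sampling is only that you lose the short-tau points and have fewer samples per tau, not that the statistic stops working.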
A larger problem may be the variance (jitter) in the counter chain that
divides the Rb output down to 1 pulse per second. Are the counters all
clocked by the Rb clock, or is it a simple cascade of decade counters?