[time-nuts] What is "accuracy"? (newbie timenut, hi folks!)

BJ catgirl at bordernet.com.au
Thu May 5 21:34:04 EDT 2016


Hi Time Nuts,

 

I'm fairly new to the fascinating world of time and frequency, so I
apologise profusely in advance for my blatant ignorance.

 

When I ask "what is accuracy" (in relation to oscillators), I am not asking
for the textbook definition - I have already done extensive reading on
accuracy, stability and precision and I think I understand the basics fairly
well - although, after you read the rest of this, you may well (rightly)
think I am deluding myself. It doesn't help matters when some textbooks,
papers and web articles use the words precision, accuracy and uncertainty
interchangeably. (Incidentally, examples of my light reading include the
'Vig tutorial' on oscillators, HP's Science of Timekeeping Application Note,
various NIST documents including the tutorial introduction on frequency
standards and clocks, Michael Lombardi's chapter on Time and Frequency in
the Mechatronics Handbook and many other documents including PTTI and other
conference proceedings). Anyway, you can safely assume I understand the
difference between accuracy and precision in the confused musings that
follow below.

 

What I am trying to understand is: what does it REALLY mean when the
manufacturer's specs for a frequency standard or 'clock' claim a certain
accuracy? For argument's sake, let us assume that the accuracy is given as
100 ppm, i.e. 1e-4 ....

As per the textbook approach, I know I can therefore expect my 'clock' to
have an error of up to 86400 x 1e-4 = 8.64 s per day.
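
(For what it's worth, here is the back-of-the-envelope sketch I used to get
that figure - a few lines of Python, with variable names that are purely my
own:)

    # Constant fractional frequency error of 1e-4 (100 ppm), assumed to hold
    # for a whole day - the worst case I read into the spec.
    accuracy = 1e-4
    seconds_per_day = 86400

    # With a constant frequency error, the time error accumulates linearly.
    max_time_error = accuracy * seconds_per_day
    print(max_time_error)   # 8.64 seconds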

 

But does that mean that, after one day, I can be certain my clock will be
fast or slow by no more than 8.64 seconds, or could the error potentially be
greater than that? In other words, is the accuracy a hard limit or a
statistical quantity - one where there is a high probability my clock will
behave this way, but still a very small chance (say, beyond 3 sigma) that
the error is larger and the clock ends up fast or slow by, say, 10 seconds?
Is accuracy something inherent, due to the nature of the type of oscillator
(e.g. a characteristic of the crystal or atom), or does it vary, so that it
needs to be measured - and if so, how is that measurement made to produce
the accuracy figure? Are environmental conditions taken into account when
making these measurements (I am assuming so)? In other words, how is the
accuracy of a clock determined?

 

Note that I am conscious of the fact that I am being somewhat ambiguous with
the definitions myself. It is my understanding that the accuracy (as given
in an oscillator's specs) relates to frequency - i.e. how close the
(measured?) frequency of the oscillator is to its nominal frequency - rather
than to time, i.e. how well the clock keeps time in comparison with an
official UTC source... but I am assuming it is fair to say they are two
sides of the same coin.
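
(My tentative picture of the 'two sides of the same coin' idea is that the
time error is simply the accumulated - integrated - fractional frequency
error. A tiny sketch of that, with completely made-up numbers:)

    import numpy as np

    # Fractional frequency error over one day, sampled once per second.
    # I simply invent a constant offset plus a slow linear drift here.
    t = np.arange(86400)              # seconds
    y = 1e-7 + 1e-12 * t              # fractional frequency error y(t)

    # The clock's time error is the running integral of y(t).
    x = np.cumsum(y) * 1.0            # seconds of error vs. an ideal clock
    print(f"clock error after one day: {x[-1]:.3f} s")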

 

Does accuracy also take stability into account (since, clearly, if an
oscillator experiences drift, that will affect the accuracy - or does it?)
or do these two 'performance indicators' need to be considered
independently? 

 

I am guessing that the accuracy value is provided as a general indicator of
oscillator performance (i.e. that accuracy really does just mean one can
expect an error of up to (or close to?) a certain amount) and that stability
(as indicated by the ADEV) is probably the more significant/relevant figure.
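
(To check my own understanding of what ADEV measures, I put together the
rough sketch below - a plain, non-overlapping Allan deviation computed from
made-up frequency data. If I have it right, the constant 1e-7 offset cancels
out in the adjacent differences, which would be exactly the point that
accuracy and stability are separate things:)

    import numpy as np

    def adev(y, m):
        """Non-overlapping Allan deviation from fractional frequency
        samples y, at an averaging factor of m samples."""
        n_blocks = len(y) // m
        # average the frequency samples in blocks of length m
        ybar = y[:n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        # Allan variance: half the mean squared difference of adjacent averages
        return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

    # toy clock: fixed 1e-7 frequency offset (an accuracy error)
    # plus white frequency noise (a stability limitation)
    rng = np.random.default_rng(0)
    tau0 = 1.0                        # one frequency sample per second
    y = 1e-7 + 1e-9 * rng.standard_normal(86400)

    for m in (1, 10, 100, 1000):
        print(f"tau = {m * tau0:6.0f} s   ADEV ~ {adev(y, m):.2e}")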

It is also entirely possible I am asking all the wrong questions. As you can
see, confusion reigns. I am hoping things will become clearer to me as I
start playing around with hardware (fingers and toes crossed on that one).

 

In the meantime, if anyone could provide some clarity on this topic or set
my crooked thinking straight, my gratitude would be bountiful.

 

Thanks.

 

Belinda

 


