[time-nuts] noise of a 5370

Said Jackson saidjack at aol.com
Sun Mar 13 02:51:01 UTC 2011


Yes, the TI will get more accurate with time. But the setup suggested in the previous email doesn't measure TI.

It measures the standard deviation of that TI. Big difference, two completely different numbers. The SD number is essentially a measure of the absolute amplitude of the noise. Since it is an absolute measurement, it doesn't converge to 0 ps.

That number will quickly converge to about 20 ps to 50 ps. It will not get smaller with larger sample sets.
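
As a rough numeric sketch (not actual 5370 data; the 30 ps RMS jitter and the 1234 ps "true" interval below are assumed values for illustration), model each TI reading as a fixed interval plus Gaussian jitter. The mean of the readings homes in on the true interval roughly as 1/sqrt(N), while the sample standard deviation settles near the jitter amplitude no matter how many readings are taken:

  # Sketch: mean of noisy TI readings converges, standard deviation does not.
  # TRUE_TI_PS and JITTER_RMS_PS are assumed illustration values.
  import random
  import statistics

  TRUE_TI_PS = 1234.0    # assumed "true" time interval, in picoseconds
  JITTER_RMS_PS = 30.0   # assumed combined instrument + signal jitter (RMS)

  random.seed(1)
  for n in (100, 10_000, 1_000_000):
      samples = [random.gauss(TRUE_TI_PS, JITTER_RMS_PS) for _ in range(n)]
      mean = statistics.fmean(samples)
      sd = statistics.stdev(samples)
      print(f"N={n:>9}: mean error {mean - TRUE_TI_PS:+7.3f} ps, SD {sd:6.2f} ps")

The mean error keeps shrinking as N grows, but the SD stays pinned near 30 ps, which is exactly the behavior of the displayed standard deviation described above.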

Please check the 5370 manual for what standard deviation means; it will make sense then.

Bye,
Said

Sent from my iPad

On Mar 12, 2011, at 9:27, mikes at flatsurface.com (Mike S) wrote:

> At 12:00 PM 3/12/2011, Said Jackson wrote...
>> The average will approach 0.0 as the number of samples is increased, but not the standard deviation. The value displayed by their unit is standard deviation.
> 
> If you're measuring jitter of an external signal, accuracy is obviously much worse than the internal jitter. But if you're measuring TI, doesn't the jitter of both the instrument and the signal get reduced with more measurements, leaving a more accurate TI?
> 
> Since it's designed and sold as a TI analyzer, that seems right. It seems the jitter test is mostly there to make sure the unit is operating properly.
> 
> If I recall correctly, internal jitter is affected by tweaking the 200 MHz multiplier. Lacking a proper spectrum analyzer, that's the only calibration I have been unable to do. 
> 


