[time-nuts] Question about frequency counter testing

Oleg Skydan olegskydan at gmail.com
Wed Jun 6 08:53:26 EDT 2018


Hi, Magnus!

Sorry for the late answer; I injured my left eye last Monday, so my 
ability to use a computer has been very limited.

From: "Magnus Danielson" <magnus at rubidium.dyndns.org>
> As long as the sums C and D become correct, your
> path to it can be whatever.

Yes. It produces the same sums.
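
To make the equivalence concrete, here is a minimal sketch in C (names 
and conventions are illustrative, assuming C = sum(x_i) and 
D = sum(i * x_i) over a block of phase samples; the exact definitions 
are in the attached note). Accumulating directly and merging 
independently built sub-blocks produce the same totals:

    #include <stdint.h>

    typedef struct {
        double   c;  /* C = sum(x_i)                  */
        double   d;  /* D = sum(i * x_i)              */
        uint32_t n;  /* number of samples accumulated */
    } sums_t;

    /* Direct accumulation: the index is the running count. */
    static void accumulate(sums_t *s, double x)
    {
        s->d += (double)s->n * x;
        s->c += x;
        s->n++;
    }

    /* Merge sub-block 'b' (built independently with local
       indices 0..b->n-1) onto 'a'. The a->n * b->c term
       re-bases b's local indices to their global positions. */
    static void merge(sums_t *a, const sums_t *b)
    {
        a->d += b->d + (double)a->n * b->c;
        a->c += b->c;
        a->n += b->n;
    }

Splitting a block anywhere and merging the pieces in order reproduces 
the directly accumulated C, D and N exactly, which is why the path to 
the sums does not matter.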

> Yes please do, then I can double check it.

I have written a note and attached it. The modifications to the 
original method described in it were successfully tested on my 
experimental HW.

> Yeah, now you can move your hardware focus on considering interpolation
> techniques beyond the processing power of least-square estimation, which
> integrate noise way down.

If you are talking about adding traditional HW interpolation of the 
trigger events, I have no plans to do that. It is not possible while 
keeping the 2.5 ns base counter resolution (there is no way to bring a 
400 MHz clock signal out of the chip), and I do not want to add extra 
complexity to the HW of this project.

But the HW I use can sample up to 10 timestamps simultaneously, so I 
can theoretically push the one-shot resolution down to 250 ps using 
several delay lines. I do not think going all the way down to 250 ps 
makes much sense (and I have other plans for that additional HW), but 
a 2x or 4x one-shot resolution improvement (down to 1.25 ns or 625 ps) 
is relatively simple to implement in HW and should be worth trying.
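
Roughly, the delay-line idea in code (a hypothetical sketch; 
BASE_RES_NS, NUM_TAPS and combine_taps are assumptions, not the actual 
HW design). Each tap timestamps the same event through a known extra 
delay, and averaging the delay-corrected coarse timestamps tightens 
the effective quantization step:

    #include <stdint.h>

    #define BASE_RES_NS 2.5 /* base counter step (400 MHz equiv.) */
    #define NUM_TAPS    4   /* delay-line taps; 4 gives ~625 ps   */

    /* Tap k sees the event delayed by k * BASE_RES_NS / NUM_TAPS.
       Removing the known delays and averaging the coarse
       timestamps reduces the step to BASE_RES_NS / NUM_TAPS. */
    static double combine_taps(const uint32_t ts[NUM_TAPS])
    {
        double sum = 0.0;
        for (int k = 0; k < NUM_TAPS; k++) {
            double delay_ns = (double)k * BASE_RES_NS / NUM_TAPS;
            sum += (double)ts[k] * BASE_RES_NS - delay_ns;
        }
        return sum / NUM_TAPS; /* event time estimate in ns */
    }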

>> I will probably throw out the power hungry and expensive SDRAM chip or
>> use much smaller one :).
>
> Yeah, it would only be if you build multi-tau PDEV plots that you would
> need much memory, other than that it is just buffer memory to buffer
> before it goes to off-board processing, at which time you would need to
> convey the C, D, N and tau0 values.

Yes, I want to produce multi-tau PDEV plots :).

They can be computed with a small memory footprint, but then they will 
be non-overlapped PDEVs, so the confidence level at large taus will be 
poor (for practical measurement durations). I have working code that 
implements such an algorithm; it uses only 272 bytes of memory for 
each decade (1-2-5 values).
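
For reference, a hypothetical sketch of what one such fixed-footprint 
slot could look like (field layout and names are illustrative, not the 
actual 272-byte implementation). Each tau keeps only the running C/D 
sums of the current block plus the statistics of completed blocks; the 
slope formula assumes equally spaced, zero-indexed samples, and PDEV^2 
accumulates half the squared difference of successive block 
frequencies:

    #include <stdint.h>

    typedef struct {
        uint32_t n_per_block; /* samples per tau (>= 2)          */
        uint32_t n;           /* samples in the current block    */
        double   c, d;        /* running least-squares sums      */
        double   prev_freq;   /* LS frequency of previous block  */
        double   sum_sq;      /* sum of 0.5 * (f_k+1 - f_k)^2    */
        uint32_t blocks;      /* completed blocks so far         */
    } pdev_slot_t;

    typedef struct {
        pdev_slot_t slot[3];  /* tau = 1, 2, 5 x decade base */
    } pdev_decade_t;

    static void pdev_feed(pdev_slot_t *s, double x, double tau0)
    {
        s->d += (double)s->n * x;  /* D = sum(i * x_i), local i */
        s->c += x;                 /* C = sum(x_i)              */
        if (++s->n < s->n_per_block)
            return;
        /* Least-squares frequency (slope) of the finished block. */
        double n = (double)s->n_per_block;
        double f = (s->d - 0.5 * (n - 1.0) * s->c)
                 / (tau0 * n * (n * n - 1.0) / 12.0);
        if (s->blocks > 0) {
            double df = f - s->prev_freq;
            s->sum_sq += 0.5 * df * df;
        }
        s->prev_freq = f;
        s->blocks++;
        s->c = s->d = 0.0;
        s->n = 0;
    }

PDEV(tau) would then be sqrt(sum_sq / (blocks - 1)) once two or more 
blocks have completed.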

I need to think about how to do the overlapping PDEV calculation with 
minimal memory/processing power requirements (I am aware that the 
decimation routines should not use overlapped calculations).
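
One way to cheapen the overlap (a hypothetical sketch, not working 
code from the project): overlap at sub-block rather than per-sample 
granularity, keeping a small ring of recent sub-block (C, D, N) sums 
and emitting one merged tau-length estimate per arriving sub-block. 
The memory cost is OVERLAP * sizeof(sums_t) per tau:

    #include <stdint.h>

    typedef struct { double c, d; uint32_t n; } sums_t; /* as above */

    #define OVERLAP 8 /* sub-blocks per tau; higher = denser */

    typedef struct {
        sums_t   ring[OVERLAP];
        unsigned head;    /* next write slot (= oldest entry) */
        unsigned filled;  /* entries currently valid          */
    } overlap_t;

    /* Push one finished sub-block; once the ring is full, emit
       one merged tau-length estimate per push. Returns 1 and
       writes *out when an estimate is available. */
    static int overlap_push(overlap_t *o, const sums_t *sb,
                            sums_t *out)
    {
        o->ring[o->head] = *sb;
        o->head = (o->head + 1) % OVERLAP;
        if (o->filled < OVERLAP && ++o->filled < OVERLAP)
            return 0;
        sums_t acc = { 0.0, 0.0, 0 };
        for (unsigned i = 0; i < OVERLAP; i++) {
            /* walk oldest to newest, correcting index offsets */
            const sums_t *b = &o->ring[(o->head + i) % OVERLAP];
            acc.d += b->d + (double)acc.n * b->c;
            acc.c += b->c;
            acc.n += b->n;
        }
        *out = acc;
        return 1;
    }

Denser overlap than that would need the raw samples again, which is 
exactly what the small-footprint approach tries to avoid.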

BTW, is there an "optimal" amount of overlap, or should I just use as 
much data as I can process?

> Please report on that progress! Sounds fun!

I will drop a note when I move on to the next step. Things are going a 
bit slower now.

Thanks!
Oleg 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Efficient C and D sums calculation for least square estimation of phase, frequency and PDEV.pdf
Type: application/pdf
Size: 17890 bytes
Desc: not available
URL: <http://www.febo.com/pipermail/time-nuts/attachments/20180606/482a9543/attachment.pdf>

