[time-nuts] Down-conversion to IF and sampling

Stephan Sandenbergh ssandenbergh at gmail.com
Sat Dec 23 09:46:30 EST 2017


Hi All,

Consider the following very common scenario: a perfect RF signal is
heterodyne down-converted to baseband using an offset oscillator. Let's
assume this oscillator has a time error x(t) = x0 + y0*t. This produces a
baseband signal with both a time and a frequency offset. That baseband
signal is then coherently sampled by an ADC clocked from the same offset
oscillator.
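
To make the offset concrete (using the usual convention that the offset
oscillator's reading of time is t + x(t)): an LO synthesised from it at
nominal frequency f_LO actually runs at f_LO*(1 + y0), so mixing the perfect
carrier at f_RF down gives a baseband tone at roughly

  f_IF = f_RF - f_LO*(1 + y0) = (f_RF - f_LO) - f_LO*y0,

plus a phase offset from the x0 term. At least, that's my understanding of
the convention.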

What would the effect of this coherent ADC sampling be?

See the attached diagram. There I assumed the ADC timebase is itself a
time-dependent function of the oscillator offset. However, it feels like I
may be making a logic error somewhere: I can't remember ever seeing anyone
account for ADC timebase errors in coherent heterodyne down-converter
stages. I have limited experience, though.
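
In case it helps to be concrete, here is a small numerical sketch (Python,
with made-up example values; f_rf, f_lo, fs and y0 are just placeholders,
not the numbers in my diagram) of the two stages as I understand them: the
perfect RF tone is mixed down with the LO running fast by y0, and the result
is sampled at instants set by that same offset clock but interpreted as if
they were exactly 1/fs apart.

import numpy as np

# Made-up example values (placeholders, not the numbers from my diagram):
f_rf = 10.0e6        # perfect RF tone (Hz)
f_lo = 9.99e6        # nominal LO frequency (Hz), so nominal IF = 10 kHz
x0, y0 = 0.0, 1e-6   # oscillator time offset (s) and fractional frequency offset
fs = 100e3           # nominal ADC sample rate (Hz)
n = np.arange(100_000)

# The oscillator reads t + x(t) = t + x0 + y0*t, so its LO actually runs at
# f_lo*(1 + y0), and the ADC takes sample n when the local clock reads n/fs,
# i.e. at true time t_n = (n/fs - x0)/(1 + y0).
t_n = (n / fs - x0) / (1.0 + y0)

# Complex baseband signal after mixing the perfect RF tone with the offset
# LO, evaluated at the error-distorted sample instants.
phase = 2.0 * np.pi * (f_rf - f_lo * (1.0 + y0)) * t_n
samples = np.exp(1j * phase)

# Apparent IF frequency if the samples are (wrongly) assumed to be 1/fs apart.
apparent_if = np.mean(np.diff(np.unwrap(np.angle(samples)))) * fs / (2.0 * np.pi)
print("nominal IF :", f_rf - f_lo, "Hz")
print("apparent IF:", apparent_if, "Hz")
print("difference :", apparent_if - (f_rf - f_lo), "Hz (compare -f_rf*y0 =",
      -f_rf * y0, "Hz)")

If I've set that up correctly, the apparent IF comes out near
f_rf/(1 + y0) - f_lo, i.e. the error looks like the full RF frequency times
y0 rather than just the LO's share of it, which is exactly the point I'm
unsure about.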

Regards,

Stephan.
Attachment: img1.png (image/png, 19112 bytes)
URL: <http://www.febo.com/pipermail/time-nuts/attachments/20171223/7eff506c/attachment.png>

