On 12/02/2014 10:36 AM, Marcus Müller wrote:
Ok, three things:
1. There are daughterboards with temperature sensors; search for them (I think it's the TVRX2).
2. There are auxiliary ADCs. If you use a daughterboard that exposes these pins, you can use them with a PTC or something equivalent. It's probably less painful to just integrate a USB-based temperature sensor into an application, to be honest...
3. If your daughterboard exposes GPIO lines (there are quite a few that do), you can bit-bang them to talk to I2C temperature sensors.
I had forgotten that the TVRX2 had an on-board temperature sensor. It's the only one that does.
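As a rough sketch of what bit-banging I2C over two GPIO lines might look like: the `GpioPins` class below is a hypothetical stand-in that just records line transitions; on a real USRP you would map `set_scl`/`set_sda` onto UHD's daughterboard GPIO calls, and the 0x48 sensor address is an illustrative assumption, not a specific part.

```python
# Sketch of bit-banged I2C over two GPIO lines (SCL, SDA).
# GpioPins is a hypothetical stand-in for real GPIO access; it records
# every transition so the generated waveform can be inspected.

class GpioPins:
    def __init__(self):
        self.trace = []  # sequence of ("SCL"/"SDA", level) transitions
    def set_scl(self, level):
        self.trace.append(("SCL", level))
    def set_sda(self, level):
        self.trace.append(("SDA", level))

def i2c_start(pins):
    # START condition: SDA falls while SCL is high.
    pins.set_sda(1)
    pins.set_scl(1)
    pins.set_sda(0)
    pins.set_scl(0)

def i2c_write_byte(pins, byte):
    # Clock out 8 bits, MSB first; SDA may only change while SCL is low.
    for i in range(7, -1, -1):
        pins.set_sda((byte >> i) & 1)
        pins.set_scl(1)   # slave samples SDA on the rising edge
        pins.set_scl(0)
    # 9th clock: release SDA so the slave can pull it low to ACK
    # (reading the ACK back is omitted in this sketch).
    pins.set_sda(1)
    pins.set_scl(1)
    pins.set_scl(0)

pins = GpioPins()
i2c_start(pins)
# Address a hypothetical temperature sensor at 0x48 (R/W bit = 0 for write).
i2c_write_byte(pins, (0x48 << 1) | 0)
```

Timing (holding each level for the sensor's minimum clock period) and clock-stretching are left out; a real implementation over UHD GPIO would also need to handle the slow round-trip per pin toggle.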
On 12/02/2014 04:06 PM,
Is there a temperature sensor on-board the N200 unit? If not, does it
support installing any such sensor?
On Fri, Nov 28, 2014 at 5:44 PM, Marcus D. Leech <address@hidden> wrote:
On 11/28/2014 03:41 PM, khalid.el-darymli wrote:
Back to my original question, what should I do to correct for this?
Thanks in advance.
Thanks very much for the very extensive data. My main concern, as one of
the Ettus support team, was that there was something wrong with
the hardware, but the magnitude of both the apparent phase and magnitude
drift is entirely consistent with analog-hardware temperature
effects, unrelated to clock stability, etc.
Coax cables, for example, will change their loss characteristics and
*effective length* with temperature, so with precise hardware like USRPs,
it's easy to see these effects.
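To get a feel for the magnitude, here is a back-of-the-envelope calculation; the numbers (10 m of 0.66-velocity-factor coax, a 50 ppm change in effective length) are illustrative assumptions, not measurements from this setup:

```python
import math

# Illustrative assumptions, not measurements:
f = 100e6                 # operating frequency, Hz
v = 0.66 * 3e8            # propagation velocity for ~0.66 velocity-factor coax, m/s
length = 10.0             # cable run, m
delta_l = length * 50e-6  # 50 ppm change in effective electrical length, m

# Phase shift caused by the extra electrical length, in degrees:
delta_phase_deg = 360.0 * f * delta_l / v
print(f"phase drift: {delta_phase_deg:.3f} degrees")  # ~0.09 degrees
```

A tenth of a degree sounds tiny, but for a coherent system comparing chirps hours apart it is easily visible, which is consistent with the drift described below.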
FMCW radar isn't my area of expertise, so hopefully others can comment on
RX-processing strategies to deal with this, as it *must* also be a problem
with non-SDR FMCW radar implementations.
On Fri, Nov 28, 2014 at 12:08 PM, <address@hidden> wrote:
What is the magnitude of the frequency drift?
What is the magnitude of the gain drift?
What are you measuring backscatter *from*?
On 2014-11-28 10:14, khalid.el-darymli via USRP-users wrote:
Given a set of synced *(i.e., using external GPS REF/PPS)*,
time-commanded and calibrated *(i.e., by compensating for the phase/mag
offset between the digital Tx chirp prior to transmission and the ADC'ed
Rx signal)* N200 devices with LFTX/LFRX daughterboards, operating with
coherent LFMCW chirps, I am still seeing a tiny drift (in both
magnitude and frequency) of the calibrated back-scatter Rx chirp received
at time t1 when compared to an Rx chirp received at an earlier time t0.
The longer the N200 devices run (e.g., 5 hours), the greater the drift.
Presumably, this drift pertains to both the DACs/ADCs and the GPS-referenced
clocks of the N200 devices.
My questions are:
1- Why do I still see such drift although my devices are synced to an
external GPS, and how do I correct for it?
2- Can the *PLL Carrier Tracking* block in GRC be used to track and
correct for such a drift? If so, how do I set the max/min freq inputs for
this block?
3- Can the *AGC2* or *AGC3* block be useful in this regard? If so, are
there any examples to explain how the input parameters of these blocks can
be set up?
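On question 2: the PLL Carrier Tracking block's max/min freq inputs are in radians per sample, not Hz, so a drift bound in Hz has to be scaled by the sample rate. A small conversion sketch, where the ±100 Hz bound and the 1 Msps rate are illustrative assumptions:

```python
import math

samp_rate = 1e6        # complex sample rate of the stream feeding the PLL, Hz
max_drift_hz = 100.0   # illustrative assumption: worst-case expected drift, Hz

# GRC's PLL Carrier Tracking block (analog.pll_carriertracking_cc) expects
# its frequency limits in radians per sample:
max_freq = 2 * math.pi * max_drift_hz / samp_rate
min_freq = -max_freq
print(max_freq)  # ~6.28e-4 rad/sample

# In a flowgraph this would be wired up roughly as:
#   from gnuradio import analog
#   pll = analog.pll_carriertracking_cc(loop_bw, max_freq, min_freq)
```

The drift described above is slow (hours), so a narrow loop bandwidth and tight frequency limits like these would be the natural starting point; whether a PLL is the right correction for an LFMCW radar chain is a separate question the radar folks can better answer.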
USRP-users mailing list
address@hidden
http://lists.ettus.com/mailman/listinfo/usrp-users_lists.ettus.com
Shirleys Bay Radio Astronomy Consortium
http://www.sbrac.org
Discuss-gnuradio mailing list