Subject: Re: [Discuss-gnuradio] [USRP-users] How to correct for the drift in an (FMCW) Rx signal?
Date: Tue, 02 Dec 2014 16:36:54 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:31.0) Gecko/20100101 Thunderbird/31.2.0
OK, three things:
1. There are daughterboards with built-in temperature sensors; search for "temperature" in the UHD daughterboard sources (I think it's the TVRX2).
2. There are auxiliary ADCs. If you use a daughterboard that exposes those pins, you can wire up a PTC thermistor or something equivalent.
3. If your daughterboard exposes GPIO lines (quite a few do), you can bit-bang them to talk to I2C temperature sensors.
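As an illustration of option 2, here is a minimal sketch of the arithmetic for turning an aux-ADC voltage reading into a temperature estimate. All component values (divider supply, fixed resistor, PTC parameters) are illustrative assumptions, not measured from any real daughterboard:

```python
# Hypothetical setup: a PTC thermistor in a voltage divider feeding the
# daughterboard's auxiliary ADC input. Every component value below is an
# assumption for illustration only.

V_SUPPLY = 3.3      # divider supply voltage (V), assumption
R_FIXED  = 2200.0   # fixed divider resistor (ohms), assumption

# Linearized PTC model: R(T) = R0 * (1 + ALPHA * (T - T0))
R0    = 2000.0      # PTC resistance at T0 (ohms), assumption
T0    = 25.0        # reference temperature (deg C)
ALPHA = 0.0079      # temperature coefficient (1/K), KTY81-like, assumption

def adc_voltage_to_temperature(v_adc):
    """Convert the aux-ADC voltage across the PTC to a temperature estimate."""
    # Divider equation: v_adc = V_SUPPLY * R_ptc / (R_FIXED + R_ptc)
    r_ptc = R_FIXED * v_adc / (V_SUPPLY - v_adc)
    # Invert the linear PTC model to recover T
    return T0 + (r_ptc / R0 - 1.0) / ALPHA

# At T0 the divider gives V_SUPPLY * R0 / (R_FIXED + R0)
print(round(adc_voltage_to_temperature(3.3 * 2000.0 / 4200.0), 3))  # → 25.0
```

Once you have a temperature estimate, you can log it alongside your Rx records and correlate it with the observed phase/magnitude drift.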
On 12/02/2014 04:06 PM, khalid.el-darymli wrote:
Hi Marcus,

Is there a temperature sensor on board the N200 unit? If not, does it support installing any such sensor?

Thanks. Best regards, Khalid

On Fri, Nov 28, 2014 at 5:44 PM, Marcus D. Leech <address@hidden> wrote:

On 11/28/2014 03:41 PM, khalid.el-darymli wrote:

Back to my original question, what should I do to correct for this? Thanks in advance. Best, Khalid

Khalid:

Thanks very much for the very extensive data. My main concern, as one of the Ettus support team, was that there was something wrong with the hardware, but the magnitude of both the apparent phase and magnitude drift is entirely consistent with analog-hardware temperature effects, unrelated to clock stability, etc. Coax cables, for example, will change their loss characteristics and *effective length* with temperature, so with precise hardware like USRPs, it's easy to see these effects. FMCW radar isn't my area of expertise, so hopefully others can comment on RX-processing strategies to deal with this, as it *must* also be a problem with non-SDR FMCW radar implementations.

On Fri, Nov 28, 2014 at 12:08 PM, <address@hidden> wrote:

What is the magnitude of the frequency drift? What is the magnitude of the gain drift? What are you measuring backscatter *from*?

On 2014-11-28 10:14, khalid.el-darymli via USRP-users wrote:

Hi,

Given a set of synced *(i.e., using an external GPS REF/PPS)*, time-commanded, and calibrated *(i.e., by compensating for the phase/magnitude offset between the digital Tx chirp prior to transmission and the ADC'ed Rx signal)* N200 devices with LFTX/LFRX daughterboards, working with coherent LFMCW chirps, I am still seeing a tiny drift (in both magnitude and frequency) of the calibrated back-scatter Rx chirp received at time t1 when compared to an Rx chirp received at an earlier time t0. The longer the N200 devices run (e.g., 5 hours), the greater the drift. Obviously, this drift pertains to the DACs and ADCs as well as the GPS-referenced clocks of the N200 devices.
My questions are:

1- Why do I still see such a drift although my devices are synced to an external GPS, and how do I correct for it?
2- Can the *PLL Carrier Tracking* block in GRC be used to track and correct such a drift? If so, how do I set the max/min frequency inputs for this block?
3- Can the *AGC2* or *AGC3* block be useful in this regard? If so, are there any examples explaining how the input parameters of these blocks can be set up?

Thanks. Best regards, Khalid

_______________________________________________
USRP-users mailing list
address@hidden
http://lists.ettus.com/mailman/listinfo/usrp-users_lists.ettus.com

--
Marcus Leech
Principal Investigator
Shirleys Bay Radio Astronomy Consortium
http://www.sbrac.org
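Not an answer given in the thread, but one common way to quantify the residual frequency drift between the chirp recorded at t0 and the one at t1 is to conjugate-multiply the two records and measure the slope of the resulting phase ramp. A pure-Python sketch, under the assumption that both records are time-aligned captures of the same transmitted chirp:

```python
import cmath
import math

def estimate_drift_hz(x0, x1, samp_rate):
    """Estimate the frequency offset of record x1 relative to record x0.

    Assumes x0 and x1 are time-aligned complex baseband records of the
    same chirp; any residual drift then appears as a linear phase ramp
    in y[n] = x1[n] * conj(x0[n]). Averaging the phase step between
    consecutive samples of y gives the offset in rad/sample, which is
    converted to Hz.
    """
    y = [a * b.conjugate() for a, b in zip(x1, x0)]
    acc = sum(y[n + 1] * y[n].conjugate() for n in range(len(y) - 1))
    return cmath.phase(acc) * samp_rate / (2.0 * math.pi)

# Self-check with synthetic data: an LFM chirp, and the same chirp
# offset by 3 Hz to mimic a small drift between t0 and t1.
fs, n = 10000.0, 2048
chirp = [cmath.exp(1j * math.pi * 500.0 * (t / fs) ** 2) for t in range(n)]
drift = [c * cmath.exp(2j * math.pi * 3.0 * t / fs)
         for t, c in enumerate(chirp)]
print(round(estimate_drift_hz(chirp, drift, fs), 3))  # → 3.0
```

The same idea extends to the magnitude drift by comparing the mean |x1[n]| / |x0[n]| ratio; both estimates can then feed a slow correction loop rather than a fast PLL, since thermal drift changes over minutes to hours.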