Re: [gpsd-dev] [gpsd] Altitude in TPV

From: Gerry Creager - NOAA Affiliate
Subject: Re: [gpsd-dev] [gpsd] Altitude in TPV
Date: Mon, 29 Oct 2018 16:50:00 -0500

On Mon, Oct 29, 2018 at 3:17 PM Gary E. Miller <address@hidden> wrote:
Yo Gerry!

On Mon, 29 Oct 2018 14:44:22 -0500
Gerry Creager - NOAA Affiliate <address@hidden> wrote:

> > I'm currently working on adding RINEX 3 support to gpsd.  At least
> > for the u-blox and GREIS GPS receivers.  That can give cm accuracy
> > when long term data is uploaded for post-processing. 
> Are you recovering carrier or solely code-phase for the three
> frequencies in your dataset?

The u-blox NEO-M8T reports pseudorange, carrier phase, and Doppler for
GPS, SBAS and GALILEO on the L1 band.  I can't get the M8T to output
anything for GLONASS.  That should be enough for RINEX 3.

Is pseudorange rate in there, as well? If we've got range and phase, we can approximate rate, and that'd give us a good shot at solid 3D postprocessing...
The javad does the same on several bands.
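In case it helps the discussion: pseudorange rate can be approximated from the carrier Doppler the receiver already reports. A minimal sketch in Python (the sign convention and the L1-only wavelength are my assumptions, not anything from the receiver docs):

```python
# Approximate pseudorange rate (range rate) from a carrier Doppler
# measurement. Convention assumed here: a positive Doppler shift means
# the satellite is approaching, so the range rate is negative.
C = 299_792_458.0        # speed of light, m/s
F_L1 = 1_575.42e6        # GPS L1 carrier frequency, Hz

def range_rate_from_doppler(doppler_hz: float, carrier_hz: float = F_L1) -> float:
    """Return pseudorange rate in m/s for a measured Doppler shift in Hz."""
    wavelength = C / carrier_hz          # ~0.1903 m for L1
    return -doppler_hz * wavelength      # m/s

# Example: a -2000 Hz Doppler (receding satellite) gives ~ +380.6 m/s.
print(range_rate_from_doppler(-2000.0))
```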

> Code exists. I'm partial to PAGES, originally by Mark Schenewerk when
> he was at the National Geodetic Survey HQ in Silver Spring. Most of
> the big surveying companies (Trimble, Ashtech, Javad, etc.) have only
> proprietary codes to perform the network adjustments. There is MATLAB
> code available to do it, and if we can relocate a copy of PAGES, that
> works, too.

I'll look at it, after I get RINEX working.  I hear RTKLIB is also
worth looking at.

Oh, Dr Mark Schenewerk.  His name is all over OPUS, the US online
post-processing system.  Sadly OPUS does not work with single-band
RINEX.  Some other online post-processing services do.

Mark worked closely with me when my group did an interesting little survey project in Azerbaijan. I'll have to tell you about it some time. Later. He's a good guy.

Single-frequency data didn't provide sufficient carrier-phase information to resolve either horizontal or vertical position, save for RTK, and RTK never quite got vertical right, although it was better than autonomous L1 code phase alone. L5 was added to allow a civil code and carrier capability, and L1C/L2C has mitigated the problem to a great extent. A lot of people have tried to get survey-quality results from L1 code only, and have reported error figures that objective repeats of their studies don't support. You can get close, but it's not geodesy at that point.
Here is a page on PAGES:

FORTRAN 77.  Wow.  Serious stuff.

Easy enough even a scientist can learn it.
> I believe the diurnal effects may actually be related to a spherical
> harmonic-induced perturbation to satellite ephemerides. Regardless of
> the start and stop times imposed, I found that observing times of
> 4-11 hours (plus/minus) gave the best results for getting height
> data,

Interesting.  Worth some testing.  Doing 4 hour data captures a lot
easier than 24 hour runs.

But height data is pretty problematic.  I can get many GPS receivers to
agree on lat/lon, yet they report heights that sometimes differ by over 60 feet.
Same antenna, same measurement time, through a signal splitter.  My guess
is errors in the GPS receiver code for the geoid.

It's actually in the makeup of the constellation. The geoid and ellipsoid are straightforward. There's plenty of computational horsepower and memory for those datasets and calculations in most receivers. I can't speak to the dongles, but even when I started playing with "low cost" board level GPS, the processors were hard-core and had lots of memory for the time.
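For anyone following along, the geoid/ellipsoid relation at issue is just H = h - N: orthometric height equals ellipsoidal height minus geoid undulation. A toy sketch with made-up undulation values, showing how two different built-in geoid models turn the same fix into different reported altitudes:

```python
# GPS solves for ellipsoidal height h (above the WGS84 ellipsoid). The
# altitude most receivers report is orthometric height H (above the
# geoid): H = h - N, where N is the local geoid undulation. Two
# receivers with the same h but different geoid models disagree on H.

def orthometric_height(h_ellipsoidal_m: float, geoid_undulation_m: float) -> float:
    """Orthometric height H = h - N, all in meters."""
    return h_ellipsoidal_m - geoid_undulation_m

h = 120.0          # ellipsoidal height from the fix, meters (made up)
n_fine = -28.3     # undulation from a fine-grained geoid model (made up)
n_coarse = -10.0   # undulation from a crude lookup table (made up)

print(orthometric_height(h, n_fine))    # ~148.3 m
print(orthometric_height(h, n_coarse))  # ~130.0 m, an ~18 m disagreement
```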
> However, if the
> observing period was in the range of ~4-11 hours, things were good.

Might be interesting to have gpsprof plot the lat/lon/alt errors
over time.  Take this out of anecdotal evidence to something more
rigorous.

I'll have to find the paper I wrote. Gotta have it somewhere. It was enough to get NGS to create NGS-58 at the time.
> Simple code-phase solutions are temporally autocorrelated.
> Decorrelation by decimation of a long timeseries dataset, measured in
> days, with decorrelation to 45-60 sec, or even longer, will improve
> the long-term average results.

Wow, if it takes days of measurements to average then the post
processing of short data sets looks a lot better.

Over time, sure. That said, 10 or so hours of decimated data will give you a good first guess, and you can augment it with another dataset, or three, later, improving the estimate. Just remember that your error is a least-squares error estimate rather than a simple standard deviation.
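The decimate-then-average idea can be sketched in a few lines of Python. Assumptions are mine: 1 Hz input fixes, a 60 s decimation interval, and that the kept samples are roughly uncorrelated so the standard error of the mean is meaningful:

```python
# Decimate a position time series to reduce temporal autocorrelation,
# then average what's left and report per-component uncertainty.
import math

def decimate_and_average(fixes, interval_s=60, rate_hz=1):
    """fixes: sequence of (lat, lon, alt) tuples sampled at rate_hz.
    Keep one sample every interval_s seconds, then return the
    per-component mean and standard error of the mean."""
    step = int(interval_s * rate_hz)
    kept = fixes[::step]
    n = len(kept)
    means = [sum(component) / n for component in zip(*kept)]
    # Standard error of the mean, valid once the kept samples are
    # approximately uncorrelated.
    sems = []
    for i, m in enumerate(means):
        var = sum((f[i] - m) ** 2 for f in kept) / (n - 1)
        sems.append(math.sqrt(var / n))
    return means, sems
```

Usage would be something like `means, sems = decimate_and_average(fixes)` over ten hours of 1 Hz fixes, leaving ~600 roughly independent samples.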

Gotta run. More later.
> In years past, I averaged a lot of L1
> code position report averages; the use of decimation was recommended
> to me by one of the NGS geodesists, rather than being independently
> derived. I did expand on the decimation parameters, to optimize the
> result, however.

I guess too much to ask for that code to be put in gpsd?

Depends on how we want to go about it. The decimation approach is a bit memory-expensive, but not too bad. Or write the data to a ramdisk or a temp file and process it from there in a two-step fashion.
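One low-memory alternative to the two-step temp-file approach would be a running (Welford) mean and variance over the decimated fixes, so only a few accumulators ever live in RAM. A sketch; the class name and shape are my own, not gpsd code:

```python
# Welford's online algorithm: single-pass, numerically stable running
# mean and variance, so a days-long decimated series never needs to be
# held in memory or spooled to disk.
class RunningStats:
    def __init__(self, dims=3):
        self.n = 0
        self.mean = [0.0] * dims   # running mean per component
        self.m2 = [0.0] * dims     # running sum of squared deviations

    def update(self, fix):
        """Fold one (lat, lon, alt) tuple into the accumulators."""
        self.n += 1
        for i, x in enumerate(fix):
            delta = x - self.mean[i]
            self.mean[i] += delta / self.n
            self.m2[i] += delta * (x - self.mean[i])

    def variance(self):
        """Sample variance per component; None until n >= 2."""
        if self.n < 2:
            return None
        return [m / (self.n - 1) for m in self.m2]
```

Feed it one decimated fix at a time and read `mean` and `variance()` at the end; memory stays constant no matter how long the run.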


Gary E. Miller Rellim 109 NW Wilmington Ave., Suite E, Bend, OR 97703
        address@hidden  Tel:+1 541 382 8588

            Veritas liberabit vos. -- Quid est veritas?
    "If you can’t measure it, you can’t improve it." - Lord Kelvin

Gerry Creager
“Big whorls have little whorls,
That feed on their velocity; 
And little whorls have lesser whorls, 
And so on to viscosity.” 
Lewis Fry Richardson (1881-1953)
