I think I might not be getting my point across. Assuming something
more like 4,800 to 38,400 baud, the timestamp of the start of the
ZDA sentence should carry only about one character's worth of
arrival uncertainty. The end of the sentence, however, could have
much larger jitter, up to 1ms (e.g. consider the case where we read
only the very last character and the rest of the buffer is empty).
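To put rough numbers on that: an 8N1 character is 10 bits (start +
8 data + stop), so the per-character times at the rates in question
work out roughly as below (a throwaway check, not gpsd code):

    #include <stdio.h>

    int main(void)
    {
        /* 8N1 framing: 10 bits on the wire per character */
        const int bauds[] = { 4800, 9600, 38400, 115200 };
        for (unsigned i = 0; i < sizeof(bauds) / sizeof(bauds[0]); i++)
            printf("%6d baud: %.3f ms per character\n",
                   bauds[i], 10.0 * 1000.0 / bauds[i]);
        return 0;
    }
    /* Prints roughly: 4800 -> 2.083 ms, 9600 -> 1.042 ms,
     * 38400 -> 0.260 ms, 115200 -> 0.087 ms per character. */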
So my jitter is currently roughly 1ms, but I believe it should be
possible to reduce that to 0.5ms. Do you agree?
No, because USB "jitter" is random 0-1ms. The adapter sets the interrupt condition when it sees the character arrive, but the host has to poll for it, and it only polls every 1ms. There is no way to average it out.
First question though - did I correctly understand the current gpsd
behaviour?
Yes, but it creates an offset. It remembers the time of something early in the sentence, but it doesn't try to compensate for baud rate, and it may remember the end of the sentence.
I've done similar experiments. There is an OFFSET because of
the latency to the last sentence, but the jitter is consistent
with USB jitter. 115200 baud is fast enough that it takes about
a 10-character difference in message length to add up to 1ms.
You might want to try 230400, though the Venus meters the
characters out at less than full speed.
*IF* not just the initial bit but in fact every bit of the
Venus 6 output has low jitter, then because we collect multiple
observations of the serial output via USB, it should be possible
to improve our estimate of the arrival timestamp to below the
0.5ms mark. That is, we can record the number of characters read
at each USB timestamp and compare it with the number we predicted;
that should give us sub-ms estimates of the arrival time of a
particular character, which we can use to work back to the arrival
time of the first bit. Note that if this is feasible, the technique
would give better accuracy than PPS over USB!
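Concretely, something like the following is what I have in mind - a
sketch only, with a made-up function name and my assumptions baked
in (steady character rate, accurately timestamped reads), not
anything that exists in gpsd today:

    #include <stddef.h>

    /* Estimate the arrival time of the first character from a series
     * of USB reads.  t[i] is the timestamp of read i (seconds), n[i]
     * the cumulative number of characters received by that point.  If
     * the characters really are metered out at a steady rate r, then
     * n[i] ~= r * (t[i] - t0), so a least-squares line through the
     * points crosses n = 0 at t0.  (The ~half-character quantisation
     * offset from reading whole characters is ignored here; assumes
     * at least two reads with distinct timestamps.) */
    double estimate_first_char_time(const double *t, const double *n,
                                    size_t count)
    {
        double st = 0, sn = 0, stt = 0, stn = 0;

        for (size_t i = 0; i < count; i++) {
            st  += t[i];
            sn  += n[i];
            stt += t[i] * t[i];
            stn += t[i] * n[i];
        }
        /* Fit n = a*t + b: a estimates the character rate, and the
         * first character is where the line crosses zero, t0 = -b/a. */
        double a = (count * stn - st * sn) / (count * stt - st * st);
        double b = (sn - a * st) / count;
        return -b / a;
    }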
You cannot measure the error. It is random and it is not normally distributed.
I still think I didn't explain my idea clearly.
My Venus delivers characters at 9,600 baud, i.e. about 960
characters per second once start and stop bits are counted. Let's
assume for the sake of argument that the jitter on the arrival time
of each character is very low. The USB bus returns results
approximately every 1ms, but with high jitter, e.g. the gaps might
be 1.5ms, then 0.5ms, and so on.
Assuming that the USB timestamp is accurate - i.e. it might be
1.5ms since the last read, but we can measure that gap accurately -
then we can predict how many characters should be in the buffer.
The algorithm has one unknown: the timestamp of the first
character. The number of characters returned by each ~1kHz read
will depend on the timestamp of the first character and on the
delivery rate (about 960 characters per second in this case).
In fact, if we maintain our estimate between loops, it should be
possible to set up a PLL with quite a high degree of accuracy in
predicting the arrival time of each character.
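As a sketch of what that loop might look like - invented names and
gains, the same assumptions as above (steady character rate,
trustworthy read timestamps), and not a claim about anything gpsd
does today:

    /* Track the first-character time t0 and the character rate with a
     * simple second-order PLL, nudging both a little on every read
     * according to how far the observed cumulative character count is
     * from the predicted one.  Assumes the loop has been primed with a
     * rough t0 and rate (e.g. ~960 chars/s for 9,600 baud 8N1). */
    struct char_pll {
        double t0;    /* estimated arrival time of first character (s) */
        double rate;  /* estimated character rate (chars/s) */
    };

    /* Feed one USB read into the loop: 'when' is the read's timestamp
     * in seconds, 'total_chars' the cumulative characters seen so far. */
    static void char_pll_update(struct char_pll *pll, double when,
                                unsigned long total_chars)
    {
        const double kp = 0.05;    /* proportional (phase) gain */
        const double ki = 0.002;   /* integral (frequency) gain */

        double elapsed = when - pll->t0;
        if (elapsed <= 0.0)
            return;                /* not meaningful until we're past t0 */

        double predicted = pll->rate * elapsed;
        double err_chars = (double)total_chars - predicted;
        double err_secs  = err_chars / pll->rate;

        /* More characters than predicted means they started arriving
         * earlier than we thought, so pull t0 back; fewer pushes it
         * forward.  The frequency term slowly trims the rate estimate. */
        pll->t0   -= kp * err_secs;
        pll->rate += ki * err_chars / elapsed;
    }

Each read then contributes one noisy observation of the same
underlying phase, so over many reads the loop's estimate of t0
should settle well below the single-read quantisation - always
assuming, of course, that the read timestamps themselves can be
trusted.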
No, the USB "timing" is the equivalent of white noise. The USB timestamp will be 0-1ms from when the real interrupt occurs, with no way of calculating or predicting the offset.
Remember, if the assumption holds that we know the timestamps of
the low-frequency USB observations, then it doesn't in fact matter
much that they come at fairly random intervals; they still allow us
to observe the underlying process with much higher accuracy. In
fact, more USB jitter is actually beneficial, because it effectively
dithers the sampling points and so helps us observe the underlying
process...
The kernel adds its own jitter. Small, but nonzero.
I'm kind of intrigued that others haven't jumped in on this - on
the surface it would appear that PPS is irrelevant for USB: we can
gain far higher accuracy using a PLL and observing the NMEA data
(with the caveat that this is true only on certain chipsets, e.g.
many Venus units). Or, equivalently, that you would want a 10-20Hz
PPS for USB?
You can't do a PLL if you don't have an accurate edge from which to determine the phase error, and that edge is accurate to no better than 1ms.