boardsurfr wrote, quoting sailquik:
Quote from Manfred: "Running the GW 52 at 1Hz to match the GT31 also makes no sense because of the missing antialiasing filters - the units have to run with the highest frequency possible (GW 52, Thingsee: 10Hz) to avoid this, because simple decimation of the data without filtering renders the data useless..."
There are quite a few statements here that I would like to address - to help keep context/flow of what I am writing, I will address them one at a time.
boardsurfr wrote:
"renders the data useless" is, at best, an extreme overstatement. I would put it into the same category as "speed below 5 knots should not count towards distance". It's a personal opinion, which is based on some rational arguments, but it's not the one and only truth. Remember you are quoting a German engineer!
Ignoring speed below 5 knots has always been a point of discussion. In particular, if you happen to go out for a sail in 10 knots with a 7m sail and then drift down the course, you *are* still sailing. That is why GPSResults allows you to change the filter parameters.
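As a minimal sketch of why the cutoff matters (the function name, thresholds, and numbers here are my own illustration, not GPSResults' actual implementation):

```python
# Hypothetical sketch: how a minimum-speed filter affects total distance.
# Illustrative only - not GPSResults' actual code or parameters.

def total_distance(points, min_speed_kts=5.0, interval_s=1.0):
    """Sum distance over 1 Hz samples, ignoring points below a speed cutoff.

    points: list of speeds in knots, one per sample interval.
    """
    KTS_TO_MPS = 0.514444  # 1 knot in metres per second
    return sum(v * KTS_TO_MPS * interval_s for v in points if v >= min_speed_kts)

# A slow drift down the course at 4 knots: a hard 5-knot cutoff counts
# nothing, while a lower threshold counts the whole leg.
drift = [4.0] * 60  # one minute of drifting at 4 knots
print(total_distance(drift, min_speed_kts=5.0))            # 0.0
print(round(total_distance(drift, min_speed_kts=3.0), 1))  # ~123.5 metres
```

With a fixed 5-knot rule that entire drifting leg vanishes from the distance total, which is exactly why a configurable threshold is useful.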
boardsurfr wrote:
I'm pretty fed up with all the "our test show this and that and our rules are based on the results", without anyone ever bothering to make these data available. Setting up a WordPress or Google site to put the results of your tests takes just a few minutes.
All of this work was done by volunteers: mathematicians, doctors of physics, electronic engineers, programmers. What you are asking is for us to do *more* work.
The information is available - maybe just be polite and ask for it?
boardsurfr wrote:
Manfred has been a driving force behind going to higher data rates (5 Hz and 10 Hz). From the (mostly third-hand) talk I heard about this, I understand that the main reason to use higher data rates for measuring straight line speed (everything except alpha) is the reduction of random error through higher sampling rates. Assuming that measurement error is largely random, measuring at 5 Hz could give about 2-fold higher accuracy than measuring at 1 Hz.
Manfred definitely hasn't been the driving force. Since the days of the E-Trex and Foretrex 101, the whole community of GPS-Speedsurfing.com has always wanted high-Hz sample rates. Singling out a specific person does a disservice to the history of how we got here... it's just that Tom and Manfred volunteered the most time.
[ Be aware that Manfred built the video-timing gear used in many speed-sailing events... if anything, he has a vested interest in making sure GPS technology *doesn't* replace his existing system. Please don't accuse or single out one person as having any motive other than to better the existing technology. ]
Also, if you consider this forum third-hand, I would suggest you are incorrect. Andrew Daff in particular was there at the start, so anything he has written is indeed first-hand. In my case, I came late to the party, around the time the Foretrex 201 was the GPS model of choice.
boardsurfr wrote:
If the GW-52 units would have proper filters when running at 1 Hz, then the accuracy at 1 Hz could be very close to the accuracy at 5 Hz. Simple averages would do most of the trick, Kalman filters (which are typically used in GPS units) would be even better. But I guess the comment about the "missing antialiasing filters" indicates that the GW-52 does not use such filters; instead, it may simply discard 4 of 5 data points, and log the 5th. This is the easiest way to implement 1 Hz recording on 5 Hz units, so there is a good chance it's done this way.
A traditional Kalman filter is an infinite-impulse-response (IIR) filter - as such, it is very well suited to general-purpose filtering of GPS data in everyday usage (walking, driving, etc.). However, it negatively affects short-window measurements - indeed, a 1Hz Kalman filter in a consumer-grade GPS can carry significant sample bias from measurements taken many seconds prior.
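To make that history bias concrete, here is a toy sketch. It uses a first-order exponential smoother as a stand-in for the steady state of a simple scalar Kalman filter (an assumption for illustration, not the GT-31's actual filter), compared against a short FIR average, on a brief 2-second speed burst:

```python
# Sketch of why an IIR (Kalman-like) filter biases short-window peaks:
# a first-order IIR smoother keeps dragging in history, while a short
# FIR moving average forgets it quickly. Numbers are illustrative.

def iir(samples, alpha=0.2):
    out, y = [], samples[0]
    for x in samples:
        y = alpha * x + (1 - alpha) * y  # exponential smoothing (IIR)
        out.append(y)
    return out

def fir(samples, n=3):
    # simple moving average over the last n samples (FIR)
    return [sum(samples[max(0, i - n + 1):i + 1]) /
            len(samples[max(0, i - n + 1):i + 1])
            for i in range(len(samples))]

# Speed steps from 20 kts to 40 kts for a short 2 s burst at 1 Hz.
speeds = [20.0] * 10 + [40.0] * 2
print(max(iir(speeds)))  # well below 40: old samples still weigh in
print(max(fir(speeds)))  # much closer to the true burst speed
```

The IIR output never gets near the true 40-knot burst within the window, which is exactly the sample bias described above.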
The work on the GT-31 focused on understanding how the 1Hz measurements are created. Using a number of apparatus, we were able to deduce that there was likely a Kalman filter [ as opposed to some other filter ], and that it appears to have been modified to cut off history [ thus making it effectively a finite-impulse-response filter ].
Every digital system must, by definition, include some type of anti-aliasing filter - that is one [ of the many ] requirements of using digital technology to sample the real world. The statement that you quoted doesn't explain what you are trying to say. Can you elaborate on this point?
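For readers unfamiliar with the aliasing issue, here is a toy sketch (illustrative numbers only - this is an assumed decimation behaviour, not a claim about what the GW-52 firmware actually does). A 2 Hz speed oscillation is sampled at 10 Hz, then reduced to 1 Hz two ways:

```python
import math

# Sketch of aliasing from naive decimation: a 2 Hz "chop" oscillation on
# top of a steady speed, sampled at 10 Hz, then reduced to 1 Hz either by
# discarding samples or by averaging them first.

base, amp = 30.0, 3.0  # knots: mean speed plus a 2 Hz wobble
samples = [base + amp * math.sin(2 * math.pi * 2 * (k / 10) + 0.7)
           for k in range(100)]  # 10 s of 10 Hz data

# (a) naive decimation: keep every 10th point, discard the rest
decimated = samples[::10]

# (b) prefiltered: average each block of 10 before downsampling
averaged = [sum(samples[i:i + 10]) / 10 for i in range(0, 100, 10)]

print(decimated[0])  # every decimated point carries the same phase error
print(averaged[0])   # the average cancels the oscillation, leaving ~30.0
```

Because the 2 Hz wobble is above the 0.5 Hz Nyquist limit of a 1 Hz output, naive decimation locks onto one phase of it and reports a constant offset from the true mean speed; the averaged (prefiltered) version does not.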
One of the statements you make below and in subsequent posts needs to be mentioned here. The point of high-Hz sampling is specifically "because we don't know what is really happening". When we take a scientific measurement, we must always provide a methodology that includes any assumptions. When we performed the various tests, we did identify that we couldn't see what was happening at the microscopic level (aka sub-1-second). This *is* the driving point behind why we want high-Hz GPSs.
As a corollary... instead of having 1Hz samples, why don't we just use 0.5Hz samples, as that would suffice for the 2-second category? Going to the extreme, why don't we just take the first and last points of two samples that are 10 seconds apart? We don't use 10 seconds because it doesn't tell us a) what our peak speed is, or b) which 10-second window to start at. Going sub-second is just the same argument with a smaller time window. Thus we use 1Hz because that was the best the consumer-grade technology could provide in a cheap package.
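The window-placement point above can be sketched in a few lines (illustrative speeds, my own helper name):

```python
# Sketch of the window-placement problem: a 10 s burst that straddles a
# fixed block boundary is missed by non-overlapping windows, but found
# by a sliding window. Speeds are illustrative 1 Hz samples in knots.

speeds = [25.0] * 15 + [40.0] * 10 + [25.0] * 15  # burst at t = 15..24 s

def best_window(vals, n=10, step=1):
    """Best n-sample average, advancing by `step` samples each time."""
    return max(sum(vals[i:i + n]) / n
               for i in range(0, len(vals) - n + 1, step))

print(best_window(speeds, step=10))  # fixed blocks straddle the burst
print(best_window(speeds, step=1))   # sliding window finds the full 40 kts
```

The fixed 10-second blocks report 32.5 knots at best, while the sliding window recovers the true 40-knot burst - which is exactly why endpoint-only or fixed-block measurement isn't enough.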
I mentioned testing... As an engineer you will be aware that it is important that we can implement test cases that accurately reflect the result that we are trying to verify. I list here a basic description of some of the tests performed:
- Walk 10 m watching the GPS, then quickly put the GPS inside a tin can, turn 90°, and continue walking -> after a few seconds, remove the GPS from the can. Ideally, the track log should show zero/null datapoints as soon as the GPS goes into the can; it should then show the new datapoints "somewhere else". What you *shouldn't* see is the device playing catch-up, or any other non-zero datapoints, while it was inside the can. This is specifically where Kalman filters aren't necessarily a good thing [ or at least they need to not carry long history ]. This idea came from Ian Knight.
- One of the apparatus was a rotating arm with a variable rotation rate - this was used to measure aliasing. Manfred's rig. [ I was on the email discussion in the early days and have seen photos of it. ]
- Another apparatus was a linear oscillator - again used to measure aliasing, but also overshoot. Tom Chalko's rig.
- Tom mounted quite a few GPSs to his car, drove many hundreds of km, then performed Fourier analysis on the error values. What this showed was that the max-error value quoted by the manufacturers is actually far larger than the errors we were statistically able to measure [ with, say, over a million points ].
... and many others.
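The statistical side of that last test can be illustrated with synthetic data (the numbers below are made up for illustration - they are not Tom's measurements or any real datasheet figure):

```python
import random

# Sketch of the kind of claim the car test supports: with ~a million
# fixes you can measure the real error distribution and compare it to
# a quoted maximum. All figures here are illustrative.

random.seed(7)
quoted_max_mps = 0.1                     # a hypothetical datasheet figure
errors = [abs(random.gauss(0.0, 0.02)) for _ in range(1_000_000)]

errors.sort()
p999 = errors[int(0.999 * len(errors))]  # 99.9th-percentile observed error

print(p999 < quoted_max_mps)  # True: observed errors sit well inside the spec
```

With a sample that large, even the extreme tail of the error distribution can be pinned down precisely, which is what makes the comparison against the manufacturer's quoted maximum meaningful.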
boardsurfr wrote:
If we assume that the 1 Hz recording in the GW-52 is indeed implemented in this most simplistic way, the effect of the sub-sampling is that the error would increase by roughly 2-fold (square root of 5). Why anyone would call data with a 2-fold higher error margin "useless" is beyond me. Similarly, it is hard to imagine how the GW-52 can be significantly more accurate than the GT-31 or the Canmore at 5 Hz, but noticeably worse at 1 Hz. The only expected effect of using 1 Hz readings instead of 5 Hz readings would be that some teams might swap places in the monthly 2 second ranking (and perhaps in the alphas). The effect of running GT-31s in "power save" mode sure is a lot larger, and I did not see any warnings against that for years.
I'm not sure if this is a statement or a question - can you rephrase it?
Power-save has *always* been off the cards. This goes back to the original Foretrex 101 - it is documented everywhere. Ignoring that, if you think about it even just a little, disabling power-save makes perfect sense when you require timely and accurate data-logging.
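On the square-root-of-5 figure quoted above: it follows from the usual standard-error argument, and a quick Monte Carlo check confirms it under the stated assumption that per-fix error is independent and random (the sigma below is purely illustrative):

```python
import random
import statistics

# Monte Carlo sketch of the sqrt(5) figure: if per-fix error is roughly
# independent and random, averaging five 5 Hz fixes into one 1 Hz value
# shrinks the standard error by about sqrt(5) ~ 2.24.

random.seed(1)
sigma = 0.5  # illustrative per-fix speed error in knots
errors = [random.gauss(0.0, sigma) for _ in range(50000)]

single = statistics.pstdev(errors)                 # raw per-fix error
avg5 = statistics.pstdev([sum(errors[i:i + 5]) / 5  # 5-fix averages
                          for i in range(0, len(errors), 5)])

print(round(single / avg5, 2))  # close to 5 ** 0.5 ~ 2.24
```

Note this only holds if the per-fix errors really are independent; correlated error (e.g. multipath, ionospheric drift) averages out far less.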
boardsurfr wrote:
There are a few practical reasons to use 1 Hz recording, though. The 5 Hz data have a lot of high-frequency oscillations that don't really add to the analysis. In "action replay" mode in GPSAR Pro, you can't really follow the speeds at 5 Hz, they vary too quickly; nor does viewing tracks only around the current point work as well, since the region shown is 5x shorter. But more important is the limited data capacity on the GW-52 of only 129 thousand data points. That's 7.3 hours at 5 Hz, but 35 hours at 1 Hz. When I record at 1 Hz, I can recharge the unit between sessions from any USB charger or an external battery. When I record at 5 Hz, I must hook the unit up to a computer running Windows between sessions, or I lose data - most of my sessions are longer than 4 hours. Depending on where I am, that can be pretty darn inconvenient.
Your statement that 5Hz doesn't add to the analysis really surprised me. The purpose of scientific endeavour is to understand everything - if we stopped when we thought it was good enough, we wouldn't have identified sub-atomic particles or built the Large Hadron Collider.
Don't confuse data analysis with data visualisation. Played back with a suitable player, the higher-Hz GPS just allows you to "zoom in" to your track in real time! ( Not GPSAR, as AFAIK it assumes that the data-play-rate is linked to the frame-rate - they should be independently controllable. )
You are also conflating the memory storage of a GW-52 specifically with the ability to more accurately understand what is happening at the microscopic level by capturing more data. Memory cards are cheap; it is only a matter of time before that datapoint limitation is removed. Similarly battery capacity - just build a device with a bigger battery.
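For what it's worth, the capacity figures quoted above check out as back-of-envelope arithmetic on the 129,000-point limit:

```python
# Quick check of the logging-capacity figures for a 129,000-point limit.

def hours_of_logging(points, hz):
    """Hours of continuous logging for a given point budget and rate."""
    return points / hz / 3600

print(round(hours_of_logging(129_000, 5), 1))  # ~7.2 h at 5 Hz
print(round(hours_of_logging(129_000, 1), 1))  # ~35.8 h at 1 Hz
```

That roughly matches the 7.3 h / 35 h figures quoted, so the disagreement here is about design priorities, not arithmetic.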