sailquik said:
Quote from Manfred: "Running the GW 52 at 1Hz to match the GT31 also makes no sense because of the missing antialiasing filters - the units have to run with the highest frequency possible (GW 52, Thingsee: 10Hz) to avoid this, because simple decimation of the data without filtering renders the data useless..."
"renders the data useless" is, at best, an
extreme overstatement. I would put it into the same category as "speed below 5 knots should not count towards distance". It's a personal opinion, which is based on some rational arguments, but it's not the one and only truth. Remember you are quoting a German engineer!
I'm pretty fed up with all the "our tests show this and that, and our rules are based on the results", without anyone ever bothering to make the data available. Setting up a WordPress or Google site to publish your test results takes just a few minutes.
Manfred has been a driving force behind going to higher data rates (5 Hz and 10 Hz). From the (mostly third-hand) talk I heard about this, I understand that the main reason to use higher data rates for measuring straight line speed (everything except alpha) is the reduction of random error through higher sampling rates.
Assuming that measurement error is largely random, measuring at 5 Hz could give roughly 2-fold (square root of 5, about 2.2x) higher accuracy than measuring at 1 Hz.
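As a rough sanity check of that square-root scaling, here is a little Monte Carlo sketch of my own (nothing to do with the actual GW-52 firmware), assuming each fix carries independent Gaussian noise. Real GPS errors are correlated between fixes, so the real-world gain would be somewhat smaller:

```python
import random
import statistics

random.seed(1)
true_speed = 30.0   # knots (arbitrary)
sigma = 0.2         # hypothetical per-fix noise, i.i.d. Gaussian
trials = 20000

# 1 Hz: one fix per second; 5 Hz: average of 5 fixes per second
err_1hz = [abs(random.gauss(true_speed, sigma) - true_speed)
           for _ in range(trials)]
err_5hz = [abs(statistics.mean(random.gauss(true_speed, sigma) for _ in range(5))
               - true_speed)
           for _ in range(trials)]

ratio = statistics.mean(err_1hz) / statistics.mean(err_5hz)
print(f"error ratio 1 Hz / 5 Hz ≈ {ratio:.2f}")  # close to sqrt(5) ≈ 2.24
```

Averaging 5 independent fixes shrinks the random error by about the square root of 5, which is where the "about 2-fold" figure comes from.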
If the GW-52 units had proper filters when running at 1 Hz, then the accuracy at 1 Hz could be very close to the accuracy at 5 Hz. Simple averages would do most of the trick; Kalman filters (which are typically used in GPS units) would be even better. But I guess the comment about the "missing antialiasing filters" indicates that the GW-52 does not use such filters; instead, it may simply discard 4 of every 5 data points and log the 5th. This is the easiest way to implement 1 Hz recording on a 5 Hz unit, so there is a good chance it's done this way.
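To illustrate what the aliasing complaint is actually about, here is a noise-free toy example (all numbers hypothetical): a 2 Hz "chop" oscillation in boat speed, sampled at 5 Hz, then brought down to 1 Hz either by decimation (keep every 5th fix) or by averaging 5 fixes:

```python
import math
import statistics

mean_speed = 30.0   # knots (arbitrary)
amp = 1.0           # amplitude of a hypothetical 2 Hz chop oscillation, knots
f, fs = 2.0, 5.0    # oscillation frequency and GPS sampling rate, Hz

# One minute of noise-free 5 Hz fixes containing only the fast oscillation
fixes = [mean_speed + amp * math.sin(2 * math.pi * f * n / fs)
         for n in range(300)]

decimated = fixes[4::5]  # "no filter" 1 Hz: log every 5th fix
averaged = [statistics.mean(fixes[i:i + 5])  # "filtered" 1 Hz: 1-second averages
            for i in range(0, 300, 5)]

def rms_wobble(xs):
    """RMS deviation from the true mean speed."""
    return math.sqrt(statistics.mean((x - mean_speed) ** 2 for x in xs))

print(f"decimated: {rms_wobble(decimated):.2f} kn, "
      f"averaged: {rms_wobble(averaged):.2f} kn")
# → decimated: 0.59 kn, averaged: 0.00 kn
```

Because 2 Hz is an exact multiple of the 1 Hz logging rate, decimation folds the chop into a steady phantom offset of about 0.6 knots, while a simple 1-second average cancels it completely; a slightly different chop frequency would alias into a slow, spurious wobble instead. So the concern is real, but it's a matter of degree, not of data suddenly becoming worthless.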
If we assume that the 1 Hz recording in the GW-52 is indeed implemented in this most simplistic way, the effect of the sub-sampling is that the error would increase by roughly 2-fold (square root of 5). Why anyone would call data with a 2-fold higher error margin "useless" is beyond me. Similarly, it is hard to imagine how the GW-52 can be significantly more accurate than the GT-31 or the Canmore at 5 Hz, but noticeably worse at 1 Hz. The only expected effect of using 1 Hz readings instead of 5 Hz readings would be that some teams might swap places in the monthly 2-second ranking (and perhaps in the alphas). The effect of running GT-31s in "power save" mode sure is a lot larger, and I did not see any warnings against that for years.
There are a few practical reasons to use 1 Hz recording, though. The 5 Hz data have a lot of high-frequency oscillations that don't really add to the analysis. In "action replay" mode in GPSAR Pro, you can't really follow the speeds at 5 Hz, they vary too quickly; nor does viewing tracks only around the current point work as well, since the region shown is 5x shorter. But more important is the limited data capacity of the GW-52: only 129 thousand data points. That's 7.3 hours at 5 Hz, but 35 hours at 1 Hz. When I record at 1 Hz, I can recharge the unit between sessions from any USB charger or an external battery. When I record at 5 Hz, I must hook the unit up to a computer running Windows between sessions, or I lose data - most of my sessions are longer than 4 hours. Depending on where I am, that can be pretty darn inconvenient.