I'm a little unclear on the matrix sync data. The third box... the longer one in the cell. The legend says it's the relative frequency offset of the receiver clocks in ppm. I checked my info and my figure is usually around -51 or so in most cells. The legend says if it's off, to make sure the dump1090 ppm is correct. My confusion is that I thought the dump1090 ppm has to do with the accuracy of the receiver's frequency calibration (in my case the rtl-sdr). Yet the legend text mentions receiver clocks, so I assume that is the timing used... in my situation, the system time of my PC (I'm not using a Pi). Guess I need clarification.
It has nothing to do with "clock" time, NTP or any other real time. The ppm value reflects the error of the crystal oscillator in the RTL dongle. That error affects both the frequency calibration and the sampling intervals of the received signal, since there is only one source of those clock pulses. If you are -51 compared to most cells, you should add 51 to the current figure in dump1090; then you will be close to 0, at least relative to the other receivers. The best way is to check sync against a GPS-synchronized receiver like the Radarcape, which will be correct to within 0.02 ppm. /M
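To make the arithmetic above concrete, here is a minimal sketch of the correction; the function name and structure are mine, not part of dump1090 itself:

```python
def corrected_ppm(current_ppm: float, relative_offset_ppm: float) -> int:
    """Compute a new dump1090 --ppm setting from the sync-matrix offset.

    The sync matrix shows your offset *relative* to other receivers,
    so you subtract it from your current setting: an offset of -51 ppm
    with a current setting of 0 gives 0 - (-51) = 51.
    """
    return round(current_ppm - relative_offset_ppm)

# Example: dump1090 currently running with --ppm 0, matrix shows -51 ppm.
print(corrected_ppm(0, -51))  # → 51
```

After restarting dump1090 with the new value, the matrix figure should move close to 0 relative to the other receivers.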
Having the exact ppm value isn't that important, and it's impossible to know which receivers other users are running. Looking at your sync stats, you have a temperature-compensated frequency source in your dongle anyway, so there's no need to do anything. Didn't you see that you have less than 1 ppm difference to half of the other feeders? That means yours is already sufficiently precise, and the ones with larger differences are the receivers without temperature-compensated oscillators.