My data are not actually totally random. I am looking to compress telemetry measurements which will tend to be in the same range (e.g. temperatures won't vary much). However, I seek a solution for multiple applications, so I might be sending temperatures one day, voltages the next, etc.
I want to send measurements over a low data rate satellite link. SatCom is reasonably expensive, so I would like to shave off every cent that I can. I don't mind expending computing resources to pack and unpack the data, as nothing is too time-critical (it can take up to 30 seconds to transmit 192 bytes).
Can anyone recommend a FOSS data compression method that will give me the most compression on telemetry data?
Is it ever worth trying? What sort of percentage gains can I expect?
I do apologize that I cannot be more precise about the nature of the data: just general telemetry measurements such as temperatures, lat/long GPS positions, flow rates of liquids, etc.
Truly random data is not compressible, but yours is not truly random: readings that cluster in a narrow range carry little entropy and should compress reasonably well.
Since you can't disclose the details of your data, the best thing for you to do is to test a few different compression algorithms on some sample data.
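For example, here is a rough benchmark sketch in Python using only the standard library. The 192-byte packet of float readings below is invented; substitute real captured frames to get numbers you can trust.

```python
import bz2
import lzma
import struct
import zlib

# Hypothetical 192-byte packet: 48 little-endian floats of slowly varying,
# temperature-like readings clustered around 21 degrees C.
readings = [21.0 + 0.01 * i for i in range(48)]
packet = struct.pack("<48f", *readings)

candidates = {
    "zlib (DEFLATE)": lambda data: zlib.compress(data, 9),
    "bz2": lambda data: bz2.compress(data, 9),
    "lzma/xz": lambda data: lzma.compress(data, preset=9),
}

print(f"raw: {len(packet)} bytes")
for name, compress in candidates.items():
    out = compress(packet)
    ratio = 100.0 * len(out) / len(packet)
    print(f"{name:15s} {len(out):4d} bytes ({ratio:.0f}% of original)")
```

Keep in mind that on packets this small the container overhead (headers, checksums) of bz2 and xz can outweigh the savings, so results on individual 192-byte frames may even come out larger than the input; batching several frames before compressing usually changes the picture considerably.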
A good place to start is the DEFLATE algorithm, the industry-standard combination of LZ77 sliding-window compression and Huffman coding. It is implemented by many libraries and tools, zlib and GZIP being prominent examples.
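If you do end up sending many small, similarly shaped packets, one DEFLATE feature worth experimenting with is a preset dictionary: both ends agree in advance on a blob of bytes that resembles a typical packet, which lets the compressor find matches even in the very first frame. Below is a minimal sketch using Python's zlib module; the packet layout (48 floats) and the dictionary contents are made up for illustration.

```python
import struct
import zlib

# Shared preset dictionary: bytes that resemble a "typical" packet.
# The contents here are invented; sender and receiver must use the same blob.
SHARED_DICT = struct.pack("<48f", *([21.0] * 48))

def pack(payload: bytes) -> bytes:
    # Compress with maximum effort and the agreed-upon preset dictionary.
    co = zlib.compressobj(level=9, zdict=SHARED_DICT)
    return co.compress(payload) + co.flush()

def unpack(blob: bytes) -> bytes:
    # The decompressor must be given the same dictionary.
    do = zlib.decompressobj(zdict=SHARED_DICT)
    return do.decompress(blob) + do.flush()

packet = struct.pack("<48f", *[21.0 + 0.01 * i for i in range(48)])
wire = pack(packet)
assert unpack(wire) == packet
print(f"{len(packet)} bytes in, {len(wire)} bytes on the wire")
```

Whether this beats plain DEFLATE depends entirely on how closely your real packets resemble the dictionary, so treat it as one more candidate for the benchmark above rather than a guaranteed win.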