The usual benchmarks report compression ratio and speed (and maybe decomp speed). I'm interested in the lightest, most efficient general-purpose codecs in terms of CPU and memory usage. I can't find any valid benchmarks for this – do you know of any?
Matt Mahoney's Large Text Compression Benchmark isn't rigorous enough to determine the most efficient codecs. The test hardware isn't held constant – one codec might be tested on a completely different CPU and OS than another codec, and with a different amount of RAM – which makes the results useless for efficiency comparisons. The benchmark also doesn't report CPU usage. Moreover, many of the codecs are extremely old releases – e.g. the "gzip" in the LTCB is an ancient 2007 build of gzip for Windows: http://gnuwin32.sourceforge.net/packages/gzip.htm
The gzip we'd want to test is the latest version of zlib, which is what most web servers actually use. Which reminds me that one way of getting at the most efficient compressor would be to match zlib's CPU/memory efficiency while beating its compression ratio (since zlib/gzip is about the lightest codec we have right now in terms of memory and CPU). Or to use even less CPU/memory than zlib at the cost of a worse compression ratio. Willy Tarreau's SLZ seems to do the latter: https://encode.ru/threads/2575-SLZ-s...ble-compressor
Are there any codecs that use less CPU and RAM than zlib but achieve a better compression ratio? Even better would be one that's also faster than zlib. (SLZ is much faster but doesn't compress as well.)
Thoughts? Codecs? For web servers it would be nice to be able to efficiently compress dynamic content on the fly, and Zstd and brotli don't seem to be suitable for that.