I can't understand why some people insist on a delta transform as a filter or on analog models as a submodel of a CM coder for compressing audio or bitmaps. I think wavelet or related algorithms do a fairly good job. For example:
Fazil Say - Uc Selvi.wav (58.967.036 bytes): 16-bit stereo, ripped directly from the original CD as WAV. This means the samples are not already quantized, unlike a wav file produced by an mp3->wav conversion.
ACWAV 22.358.653 bytes (~4 seconds)
PAQ8o8 (-7) 25.741.080 bytes (6744 seconds)
WinRAR (Best) 26.353.902 bytes (~15 seconds)
CMM4 v0.1f (76) 35.081.247 bytes (115 seconds)
BIT (LWCX) 35.471.999 bytes (115 seconds)
7-Zip (Ultra) 37.452.086 bytes (~31 seconds)
BALZ (ex) 40.356.654 bytes (143 seconds)
ACWAV compresses a WAV file with a rather poor arithmetic coder after an S+P transform. The S+P transform is actually a Haar wavelet combined with simple predictive coding, so it can be applied to any sampled analog data, e.g. bitmaps.
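For reference, the S part (the integer Haar, or S, transform) can be sketched in a few lines via lifting. The real S+P transform adds a small linear predictor on top of the highpass band, which I omit here; this is just a sketch, and the function names are mine:

```python
def s_transform(x):
    """One level of the integer S (Haar) transform via lifting.
    x must have even length; returns (lowpass, highpass).
    Integer-to-integer, so it is exactly invertible."""
    lo, hi = [], []
    for i in range(0, len(x), 2):
        a, b = x[i], x[i + 1]
        d = a - b          # difference -> highpass coefficient
        s = b + (d >> 1)   # floor of the pair average -> lowpass
        lo.append(s)
        hi.append(d)
    return lo, hi

def inverse_s_transform(lo, hi):
    """Exact inverse of s_transform."""
    x = []
    for s, d in zip(lo, hi):
        b = s - (d >> 1)
        a = d + b
        x.extend([a, b])
    return x
```

After the transform, the lowpass band can be split again recursively, and the (mostly near-zero) highpass coefficients are what the arithmetic coder actually models.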
BIT's LWCX mode is an order 0-4 and 6 CM coder based on a neural network mixer and an SSE layer (512 MB of memory was used for the context hashes).
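The "neural network" part of such a CM coder is essentially a logistic mixer: each model's bit probability is stretched into the logit domain, combined with trained weights, and squashed back into a probability. Roughly like this (a simplified sketch in the PAQ style, not BIT's actual code):

```python
import math

def stretch(p):
    """Probability -> logit domain."""
    return math.log(p / (1.0 - p))

def squash(x):
    """Logit domain -> probability (logistic function)."""
    return 1.0 / (1.0 + math.exp(-x))

class Mixer:
    """Tiny logistic mixer: combines n model predictions with
    weights trained online by gradient descent on coding cost."""
    def __init__(self, n, lr=0.02):
        self.w = [0.0] * n   # one weight per model
        self.lr = lr         # learning rate

    def predict(self, probs):
        """Mix the models' P(bit=1) estimates into one probability."""
        self.st = [stretch(p) for p in probs]
        self.p = squash(sum(w * s for w, s in zip(self.w, self.st)))
        return self.p

    def update(self, bit):
        """After the actual bit is known, nudge the weights toward
        the models that predicted it well."""
        err = bit - self.p
        self.w = [w + self.lr * err * s
                  for w, s in zip(self.w, self.st)]
```

An SSE (secondary symbol estimation) stage would then refine the mixed probability by table lookup indexed on the prediction and a small context, before it goes to the arithmetic coder.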
PAQ8o8's processing time was tediously long.
Also, note that on my laptop (Core 2 Duo 2.2 GHz, 2 GB RAM) file writing speed is around 20-25 MB/second. So ACWAV's actual compression time is around 1 second!
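That figure can be checked with a back-of-envelope calculation. The 22.5 MB/s throughput and the assumption that both reading the input and writing the output go through the disk at that rate are mine:

```python
input_mb = 58.97    # the wav file (58.967.036 bytes)
output_mb = 22.36   # ACWAV's output (22.358.653 bytes)
io_rate = 22.5      # MB/s, midpoint of the quoted 20-25 MB/s
wall_time = 4.0     # ACWAV's reported wall-clock time, seconds

io_time = (input_mb + output_mb) / io_rate  # time spent just moving bytes
compute_time = wall_time - io_time          # what is left for actual compression
```

Depending on how much of the I/O is cached or overlapped with computation, the pure compression work comes out to roughly a second or less.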
Why don't you do some speed optimization on your LZ-based compressor? As you can see from the timings, some CM coders outperform your compressor in both time and final size. You may point to decompression speed, but if we consider total time (compression + transmission + decompression), a CM coder compares well with your BALZ. I think you should definitely do some speed optimization.
Also, I noticed that your compressor only shines on highly redundant files such as FP.log, while my CM coder (BIT, LWCX mode) does better on semi-incompressible files. Maybe this can be a pivot point for your optimizations.