Thanks for reminding me about that type of image - I'll try experimenting with it after finishing my current projects;
it really seems to be popular in games and the like.
Simple recompression would still be simple though (not like precomp, which decodes all the data, but like lzmarec) -
it would probably work already if we just added a bitwise rangecoder and basic contextual stats (with value type
and previous value's bits as context) to a format parser.
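Not lzmarec's actual code - just a minimal sketch of the kind of thing meant here, assuming an LZMA-style bitwise rangecoder with 12-bit adaptive bit models; the `probs` array plays the role of the "basic contextual stats", here indexed by a simple bit-tree context (all names are made up):

```python
# Sketch: LZMA-style bitwise range coder + adaptive bit models.
# Each coded bit nudges its context's 12-bit counter toward the
# observed value - that is the "contextual stats" part.

PROB_BITS = 12
PROB_ONE = 1 << PROB_BITS       # probability scale (4096)
ADAPT = 5                       # adaptation rate (shift)
TOP = 1 << 24                   # renormalization threshold

class Encoder:
    def __init__(self):
        self.low, self.rng = 0, 0xFFFFFFFF
        self.cache, self.pending = 0, 1
        self.out = bytearray()

    def _shift_low(self):
        # emit the top byte of low, propagating any carry into cached bytes
        if self.low < 0xFF000000 or self.low > 0xFFFFFFFF:
            carry = self.low >> 32
            self.out.append((self.cache + carry) & 0xFF)
            self.out.extend([(0xFF + carry) & 0xFF] * (self.pending - 1))
            self.cache, self.pending = (self.low >> 24) & 0xFF, 0
        self.pending += 1
        self.low = (self.low << 8) & 0xFFFFFFFF

    def encode(self, probs, ctx, bit):
        bound = (self.rng >> PROB_BITS) * probs[ctx]
        if bit == 0:
            self.rng = bound
            probs[ctx] += (PROB_ONE - probs[ctx]) >> ADAPT
        else:
            self.low += bound
            self.rng -= bound
            probs[ctx] -= probs[ctx] >> ADAPT
        while self.rng < TOP:
            self.rng = (self.rng << 8) & 0xFFFFFFFF
            self._shift_low()

    def flush(self):
        for _ in range(5):
            self._shift_low()
        return bytes(self.out)

class Decoder:
    def __init__(self, data):
        self.data, self.pos = data, 0
        self.rng, self.code = 0xFFFFFFFF, 0
        for _ in range(5):          # mirrors the encoder's 5-byte flush
            self.code = ((self.code << 8) | self._byte()) & 0xFFFFFFFF

    def _byte(self):
        b = self.data[self.pos] if self.pos < len(self.data) else 0
        self.pos += 1
        return b

    def decode(self, probs, ctx):
        bound = (self.rng >> PROB_BITS) * probs[ctx]
        if self.code < bound:
            bit, self.rng = 0, bound
            probs[ctx] += (PROB_ONE - probs[ctx]) >> ADAPT
        else:
            bit = 1
            self.code -= bound
            self.rng -= bound
            probs[ctx] -= probs[ctx] >> ADAPT
        while self.rng < TOP:
            self.rng = (self.rng << 8) & 0xFFFFFFFF
            self.code = ((self.code << 8) | self._byte()) & 0xFFFFFFFF
        return bit

# order-0 byte coder on top: one bit model per bit-tree node (ctx 1..255)
def encode_byte(enc, probs, b):
    ctx = 1
    for i in range(7, -1, -1):
        bit = (b >> i) & 1
        enc.encode(probs, ctx, bit)
        ctx = (ctx << 1) | bit

def decode_byte(dec, probs):
    ctx = 1
    for _ in range(8):
        ctx = (ctx << 1) | dec.decode(probs, ctx)
    return ctx & 0xFF

data = b"abracadabra" * 20
enc, probs = Encoder(), [PROB_ONE // 2] * 256
for b in data:
    encode_byte(enc, probs, b)
packed = enc.flush()

dec, probs = Decoder(packed), [PROB_ONE // 2] * 256
out = bytes(decode_byte(dec, probs) for _ in range(len(data)))
assert out == data
```

A real recompressor would select `ctx` from the parsed format's fields (value type, previous value's bits) instead of a plain bit tree, but the coder itself stays the same.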
this is 4x faster than RC, right?
no, it's just my more or less random estimate
or do you mean 4x if you have a fixed table, and that table is encoded
in the source code?
you said "Sure, _static_ huffman decoding is faster than rangecoder
decoding, about 4x maybe, but its only true for static huffman and
simple optimized coders"
and his context-adaptive huffman is your static huffman, just like,
say, jpeg or zip?
i meant that it's like that if we made an order-0 byte coder or
something like that
but it's much less certain with complex structured models
as i said, real adaptive huffman is always slower than arithmetic
(it's simply more complicated), but static huffman is only faster
when it can be properly optimized
i mean, otherwise it can have lots more branches than a rangecoder
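As a toy illustration of what "properly optimized" means here (a hypothetical 3-symbol code, nothing from the coder under discussion): a table-driven static Huffman decoder peeks the maximum code length worth of bits and resolves a whole symbol with one lookup, instead of branching per bit down a tree:

```python
# Hypothetical canonical-style code: a -> 0, b -> 10, c -> 11
MAXLEN = 2
CODES = {"a": (0b0, 1), "b": (0b10, 2), "c": (0b11, 2)}

# lut[window of MAXLEN bits] = (symbol, code length)
lut = [None] * (1 << MAXLEN)
for sym, (code, length) in CODES.items():
    pad = MAXLEN - length
    for fill in range(1 << pad):          # every completion of short codes
        lut[(code << pad) | fill] = (sym, length)

def decode(bits, n_symbols):
    """Decode n_symbols from a '0'/'1' string (MSB-first)."""
    out, pos = [], 0
    bits += "0" * MAXLEN                  # padding so the peek never runs short
    for _ in range(n_symbols):
        sym, length = lut[int(bits[pos:pos + MAXLEN], 2)]  # one lookup per symbol
        out.append(sym)
        pos += length
    return "".join(out)

# "baca" encodes to 10 0 11 0
assert decode("100110", 4) == "baca"
```

Without this table (bit-by-bit tree walking) the per-bit branches are exactly what can make static Huffman slower than a rangecoder.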
so was I right in saying you thought he was saying he had 'adaptive
huffman' as in Vitter? and you answered in those terms?
no, i thought that it was likely static (adaptive huffman is very
hard to implement; it's rare), but i was too lazy to check the
source, so i mentioned adaptive huffman too
anyway, my main point is that choosing an entropy coding type just
because it's "fast" is wrong. the right way imho is to use AC
first and build a good model, then optimize the speed if necessary
agree, you make a compelling case
convinced me :)
it may be possible to skip the first step in well-researched areas,
like when making a standard LZ77 coder
but lossless image coding is clearly not one of these
because the paq8 model only uses nearby points for prediction, and has
no colorspace transform or other picture-specific elements, but it's still
very good (at least in compression ratio) when compared with other
lossless image codecs
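paq8's actual image models are more elaborate than this, but as a standard example of what "only nearby points" prediction looks like, here is the JPEG-LS-style median (MED) predictor, which guesses a pixel from just its west, north, and north-west neighbors:

```python
def med_predict(w, n, nw):
    """JPEG-LS-style median edge detector: predict a pixel from only
    its west (w), north (n), and north-west (nw) neighbors."""
    if nw >= max(w, n):       # edge suspected: clamp to the smaller neighbor
        return min(w, n)
    if nw <= min(w, n):       # edge the other way: clamp to the larger one
        return max(w, n)
    return w + n - nw         # smooth area: planar prediction

# a coder would then model the residual (actual - predicted) with
# per-context adaptive stats, as discussed above
assert med_predict(10, 20, 5) == 20
```

The point stands either way: even neighbor-only models with no colorspace knowledge can be competitive if the context modeling is good.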