I realize most people here are interested in lossless codecs, but I have a question about lossless vs lossy decode for J2K.
I am finding that while a lossy encoded image may be half the size of a lossless encoded image, the time to decode
the lossy image is only about 10% lower (using the freely available Kakadu demo tool kdu_expand).
One difference between lossless and lossy is that lossy uses a more computationally complex wavelet transform than lossless
(the irreversible floating-point 9/7 rather than the reversible integer 5/3), and also needs to do dequantization. However,
the majority of the decode time should be spent in reverse entropy (EBCOT/MQ) coding, and the lossy codestream has roughly
half as many coded bits to entropy-decode, so my intuition tells me that lossy decode should be much faster than lossless.
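For concreteness, the transform difference I mean can be sketched with the 1-D lifting steps: two integer predict/update steps for the reversible 5/3 versus four floating-point lifting steps plus scaling for the irreversible 9/7. This is a rough NumPy sketch of my own, not Kakadu's implementation — the boundary extension and the 9/7 scaling convention are simplified relative to the standard:

```python
import numpy as np

# CDF 5/3 (reversible, integer) -- the JPEG 2000 lossless transform.
# Two lifting steps, integer adds and shifts only.
def fwd_53(x):
    s, d = x[0::2].astype(np.int64), x[1::2].astype(np.int64)
    se = np.append(s, s[-1])                  # simplified right extension
    d = d - (se[:-1] + se[1:]) // 2           # predict
    dl = np.insert(d, 0, d[0])                # simplified left extension
    s = s + (dl[:-1] + dl[1:] + 2) // 4       # update
    return s, d

def inv_53(s, d):
    dl = np.insert(d, 0, d[0])
    s = s - (dl[:-1] + dl[1:] + 2) // 4       # undo update
    se = np.append(s, s[-1])
    d = d + (se[:-1] + se[1:]) // 2           # undo predict
    x = np.empty(s.size + d.size, dtype=np.int64)
    x[0::2], x[1::2] = s, d
    return x

# CDF 9/7 (irreversible, float) -- the JPEG 2000 lossy transform.
# Four lifting steps plus a scaling pass: roughly twice the
# multiply/add work of the 5/3, all in floating point.
A, B = -1.586134342, -0.052980118
G, D, K = 0.882911076, 0.443506852, 1.230174105

def _lift_d(s, d, coef):                      # d[n] += coef*(s[n] + s[n+1])
    se = np.append(s, s[-1])
    return d + coef * (se[:-1] + se[1:])

def _lift_s(s, d, coef):                      # s[n] += coef*(d[n-1] + d[n])
    dl = np.insert(d, 0, d[0])
    return s + coef * (dl[:-1] + dl[1:])

def fwd_97(x):
    s, d = x[0::2].astype(np.float64), x[1::2].astype(np.float64)
    d = _lift_d(s, d, A); s = _lift_s(s, d, B)
    d = _lift_d(s, d, G); s = _lift_s(s, d, D)
    return s / K, d * K                       # scaling convention simplified

def inv_97(s, d):
    s, d = s * K, d / K
    s = _lift_s(s, d, -D); d = _lift_d(s, d, -G)
    s = _lift_s(s, d, -B); d = _lift_d(s, d, -A)
    x = np.empty(s.size + d.size)
    x[0::2], x[1::2] = s, d
    return x
```

So the lossy inverse transform is clearly more work per sample, but in my understanding that is still a small fraction of total decode time next to the bit-level entropy decoding.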
Any insights here would be greatly appreciated.