What do you think about it?

Originally Posted by encode:
lossless is lossless!
It would be fine if StuffIt gave information about that "feature", but AFAIK it doesn't.
What comes next? Every second pixel in a row vanishes? <Oops! But don't worry, now your picture fits even better on the screen!>
What happens to pictures that are compressed, decompressed, compressed again, and so on?
It's not that the image will change... only the file that's holding it.

Originally Posted by Vacon:
But what if I computed, for example, MD5 checksums and then sent them together with the archive to a group of friends?
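That concern is easy to make concrete: if the archiver rewrites the JPEG's file structure on extraction, any checksum taken on the original file will no longer match. A minimal sketch (the file names in the comment are hypothetical, just for illustration):

```python
import hashlib

def md5_of(path):
    """Stream the file through MD5 in chunks so large archives fit in memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical example: the original JPEG vs. the copy the archiver gave back.
# If the file structure was rewritten, the digests differ even though the
# decoded image looks identical:
#   md5_of("photo_original.jpg") != md5_of("photo_after_recompression.jpg")
```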
It should be made an option, like Infima's 'optimize source files'. Users should have a choice.
IMO, archivers that are not bit-perfect shouldn't be included in benchmark results, or else the size of a difference file should be added to their totals.
I was waiting to see who would say that cursed name first.

Originally Posted by donkey7:
I definitely agree!

Originally Posted by donkey7:
I think I'm not up to date with that problem. Was it in earlier StuffIt versions, too?
If anyone could send me example data (JPEGs, of course; before compression & after recompression), I could research that problem (why/how it happens).
I think there should be no change after a JPEG file has been recompressed once. If it is done again, there won't be any further change, right?
Btw, with packJPG it is also possible to test whether the pixel data is really unchanged between two JPEGs, regardless of file structure.
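Outside of packJPG's own verification, the same check can be sketched with any JPEG decoder: decode both files and compare the raw pixel data, ignoring headers and metadata. The example below uses Pillow (an assumption, not packJPG's method); note that comparing files decoded by *different* decoders may not be exact, since IDCT implementations can vary slightly, so use the same decoder for both files.

```python
import hashlib
from PIL import Image

def pixel_digest(path):
    """Decode the JPEG and hash only the raw RGB pixel data,
    ignoring headers, metadata, and other file-structure details."""
    with Image.open(path) as im:
        return hashlib.md5(im.convert("RGB").tobytes()).hexdigest()

def pixels_equal(path_a, path_b):
    """True if both JPEGs decode to identical pixels, even when the
    files themselves differ byte-for-byte (e.g. restructured headers)."""
    return pixel_digest(path_a) == pixel_digest(path_b)
```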
Not being "bit perfect" without carefully notifying the user is pretty severe.
Imagine an image inside a signed PDF. The SHA/MD5 fingerprints would no longer match.
I'd say Stuffit should be discarded from the test.
This is the kind of "feature" that should NOT be enabled by default.
It should be hidden under some advanced option, as the average user should not have to deal with it.
It's walking into a grey zone (meaning data probably lost in transition) to make this the default mode.