
Thread: How lossless PNG optimization can break images – is there a knowledge base?

  #1 – SolidComp (Member, USA)

    Every once in a while, a PNG image that I put through the popular lossless optimization/reduction tools will no longer display in some clients. It's rare enough that I never dug into it or isolated the causes. Has anyone put together a knowledge base, or some quick tips, on how lossless PNG optimization can break image display (which might be bugs in the viewer rather than the optimization tool)? I didn't find anything in the Encode forums, and nothing pops up on Google, even though the topic is fascinating.

    I'll collect more data, and try to isolate which tool+viewer combos have problems. But if there's already a knowledge base on the different lossless optimization techniques and how they interact with some viewers, I'd love to know about it.

    FYI, the info I have is limited, but here it is: sometimes images won't display in the newish Microsoft Photos app on Windows 10 desktop. I often put images through FileOptimizer, which pipes together OptiPNG, pngwolf, AdvanceCOMP, etc., so it's hard to isolate the tool causing the issue (which might be MS Photos' fault, not the tool's). Maybe 1 in 10 images, or fewer, that go through this pipeline won't display in MS Photos. This could be an MS Photos bug – I'll try more clients to make sure. I've also seen issues in browsers, but I didn't collect the data back then.

    Tangent question: Does maximum lossless optimization/reduction complicate the decoder's job or reduce the decoding/decompression rate (MB/sec)? In other deflate applications, like gzip, maximum compression imposes no penalty on decompression rate, but I'm not sure about everything going on with PNG encoding/decoding.

  #2 – pico (Member)
    There is/was an optimizer (I don't recall exactly which – maybe TruePNG?) that, when it changed the pixel format during optimization (e.g. from RGBA to palette), would not update the CRC checksums accordingly.
    So programs that verify the CRCs for data corruption (most don't) would refuse to display the image, because it looked corrupt, while programs that ignore the CRCs would display it without problems, since the image data itself was fine.
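
    For illustration, here is a minimal Python sketch of the per-chunk check a strict viewer performs – each PNG chunk carries a CRC-32 computed over its type and data, so a tool that rewrites chunk data without recomputing the CRC produces exactly this mismatch:

    Code:
        import struct, sys, zlib

        def check_png_crcs(path):
            # Walk the chunk stream: 4-byte length, 4-byte type, data, 4-byte CRC.
            with open(path, "rb") as f:
                if f.read(8) != b"\x89PNG\r\n\x1a\n":
                    print("not a PNG")
                    return
                while True:
                    header = f.read(8)
                    if len(header) < 8:       # truncated file
                        break
                    length = struct.unpack(">I", header[:4])[0]
                    ctype = header[4:8]
                    data = f.read(length)
                    stored = struct.unpack(">I", f.read(4))[0]
                    computed = zlib.crc32(ctype + data)   # CRC covers type + data
                    print(ctype.decode("latin-1"),
                          "ok" if stored == computed else "CRC MISMATCH")
                    if ctype == b"IEND":
                        break

        check_png_crcs(sys.argv[1])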


  #3 – Jyrki Alakuijala (Member, Switzerland)
    Quote Originally Posted by SolidComp View Post
    In other deflate applications, like gzip, maximum compression imposes no penalty on decompression rate, but I'm not sure about everything going on with PNG encoding/decoding.
    It is possible to send a PNG with a palette (up to 256 colors) and no filters. That means each index byte is just LZ77- and entropy-coded; typically no spatial filtering is used with palette images. A palette image is very fast to decode, possibly around twice as fast as an RGB image (my recent practical performance experience is from WebP, not PNG).

    It is possible that such a palette image becomes smaller when converted to RGB and spatially predicted, and it would be perfectly fine for a maximum-compression optimizer to do that conversion. But RGB means three times as much entropy coding/LZ77 work, and the filtering adds some more computation.

    I'd guess this rarely happens in practice.
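
    A toy Python sketch of the size/speed gap (synthetic data and raw deflate only – real PNG decoding adds filtering and chunk parsing on top, so treat the numbers as illustrative):

    Code:
        import time, zlib

        W = H = 256
        # Synthetic 16-color pattern; each pixel is a palette index in 0..15.
        indices = bytes(((x // 16) ^ (y // 16)) & 0x0F
                        for y in range(H) for x in range(W))
        palette = [(i * 16, i * 8, 255 - i * 16) for i in range(16)]
        # The same image expanded to 3 bytes per pixel.
        rgb = b"".join(bytes(palette[i]) for i in indices)

        for name, raw in (("palette, 1 B/px", indices), ("RGB, 3 B/px", rgb)):
            comp = zlib.compress(raw, 9)
            t0 = time.perf_counter()
            for _ in range(200):
                zlib.decompress(comp)
            dt = time.perf_counter() - t0
            print(f"{name}: {len(raw)} raw -> {len(comp)} deflated, "
                  f"200 decodes in {dt * 1000:.1f} ms")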


  #4 – Mike (Member, France)
    Use pngcheck to detect corrupted files!
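    For example, pngcheck -v image.png lists every chunk and flags any CRC errors, which makes it easy to spot the mismatches described above.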


  #5 – SolidComp (Member, USA)
    Quote Originally Posted by pico View Post
    There is/was an optimizer (I don't recall exactly which – maybe TruePNG?) that, when it changed the pixel format during optimization (e.g. from RGBA to palette), would not update the CRC checksums accordingly.
    So programs that verify the CRCs for data corruption (most don't) would refuse to display the image, because it looked corrupt, while programs that ignore the CRCs would display it without problems, since the image data itself was fine.
    Good to know, thanks. I'm going to research that a bit. It reminds me of the variance I've seen in SVG rendering across browsers lately, especially after using SVGO – some browsers would display nothing at all.

  #6 – SolidComp (Member, USA)
    Quote Originally Posted by Jyrki Alakuijala View Post
    It is possible to send a PNG with a palette (up to 256 colors) and no filters. That means each index byte is just LZ77- and entropy-coded; typically no spatial filtering is used with palette images. A palette image is very fast to decode, possibly around twice as fast as an RGB image (my recent practical performance experience is from WebP, not PNG).

    It is possible that such a palette image becomes smaller when converted to RGB and spatially predicted, and it would be perfectly fine for a maximum-compression optimizer to do that conversion. But RGB means three times as much entropy coding/LZ77 work, and the filtering adds some more computation.

    I'd guess this rarely happens in practice.
    That's so fascinating! And it's exactly the kind of quirk I expect to find down in the weeds of encoding/decoding. I just learned a few months ago that there's an appreciable difference between JPEG and PNG decoding speed for most apps – JPEG is faster. (The more I learn about JPEG, the more impressed I am; it was such a great achievement in the 1990s, and it still stands out.) WebP is apparently slower than PNG, and I hope you guys can do more about that in future releases. There needs to be a way to SIMD- or GPU/OpenCL-accelerate it the way there is with JPEG and JPEG XR. (Well, I was reading Colt's post about how GPU compression formats are much better than PNG, so I'm confused about why we don't just migrate to GPU formats.)

  #7 – SolidComp (Member, USA)
    Quote Originally Posted by Mike View Post
    Use pngcheck to detect corrupted files!
    Thanks! I'd never heard of it. That old-school libpng website is full of little treasures...

  #8 – Jyrki Alakuijala (Member, Switzerland)
    Quote Originally Posted by SolidComp View Post
    WebP is apparently slower than PNG
    WebP lossy is definitely slower to decode than PNG and JPEG. Old-school arithmetic coding is just slower than Huffman coding.

    However, on purely technical grounds, WebP lossless should be faster than PNG. Much of the processing in WebP lossless, including the LZ77, operates on uint32_t rather than uint8_t, which gives quite a significant speed boost. Also, the most favorable filter (Paeth in PNG, Select in WebP) has four branches per RGBA pixel in PNG and only one branch in WebP. A smaller file size should also mean less entropy-coding cost.
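
    For reference, this is the Paeth predictor from the PNG spec, transcribed directly into Python; it runs once per byte, so four times per RGBA pixel, which is where the branching cost comes from:

    Code:
        def paeth_predictor(a, b, c):
            # a = left byte, b = above byte, c = upper-left byte (PNG spec).
            p = a + b - c                      # initial estimate
            pa, pb, pc = abs(p - a), abs(p - b), abs(p - c)
            if pa <= pb and pa <= pc:          # ties break in the order a, b, c
                return a
            if pb <= pc:
                return b
            return c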

    The fastest somewhat-lossy decoding for WebP is the delta palettization mode. However, that mode is just a hack and cannot support the alpha channel; sometimes that is still acceptable. Also, all near-lossless settings will decode faster than pure lossless.

    I fully agree with you that JPEG was a huge achievement – it efficiently combines knowledge from many different areas. We are hoping next to do for JPEG what we did for PNG and gzip with ZopfliPNG and Zopfli. We will soon, hopefully within the next six months, launch a butteraugli-based JPEG encoder that finds the smallest image with a butteraugli score below 1.0. The butteraugli-encoded JPEGs will likely be 25+ % smaller, and decode 25+ % faster, than libjpeg-encoded images with a similar butteraugli score. Butteraugli targets relatively high-quality images (libjpeg quality 92–97).
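
    A conceptual Python sketch of that kind of target-driven search – bisect the quality setting until the metric lands just under the threshold. encode_jpeg() and butteraugli_score() here are hypothetical stand-ins, not the real butteraugli API, and the search assumes the score falls monotonically as quality rises:

    Code:
        def smallest_jpeg_under_target(image, target=1.0, lo=50, hi=100):
            # Hypothetical sketch: encode_jpeg() and butteraugli_score()
            # are stand-ins, not the actual butteraugli API.
            best = None
            while lo <= hi:
                q = (lo + hi) // 2
                data = encode_jpeg(image, quality=q)
                if butteraugli_score(image, data) <= target:
                    best = data        # acceptable quality: try a lower setting
                    hi = q - 1
                else:
                    lo = q + 1         # too distorted: raise the quality
            return best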


