Page 1 of 2
Results 1 to 30 of 39

Thread: WEBP - how to improve it?

  1. #1
    Tester
    Stephan Busch's Avatar
    Join Date
    May 2008
    Location
    Bremen, Germany
    Posts
    872
    Thanks
    457
    Thanked 175 Times in 85 Posts

    Question WEBP - how to improve it?

    While WEBP was developed to make browsing of image-intensive sites faster, it is interesting
    because it introduces a new open-source, royalty-free lossless compression mode that outperforms
    .PNG and animated .GIF, and it may be capable of replacing both formats one day.
    This is our chance for a free image format, in the same way FLAC is free for lossless audio.

    In my tests, WEBP 0.3.1 compresses 24-bit RGB about as well as JPEG 2000, while on palette
    images it is worse than PNG.

    WEBP is also much slower than JPEG, GIF and PNG at encoding and decoding,
    and its compression is not good enough for my taste. While no one would donate their own
    lossless image codec to Google just to become famous, I want to ask our professionals here:
    what would you change to make WEBP faster or to improve its lossless compression?
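
    A quick way to reproduce this kind of lossless comparison is Pillow, which can encode the same image both as PNG and as lossless WebP. This is only a sketch under assumptions: it requires a Pillow build with WebP support, and the synthetic gradient stands in for real test images, so sizes will not match the benchmark figures above.

    ```python
    # Sketch of a lossless size comparison (PNG vs. WebP) using Pillow.
    # Assumes Pillow was built with WebP support; the gradient image is a
    # stand-in for real photographic or palette test content.
    import io
    from PIL import Image

    def encoded_size(img, fmt, **opts):
        """Encode `img` in memory and return the byte count."""
        buf = io.BytesIO()
        img.save(buf, format=fmt, **opts)
        return buf.tell()

    # A small synthetic RGB gradient instead of a real test photo.
    img = Image.new("RGB", (256, 256))
    img.putdata([(x, y, (x + y) % 256) for y in range(256) for x in range(256)])

    png_bytes  = encoded_size(img, "PNG", optimize=True)
    webp_bytes = encoded_size(img, "WEBP", lossless=True)
    print(f"PNG: {png_bytes} bytes, lossless WebP: {webp_bytes} bytes")
    ```

    Running the same measurement over a directory of real photos and palette images is what separates the two cases described above.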

  2. #2
    Member Alexander Rhatushnyak's Avatar
    Join Date
    Oct 2007
    Location
    Canada
    Posts
    232
    Thanks
    38
    Thanked 80 Times in 43 Posts
    Quote Originally Posted by Stephan Busch View Post
    WEBP is also much slower than JPEG, GIF and PNG at encoding and decoding
    If you mean regular lossless Jpeg (and there's no 'lossy' in your message), then WEBP decodes ~27% slower (in cases where speed is noticeable: big photos). Not that much, I would say. But if you mean Jpeg2k, then WEBP decodes ~2.5 times faster than a highly optimized commercial Jpeg2k implementation: http://imagecompression.info/gralic/LPCB.html

    While no one would donate their own lossless image codec to Google
    Do you think Googlers would appreciate such a donation?

    This LinkedIn group is dedicated to image compression:
    http://linkedin.com/groups/Image-Compression-3363256

  3. #3
    Member
    Join Date
    Jun 2013
    Location
    USA
    Posts
    98
    Thanks
    4
    Thanked 14 Times in 12 Posts
    Quote Originally Posted by Stephan Busch View Post
    WEBP is also much slower than JPEG, GIF and PNG at encoding and decoding,
    and its compression is not good enough for my taste.
    I'm not a professional or expert, but slower compared to what? pngout and zopflipng are extremely slow, for example, and the same goes for jpegrescan for JPEGs. There are also JPEGs generated with libjpeg versus libjpeg-turbo (the SSE variant), which have significant performance differences.
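
    The point generalizes: encode speed depends heavily on how hard the encoder is asked to search. This is not WebP itself, just a stdlib-only sketch of the same trade-off using zlib (the deflate coder behind PNG) at different effort levels; the input data is an arbitrary compressible stand-in.

    ```python
    # Speed vs. ratio trade-off at different zlib effort levels (the deflate
    # coder PNG uses). Illustrative only; not a WebP benchmark.
    import time
    import zlib

    data = b"example pixel row " * 50_000  # highly compressible stand-in data

    for level in (1, 6, 9):
        t0 = time.perf_counter()
        out = zlib.compress(data, level)
        dt = time.perf_counter() - t0
        print(f"level {level}: {len(out):7d} bytes in {dt * 1000:.1f} ms")
    ```

    Comparing "codec A vs. codec B" without pinning down the effort setting on each side measures the settings, not the codecs.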

    There may be speed to be gained by using SIMD, which I did not see the last time I looked at the WebP source code. I could be wrong, though.

    Do you think Googlers would appreciate such a donation?


    Not an expert, but I doubt it. They seem to be really stuck on using LZ77 for everything lossless (SPDY, zopfli, WebP lossless, maybe other stuff...).
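
    For illustration, here is a toy version of the LZ77 scheme that underlies deflate/zopfli and the backward references in WebP lossless. The token format, window size, minimum match length, and greedy search below are arbitrary choices for the sketch, not any real bitstream.

    ```python
    # Toy LZ77: emit (distance, length) back-references or literal bytes.
    # Parameters and token format are illustrative, not any real format.
    WINDOW = 4096
    MIN_MATCH = 3

    def lz77_compress(data: bytes):
        """Greedy LZ77: longest earlier match wins, else emit a literal."""
        out, i = [], 0
        while i < len(data):
            best_len, best_dist = 0, 0
            start = max(0, i - WINDOW)
            # Naive O(n^2) search for the longest earlier match (slow but clear).
            for j in range(start, i):
                k = 0
                while i + k < len(data) and data[j + k] == data[i + k]:
                    k += 1
                if k > best_len:
                    best_len, best_dist = k, i - j
            if best_len >= MIN_MATCH:
                out.append((best_dist, best_len))
                i += best_len
            else:
                out.append(data[i])  # literal byte
                i += 1
        return out

    def lz77_decompress(tokens):
        out = bytearray()
        for t in tokens:
            if isinstance(t, tuple):
                dist, length = t
                for _ in range(length):   # byte-by-byte copy handles overlaps
                    out.append(out[-dist])
            else:
                out.append(t)
        return bytes(out)
    ```

    Real codecs follow the matcher with an entropy coder (Huffman in deflate and WebP lossless), which is exactly the stage a context-mixing design would replace.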

  4. #4
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    437
    Thanks
    1
    Thanked 96 Times in 57 Posts
    Quote Originally Posted by Alexander Rhatushnyak View Post
    If you mean regular lossless Jpeg (and there's no 'lossy' in your message), then WEBP decodes ~27% slower (in cases where speed is noticeable: big photos). Not that much, I would say.
    Lossless JPEG (predictive, scanline-based, i.e. 10918-1) in which implementation? Note well that my implementation is not particularly fast (and was not meant to be fast in the first place); one can do a lot better than that.
    Quote Originally Posted by Alexander Rhatushnyak View Post
    But if you mean Jpeg2k, then WEBP decodes ~2.5 times faster than a highly optimized commercial Jpeg2k implementation
    May I ask what this implementation is? As far as JPEG 2000 is concerned, we have a really good one here, and it's not really that much slower than JPEG; it's not the "ten times slower" people like to claim. What can be said, however, is that for JPEG 2000 the speed depends much more on image quality than it does for JPEG. Anyhow, I don't think Google actually cares. I don't see the point of webp either: it is not much better than JPEG, nor much faster, nor does it have more features. If you need the features, go JPEG 2000. If you just need an image, as in most web applications, JPEG does fine. If you need HDR photography, wait for JPEG XT.

  5. #5
    Tester
    Stephan Busch's Avatar
    Join Date
    May 2008
    Location
    Bremen, Germany
    Posts
    872
    Thanks
    457
    Thanked 175 Times in 85 Posts
    I will add encoding timings to my lossless image compression benchmark.
    And I believe they will at least read what we write here, and maybe they will appreciate such donations.

    At least WEBP is an alternative to .GIF and .PNG, and it also supports lossless and lossy animation.
    It is an open and free format, and if its development goes like that of Android, it might one day
    be a good choice and could compete with other formats. It could become for images what FLAC is
    for lossless audio, and maybe an open and free replacement for Adobe DNG and .TIFF.

  6. #6
    Member
    Join Date
    Feb 2013
    Location
    Earth
    Posts
    2
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by Alexander Rhatushnyak View Post
    If you mean regular lossless Jpeg (and there's no 'lossy' in your message), then WEBP decodes ~27% slower (in cases where speed is noticeable: big photos). Not that much, I would say. But if you mean Jpeg2k, then WEBP decodes ~2.5 times faster than a highly optimized commercial Jpeg2k implementation: http://imagecompression.info/gralic/LPCB.html
    Also, those tests were run on an old version of WebP. I have no idea whether there is a performance difference between 0.2.0 and 0.3.1 (current), but since the last official release there has been a large boost in lossless decoding speed.

  7. #7
    Member
    Join Date
    May 2007
    Location
    Poland
    Posts
    85
    Thanks
    8
    Thanked 3 Times in 3 Posts
    WEBP is not very good, and it will not win the web over. I am waiting for an H.265-based image codec; with a bit of luck there will be a common standard and at least one open-source implementation.

  8. #8
    Member
    Join Date
    Feb 2012
    Location
    Sweden
    Posts
    59
    Thanks
    4
    Thanked 5 Times in 5 Posts
    Quote Originally Posted by jethro View Post
    WEBP is not very good. It will not win the web over. I am waiting for H.265 based image codec. With a bit of luck there will be a common standard and at least one open source implementation.
    I doubt webp will win the web over given how entrenched jpeg is (png hasn't even wiped out gif yet), but an h265-based image format is dead on arrival as a replacement, given that it will be heavily patent/royalty encumbered.

    And while webp probably doesn't have enough of an advantage over jpeg to ever replace it, it IS better than jpeg and it's royalty-free. Also, the lossless mode compares very favourably against png in most of my tests.

    Then we have VP9, which might be used for the next iteration of webp and should bring some further improvements (although obviously most of the improvements in VP9 over VP8 are targeted directly at video).

  9. #9
    Tester
    Stephan Busch's Avatar
    Join Date
    May 2008
    Location
    Bremen, Germany
    Posts
    872
    Thanks
    457
    Thanked 175 Times in 85 Posts
    I agree that no H.264 or H.265 derivative will replace any picture format, because of licensing issues.
    And I really like WEBP, because it's better than PNG and JPEG on the lossless side and far better than GIF (also for animations).
    It is also better than JXR. I don't care about lossy performance, but I guess it's very hard to replace JPEG (every codec that attempted to replace JPEG is already gone or on its way out).

    If they decide not to change from VP8 to VP9, and if they won't replace Color-Transform+LZ77+Huffman with fast CM, they might take a look at open and royalty-free alternatives such as Daala (https://xiph.org/daala/).
    For me it's interesting to see it get better and better with each build.
    The updated encoding timings are available at http://www.squeezechart.com
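
    The "Color-Transform" stage of that pipeline includes, in WebP lossless (VP8L), a subtract-green transform that decorrelates red and blue from green before the LZ77 and entropy coding stages. A minimal sketch of the idea, using modulo-256 arithmetic as in the spec (the real transform operates on full ARGB pixel data, and VP8L also has other, block-based color transforms):

    ```python
    # Sketch of VP8L's subtract-green transform: red and blue are predicted
    # from green modulo 256, which decorrelates the channels losslessly.
    def subtract_green(pixels):
        """pixels: list of (r, g, b) tuples with 0-255 components."""
        return [((r - g) % 256, g, (b - g) % 256) for r, g, b in pixels]

    def add_green(pixels):
        """Exact inverse: add green back modulo 256."""
        return [((r + g) % 256, g, (b + g) % 256) for r, g, b in pixels]
    ```

    Because the transform is exactly invertible, it costs nothing in fidelity; it just makes the residuals more compressible for the back-end coder.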

  10. #10
    Member
    Join Date
    May 2007
    Location
    Poland
    Posts
    85
    Thanks
    8
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by Stephan Busch View Post
    I agree that no H.264 or H.265 derivative will replace any picture format, because of licensing issues.
    And I really like WEBP, because it's better than PNG and JPEG on the lossless side and far better than GIF (also for animations).
    It is also better than JXR. I don't care about lossy performance, but I guess it's very hard to replace JPEG (every codec that attempted to replace JPEG is already gone or on its way out).

    If they decide not to change from VP8 to VP9, and if they won't replace Color-Transform+LZ77+Huffman with fast CM, they might take a look at open and royalty-free alternatives such as Daala (https://xiph.org/daala/).
    For me it's interesting to see it get better and better with each build.
    The updated encoding timings are available at http://www.squeezechart.com
    Lossless on the web? I am not sure; even for some page elements I think everyone will stick to gif/png. For raw camera images webp stores only 8 bits per channel, so it won't be lossless. H.264 is free to use; AFAIK even Google does not pay a cent for all the H.264 encoding on YouTube. I think I read that H.265 would be free to use too, and H.265 is/will be a great codec. VP9 is super slow, and it will take a couple of years for its speed to improve. Plus, it is not really better than x264 now, despite being 100x slower. Daala is like 5 years away from being truly useful, probably more, knowing how long it takes to make a good codec implementation.
    Also http://xhevc.com/en/imageCompression...ompression.jsp
    Last edited by jethro; 31st July 2013 at 20:11.

  11. #11
    Member
    Join Date
    Feb 2012
    Location
    Sweden
    Posts
    59
    Thanks
    4
    Thanked 5 Times in 5 Posts
    Quote Originally Posted by jethro View Post
    H.264 is free to use, AFAIK even google does not pay a cent anywhere for all the H.264 encoding on youtube.
    Of course they pay. Only if you make no money from a site that serves h264 video are you (currently) allowed to do so for free. Google makes money from YouTube, as does any other video site with even an ad banner. Why do you think Google spent so much money developing their own video codec to begin with?

    Quote Originally Posted by jethro View Post
    I think I read that H.265 would be free to use too.
    You are deluded; h264 isn't free, and neither will h265 be. It isn't even covered by the 'free unless you make money' promise as of yet.

    Quote Originally Posted by jethro View Post
    VP9 is super slow and it will take a couple of years for it to improve speed.
    Yes, the encoder is currently very slow, and the same goes for the h265 encoders out now, but it will not 'take a couple of years to improve speed': optimization was pointless until the bitstream was finalized, which it now has been, and optimizations are underway.

    Quote Originally Posted by jethro View Post
    Plus it is not really better than x264 now despite being 100x slower.
    It's already better than x264 for HD content in many tests; you seem utterly clueless:

    http://forum.doom9.org/showthread.ph...37#post1636137

    And it has been improved further since this test was done.

    Again, an h265-based image format could never replace jpeg on the web. In order to replace jpeg you have to have something everybody can use without paying royalties, and unless MPEG LA grants perpetual royalty-free rights to use h265 technology for image compression (not likely!), there's no way it will be used online. Currently that mainly leaves webp, which is better than jpeg and png in most tests I've done, but even so it would likely have to be much better still to gain traction, as jpeg is simply 'good enough' for people and works everywhere in everything.

  12. The Following User Says Thank You to binarysoup For This Useful Post:

    Black_Fox (31st July 2013)

  13. #12
    Member
    Join Date
    May 2007
    Location
    Poland
    Posts
    85
    Thanks
    8
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by binarysoup View Post
    Of course they pay. Only if you make no money from a site that serves h264 video are you (currently) allowed to do so for free. Google makes money from YouTube, as does any other video site with even an ad banner. Why do you think Google spent so much money developing their own video codec to begin with?

    You are deluded; h264 isn't free, and neither will h265 be. It isn't even covered by the 'free unless you make money' promise as of yet.

    Yes, the encoder is currently very slow, and the same goes for the h265 encoders out now, but it will not 'take a couple of years to improve speed': optimization was pointless until the bitstream was finalized, which it now has been, and optimizations are underway.

    It's already better than x264 for HD content in many tests; you seem utterly clueless:

    http://forum.doom9.org/showthread.ph...37#post1636137

    And it has been improved further since this test was done.

    Again, an h265-based image format could never replace jpeg on the web. In order to replace jpeg you have to have something everybody can use without paying royalties, and unless MPEG LA grants perpetual royalty-free rights to use h265 technology for image compression (not likely!), there's no way it will be used online. Currently that mainly leaves webp, which is better than jpeg and png in most tests I've done, but even so it would likely have to be much better still to gain traction, as jpeg is simply 'good enough' for people and works everywhere in everything.
    Don't call me clueless if you are not so amazing at understanding the matter yourself. I saw that game test on doom9 too, and others, and I also made my own little test. I think h.265 makes perfect sense for still images. It is similar in looks to the DLI image codec, which is the best IMHO as of yet. The site I linked has a decently fast hevc encoder, already capable of encoding 1080p video at over 1 fps on very slow settings. What you don't get is that Google can't force other, often competing, companies to support their open-source codecs. HEVC is an industry standard. It will be everywhere. It is basically the next big thing in video. VP9 is far from ready, and VP8 never really was.
    Last edited by jethro; 31st July 2013 at 23:56.

  14. #13
    Tester
    Black_Fox's Avatar
    Join Date
    May 2008
    Location
    [CZE] Czechia
    Posts
    471
    Thanks
    26
    Thanked 9 Times in 8 Posts
    Which web browser and image viewer on Windows 7 SP1 and on Ubuntu 12.04.2 currently supports which of these?
    - H.265-encoded images
    - H.265-encoded video
    - VP9-encoded images
    - VP9-encoded video

    Sorry if I generalize a bit and the format names are incorrect; I'm no expert, only a target of the industry. I can see that wars over the effectiveness of the formats have already begun, so I wonder whether any user can actually benefit right now, or whether all these arguments are rather academic.

    Google can't force other, often competing companies to support their open source codecs. HEVC is industry standard. It will be everywhere.
    Forcing a licensed technology as an industry standard is not best practice. See Rambus in the DDR era.
    I am... Black_Fox... my discontinued benchmark
    "No one involved in computers would ever say that a certain amount of memory is enough for all time? I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There's never a citation; the quotation just floats like a rumor, repeated again and again." -- Bill Gates

  15. #14
    Member
    Join Date
    May 2007
    Location
    Poland
    Posts
    85
    Thanks
    8
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by Black_Fox View Post
    Which web browser and image viewer on Windows 7 SP1 and on Ubuntu 12.04.2 currently supports which of these?
    - H.265-encoded images
    - H.265-encoded video
    - VP9-encoded images
    - VP9-encoded video
    Unless one counts custom encoder-decoder pairs, the situation is like this (all AFAIK, of course):
    1. none (there is not even a standard for it yet)
    2. none (you can play the video in a player like MPC, though - Windows only)
    3. none (same as 1, but Google could adopt the WEBP format)
    4. Chrome 29 (but support is still a bit buggy)
    Last edited by jethro; 1st August 2013 at 00:59.

  16. The Following User Says Thank You to jethro For This Useful Post:

    Black_Fox (1st August 2013)

  17. #15
    Member
    Join Date
    Feb 2012
    Location
    Sweden
    Posts
    59
    Thanks
    4
    Thanked 5 Times in 5 Posts
    Quote Originally Posted by jethro View Post
    I think h.265 makes prefect sense for still images.
    Oh, I'm sure it could make a great basis for image compression; vp8 showed with webp that a video codec can function very well as the basis for one. I was saying that while webp is free and better than jpeg, it still has a very small chance of actually replacing jpeg on the web, whereas the h.265-based image codec you proposed has zero chance at all, due to being patent/royalty encumbered.

    Quote Originally Posted by jethro View Post
    What you don't get is Google can't force other, often competing companies to support their open source codecs.
    What does this have to do with anything we discussed?

    Quote Originally Posted by jethro View Post
    HEVC is industry standard. It will be everywhere. It is basicaly the next big thing in video.
    What does this have to do with h.265 as a possible image codec and its chances of replacing jpeg on the web? Again, sites won't switch from jpeg to an image format they have to pay money for (probably not even to one that is free), web browsers won't add support for an image format they have to pay money for, etc.

    Quote Originally Posted by jethro View Post
    VP9 is far from ready and VP8 never really was.
    The VP9 specification is finalized, and just like with h.265 (which is also finalized), it's now about working on the implementations so that they deliver the best quality the specification can muster. This will take time for both VP9 and h.265, as we've seen by comparing the first h.264 implementations against the current king of the hill, x264. But even at this early stage, both VP9 and h.265 show great promise.

    Unfortunately, it seems you are one of those crazy fanboys who 'identify with a video codec', so I'm probably wasting my time trying to have an objective discussion. For the rest of us, VP9 and h.265 are strong new video codecs, and it will be very interesting to follow the quality their respective implementations deliver now that the optimization and fine-tuning stage has begun.

  18. #16
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,471
    Thanks
    26
    Thanked 120 Times in 94 Posts
    When HTML5, WebGL and WebCL are widely available, what would be a reason not to use gazillions of proprietary formats that get decoded and displayed in the client's browser using WebCL and WebGL? Just decode an image using WebCL, forward it to WebGL as a texture, and display it on an HTML5 canvas - voila! (Or something like that; I'm no HTML expert.)

  19. #17
    Programmer osmanturan's Avatar
    Join Date
    May 2008
    Location
    Mersin, Turkiye
    Posts
    651
    Thanks
    0
    Thanked 0 Times in 0 Posts
    @Piotr:
    You don't need such high requirements just to decode WebP images:
    JavaScript WebP Decoder: http://webpjs.appspot.com/

    WebM is different, though, because you have to draw each frame yourself (it requires the Canvas element, which is specific to HTML5):
    JavaScript WebM Decoder: http://badassjs.com/post/17218459521...script-for-all
    BIT Archiver homepage: www.osmanturan.com

  20. #18
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,471
    Thanks
    26
    Thanked 120 Times in 94 Posts
    Osman:
    Yeah, it's already available, but it is prohibitively inefficient. Desktop PCs are in decline, and most people browse the Internet from laptops, I think. (Most recently, tablets are also getting popular. Phones don't need anything above JPEG for images, IMHO, as nobody is going to notice artifacts on such small screens :]) For bigger displays/images there is more work to do, so JavaScript decoders would both shorten battery life (on unplugged devices) and increase heat and noise. WebCL should improve efficiency to the level where it's actually feasible to decode lots of custom-format images.

  21. #19
    Member
    Join Date
    May 2007
    Location
    Poland
    Posts
    85
    Thanks
    8
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by binarysoup View Post
    Oh I'm sure it could make a great basis for an image compression, vp8 showed that a video codec could function very well as the basis for image compression with webp. I was saying that while webp is free and better than jpeg it still has a very small chance of actually replacing jpeg on the web, meanwhile a h.265 based image codec which you proposed has zero chance at all of replacing jpeg on the web due to it being patent/royalty encumbered.
    Did you use UCI (Ultra Compact Image) by Dwing? It is x264-based image encoding, first released before WebP was announced, with (much) better quality than WebP's lossy mode. I can link it here if you want to try it and can't find it.
    Quote Originally Posted by binarysoup View Post
    What does this have to do with anything we discussed?
    Would you like to open your WebP images easily on your iPhone? Would a web graphic designer care much about WebP if his professional $$$ software does not have the WebP format in its 'Save as...' list?

    Quote Originally Posted by binarysoup View Post
    What does this have to do with h.265 as an possible image codec and it's chances of replacing jpeg on the web? Again, sites won't switch from jpeg to a image format for which they will have to pay money for (probably not even for a format which is free), web browsers won't add support for a image format they have to pay money for, etc.
    A lot. Think VP8 => WebP. IF there is an HEVC image format in the future, there will be free decoding for sure, and most likely free encoding too (as x264 is free as in beer for users). So I don't understand your 'pay money' conviction. Apropos money: you were right, Google does pay for YouTube, but it is around $6.5 million annually, which for them is like nothing (a percent of a percent of profits). Also, they likely bought On2 Technologies with VP8 as a backup codec, just in case. The VP8 team was very small, a couple of programmers at a time.

    Quote Originally Posted by binarysoup View Post
    VP9 specfication is finalized, and just like with h.265 (which is also finalized) it's now about working on the implementations so that they deliver the best quality the specification can muster. This will take time both for VP9 and h.265, as we've seen by simply comparing the first h.264 implementations against the current king of the hill, x264. But even at the current early stage both VP9 and h.265 shows great promise.
    Well, you could think like that. However, note that HEVC is better than VP9 both subjectively and objectively; even Google admits they did not achieve parity. And HEVC will stay ahead of VP9 due to its better design (call that my uneducated opinion).

    Quote Originally Posted by binarysoup View Post
    Unfortunately it seems you are one of them crazy fanboys who 'identify with a video codec' so I'm probably wasting my time trying to have an objective discussion. For the rest of us, VP9 and h.265 are strong new video codecs and it will be very interesting to follow the quality their respective implementations will deliver now that the optimization and fine tuning stage has begun.
    lol, who is objective? You link to one test, with very specific content, which is actually the best case for VP9 vs x264. I am disheartened with Google, you could say that about me. They sat on VP8 for so long and it was still barely usable. I even made my own VP8 builds, but hardly anything changed, feature- or quality-wise. VP9 might produce better quality than x264 in the future, especially with some specific (high-motion?) content, but unless Google starts to be serious, VP9 will always be slow, hard to use, and unsupported.
    Last edited by jethro; 1st August 2013 at 23:18.

  22. #20
    Member
    Join Date
    Feb 2012
    Location
    Sweden
    Posts
    59
    Thanks
    4
    Thanked 5 Times in 5 Posts
    Quote Originally Posted by jethro View Post
    Did you use UCI (Ultra Compact Image) by Dwing? It is x264 based image encoding.
    I read about that MANY years ago and hadn't heard anything about it since, until you brought it up (is it even in development any longer?), which just goes to prove my case: better quality isn't enough to make a splash in the image codec space.


    Quote Originally Posted by jethro View Post
    Would you like to open your WebP images easily on your iphone? Would a web graphic designer care much about WebP if his professional $$$ software does not have WebP format in the 'Save as..' list?
    Again you are making my point for me. Like I've been saying, there is no real tangible interest in replacing jpeg as the standard for lossy image encoding. Jpeg is 'good enough' in terms of storage size versus quality, and there is no pent-up demand for a new format: jpeg is supported everywhere by everything, and the savings offered by new formats are very small given today's typical storage capacities. End users just won't see them justify the hassle of a 'new format'.

    Web providers, on the other hand, might. We've seen attempts to cut bandwidth costs by using a more effective format, but users weren't happy ('wtf is this webp?'), as seen in the Facebook debacle when they started pushing webp.

    And the ONLY reason Facebook gave webp a shot was because it was FREE; there's no chance in hell they would have tried this had they been forced to pay royalties.

    As I said, only a free image format like webp would have any chance of replacing jpeg, but even that is very unlikely to be enough.

    Quote Originally Posted by jethro View Post
    there will be free decoding for sure.
    Why would there be 'free decoding for sure'? An h.265-based image codec spec will almost certainly not come from the MPEG LA members, as they've never shown any interest in an image codec (no money in it), and MPEG LA is the only entity that could make it 'free for decoding'.

    Quote Originally Posted by jethro View Post
    A propos money, you were right, Google does pay for youtube but it is around 6,5 mln $ annually which for them is like nothing (percent of a percent of profits).
    It doesn't matter what Google pays 'now' (if those numbers are even remotely correct - source?). If MPEG LA manages to monopolise the entire video codec market (which they were practically doing before VP8 and later VP9 emerged), they can set prices as they wish.

    And 'online video' is the entire future of video, with practically every device capable of playing and even recording it. As such, it makes perfect sense for a large entity like Google to want their own video codec, not only to minimize the cost of licensing other codecs in the long run, but also to have a codec they can develop in the direction best suited to their products (real-time obviously being a priority).

    Quote Originally Posted by jethro View Post
    Also they likely bought On2 Tech with VP8 as a backup codec, just in case. The VP8 team was very small, couple of programmers at a time.
    I have no knowledge of the VP8 team setup (can you point me to a reference?), but the VP9 team isn't particularly huge judging by the commit stats (I don't know what you're comparing to - again, any references?).

    But that's not what is important: the x264 core devs weren't a huge group of people, but they were very good at what they did, and, for example, Ronald Bultje of x264 is now working on VP9, employed by Google.

    Quote Originally Posted by jethro View Post
    And HEVC will stay ahead of VP9 due to better design (call that my uneducated opinion).
    I fully expect h.265 to end up better than VP9 as well once the implementations mature (due to the broad MPEG LA patent pool); however, I don't expect the differences to be 'dramatic', and for commercial use I can see myself choosing VP9 over h.265, as VP9 is free of any royalty cost.

    Meanwhile, it is too early to make a 'judgement call' on the respective quality h.265 and VP9 can offer. As with h.264, it will take a lot of tuning to get the most out of each specification, and it will be years before we see the 'real' capacity of these codecs. Interesting times ahead!

    Quote Originally Posted by jethro View Post
    lol, who is objective. You link to one test, with very specific content, which is actually the best case for VP9 vs x264.
    I don't know whether this is the best case for VP9 vs x264 (do you have anything to back this up?), but you tried to pass off VP9 as inferior to h.264 and I provided an example of VP9 being superior; that doesn't mean it wins in all tests.

    However, I fully expect VP9 to end up beating h.264 in all full-HD-and-above uses once it matures, with larger gains as resolution increases; that is to be expected, though, as VP9 (like h.265) is aimed directly at such high resolutions.

    Quote Originally Posted by jethro View Post
    They sit on VP8 for so long and it still was barely usable.
    Barely usable? I have compared quite a few YouTube videos encoded in both VP8 and h.264, and you would be hard pressed to tell any difference when watching.

    Yes, I'm fully aware that YouTube quality is nowhere near what h.264 can offer, but it is very representative of the quality you'd expect from streaming video on the web, which is, after all, what both VP8 and VP9 are aimed at.

    So your 'barely usable' nonsense comes across as pure bs. You don't have to beat the best available to be 'usable'.

    Quote Originally Posted by jethro View Post
    unless Google starts to be serious, VP9 will be always slow, hard to use, and unsupported.
    VP9 is obviously 'serious'; in fact, their whole webm effort has obviously been serious.

    'Slow' is a matter of optimization, which is exactly what the VP9 encoder and decoder are going through right now (as the specification is finalized), just as the current crop of h.265 encoders is nowhere near representative of the speed we'll get once the implementations have been heavily optimized.

    Of course both h.265 and VP9 will still end up slower than h.264 given that they use more demanding video compression algorithms.

    Hard to use? How is it hard to use? Compared to what?

    As for support, I assume you mean hardware. VP8 had decent hardware support (basically Android devices, and there are a 'couple' of those), and given that it costs nothing (licence-wise) to implement VP8 or VP9 hardware support, I can't see any reason for it not to keep increasing.

    If you are a chip maker, adding VP8/VP9 support just makes your chips more attractive at no licence cost whatsoever. And since you likely want your chips to be viable for use in Android devices (and other things Google is cooking up, like Glass etc.), it seems like a no-brainer.

    At the end of the day, no one is forcing you to use VP9, and I'm not certain I'll end up using it for my personal needs, I am however very thankful for the competition it offers which will only help increase the pace of codec development.

    And of course I'm very thankful for the way its existence prevents MPEGLA from 'owning the market' and how it forces them to 'play nice' in terms of licence pricing.

  23. #21
    Member
    Join Date
    May 2008
    Location
    France
    Posts
    78
    Thanks
    436
    Thanked 22 Times in 17 Posts
    Quote Originally Posted by binarysoup View Post
    I read about that MANY years ago and haven't heard anything about it since until you brought it up (is it even in development any longer?), which just goes to prove my case: better quality isn't enough to make a splash in the image codec space.
    http://tieba.baidu.com/p/2333527735
    http://rghost.net/private/47840554/5...176d410b75e930
    http://rghost.net/private/47840473/2...6b37c4f79cefea

  24. The Following User Says Thank You to Mike For This Useful Post:

    Jaff (10th August 2013)

  25. #22
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    437
    Thanks
    1
    Thanked 96 Times in 57 Posts
    Quote Originally Posted by binarysoup View Post
    Oh I'm sure it could make a great basis for an image compression, vp8 showed that a video codec could function very well as the basis for image compression with webp. I was saying that while webp is free and better than jpeg it still has a very small chance of actually replacing jpeg on the web, meanwhile a h.265 based image codec which you proposed has zero chance at all of replacing jpeg on the web due to it being patent/royalty encumbered.
    That depends on the policy of the MPEG/LA. If they decided to release HEVC still image profile under NRAND conditions, where is the problem?
    Quote Originally Posted by binarysoup View Post
    What does this have to do with h.265 as a possible image codec and its chances of replacing jpeg on the web? Again, sites won't switch from jpeg to an image format for which they will have to pay money (probably not even to a format which is free), web browsers won't add support for an image format they have to pay money for, etc.
    Actually, please note that MPEG/JPEG have a joint committee where we try to do exactly that, i.e. propose a profile of HEVC for still image compression. Of course, it's all politics, and I consider backwards compatibility a major issue that is not addressed by this idea, but on the other hand, chips with HEVC on board will become available for cheap money. VPx and WebP, well, I'm not really sure what this is all about. The codecs are driven by "patent prevention", and that's a bad sign. At the same time, nobody really knows *for sure* whether they are royalty free... I believe Nokia is still in court with google on this.
    Quote Originally Posted by binarysoup View Post
    Unfortunately it seems you are one of them crazy fanboys who 'identify with a video codec' so I'm probably wasting my time trying to have an objective discussion. For the rest of us, VP9 and h.265 are strong new video codecs and it will be very interesting to follow the quality their respective implementations will deliver now that the optimization and fine tuning stage has begun.
    Well, defining "quality" is not such an easy thing. It all depends on the envisioned workflow from image accquisition to reproduction. Video quality is something different than still image quality, lower quality on video is usually accepted, but not for still images. If you mind, I'm doing here a lot of quality tests for JPEG. HEVC isn't that bad, actually - it seems better than JPEG 2000 - though WebP has some problems at the "high end" quality.

  26. #23
    Member
    Join Date
    Feb 2012
    Location
    Sweden
    Posts
    59
    Thanks
    4
    Thanked 5 Times in 5 Posts
    Quote Originally Posted by thorfdbg View Post
    That depends on the policy of the MPEG/LA. If they decided to release HEVC still image profile under NRAND conditions, where is the problem?
    I assume that you meant FRAND (fair, reasonable, and non-discriminatory terms). The problem is simple: you still have to pay royalties under those terms, everywhere you want to support that image format. I've already cited the reasons why I don't see a chance of this happening so I won't repeat them.

    Quote Originally Posted by thorfdbg View Post
    Actually, please note that MPEG/JPEG have a joint committee where we try to do exactly that, i.e. propose a profile of HEVC for still image compression.
    Are you really trying to push another patent and royalty encumbered image codec? Have you guys learnt nothing from the JPEG 2000 failure to replace Jpeg? Is this a real life example of doing the exact same thing over and over again while expecting a different result? That is, unless you are just creating this as a niche format targeted at specific end users like photographers?

    Quote Originally Posted by thorfdbg View Post
    The codecs are driven by "patent prevention", and that's a bad sign.
    What do you mean by this statement?

    Quote Originally Posted by thorfdbg View Post
    At the same time, nobody really knows *for sure* whether they are royalty free... I believe Nokia is still in court with google on this.
    Nokia is going to court on web video patents (unless there is a settlement of course); their patents have nothing to do with images over the web. Of course there's no certainty that HEVC or any HEVC based image codec isn't infringing on any patent, particularly given the overall broadness and insane amount of software patents there are out there. In short, anyone can be sued at any time, such is the 'wonderful' world of software patents.

    Quote Originally Posted by thorfdbg View Post
    If you mind, I'm doing here a lot of quality tests for JPEG. HEVC isn't that bad, actually - it seems better than JPEG 2000 - though WebP has some problems at the "high end" quality.
    I'd like to check out those tests, sounds interesting. Do you have a thread here?

  27. #24
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    437
    Thanks
    1
    Thanked 96 Times in 57 Posts
    Quote Originally Posted by binarysoup View Post
    I assume that you meant FRAND (fair, reasonable, and non-discriminatory terms). The problem is simple: you still have to pay royalties under those terms, everywhere you want to support that image format. I've already cited the reasons why I don't see a chance of this happening so I won't repeat them.
    No, I really mean free, not fair. MPEG can be used free of charge for web applications, that was a huge discussion.
    Quote Originally Posted by binarysoup View Post
    Are you really trying to push another patent and royalty encumbered image codec? Have you guys learnt nothing from the JPEG 2000 failure to replace Jpeg? Is this a real life example of doing the exact same thing over and over again while expecting a different result? That is, unless you are just creating this as a niche format targeted at specific end users like photographers?
    Look, first of all, it is not "me" who is trying something. JPEG 2000 is available under royalty free licenses, actually under exactly the same conditions JPEG baseline came under. JPEG was never "licence free"; actually nothing of ISO is. But both formats have a baseline that is available without charge. As already stated, the nice thing about HEVC as a still image codec would be that chips (hardware!) would be there. Whether that is a sufficient critical mass I do not know. Do you call photography plus web a niche market? (Though, actually, that is not the market put forward by the supporters, but photography would also work nicely.)
    Quote Originally Posted by binarysoup View Post
    What do you mean by this statement?
    WebM avoids a lot of useful and proven technology, and instead replaces it with sub-optimal technology just to work around patents. B-Frames, just to name one. Actually, if anyone would do that within ISO, this would be a clear case for anti-trust and might have consequences. Google does not want to standardize.
    Quote Originally Posted by binarysoup View Post
    Nokia is going to court on web video patents (unless there is a settlement of course); their patents have nothing to do with images over the web. Of course there's no certainty that HEVC or any HEVC based image codec isn't infringing on any patent, particularly given the overall broadness and insane amount of software patents there are out there. In short, anyone can be sued at any time, such is the 'wonderful' world of software patents.
    The problem is not *whether* HEVC is covered by any patents. It surely is. However, there is a patent pool, namely MPEG-LA, and some big players behind them. If those players decide that a technology is royalty-free (i.e. "no pay") for a specific application, you're in a pretty good situation, actually, as you can be sure that they will defend their patents.
    Quote Originally Posted by binarysoup View Post
    I'd like to check out those tests, sounds interesting. Do you have a thread here?
    No, JPEG tests are usually done internally, though I do have a couple of publications on that. However, you may want to try this: http://jpegonline.rus.uni-stuttgart.de/index.py (yes, it is really index.py, do not forget this). This is an online and somewhat cut-down version of the JPEG objective test scripts I usually run. Real tests are a bit more extensive and cover more quality indices, but that's at least something you can play with. Sorry, I haven't had the chance yet to include HEVC and JPEG XT there, soon to come, but I'm currently really busy with creating the latter, so please stay tuned.

  28. #25
    Member
    Join Date
    Feb 2012
    Location
    Sweden
    Posts
    59
    Thanks
    4
    Thanked 5 Times in 5 Posts
    Quote Originally Posted by thorfdbg View Post
    No, I really mean free, not fair.
    Well if the MPEGLA would offer a royalty free implementation of HEVC as an image codec it would put it on par with webp in that respect, but have you seen ANY indication of them even being interested in doing so? I sure haven't but you seem to be better involved.

    Quote Originally Posted by thorfdbg View Post
    MPEG can be used free of charge for web applications, that was a huge discussion.
    What does this statement relate to?

    Quote Originally Posted by thorfdbg View Post
    Do you call photography plus web a niche market?
    Again, as we've seen with webp, even a totally free (complete with open source permissive implementation) image codec isn't yet making any waves on the web, so no, I don't see the 'plus web' of your argument as remotely realistic. That includes the potential of hardware image decoding, as I'm certain that the image decoding in current 'devices' is very optimized for the appropriate CPUs.

    This is based upon my own 'anecdotal evidence' though, I haven't seen any actual data on this.

    Quote Originally Posted by thorfdbg View Post
    Actually, if anyone would do that within ISO, this would be a clear case for anti-trust and might have consequences.
    Wait, isn't this what the JPEG group claim it does, that they only accept baseline 'features' for which the patent holder gives a 'free of charge' licence, and if not they will look elsewhere?

    And how on earth could choosing to not use patented technology for a standard be a clear case for 'anti-trust'?

    Quote Originally Posted by thorfdbg View Post
    Google does not want to standardize.
    Actually they seem very interested in standardizing on a royalty free codec (w3c, web rtc), which not surprisingly MPEGLA aren't that keen on as they want to collect royalties.

    Quote Originally Posted by thorfdbg View Post
    No, JPEG tests are usually done internally, though I do have a couple of publications on that. However, you may want to try this: http://jpegonline.rus.uni-stuttgart.de/index.py
    Interesting, is there any place where I can download the source (.ppm) image files to do my own tests?

  29. #26
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    437
    Thanks
    1
    Thanked 96 Times in 57 Posts
    Quote Originally Posted by binarysoup View Post
    Well if the MPEGLA would offer a royalty free implementation of HEVC as an image codec it would put it on par with webp in that respect, but have you seen ANY indication of them even being interested in doing so? I sure haven't but you seem to be better involved.
    It actually doesn't look so bad, but this depends on a lot of players and I cannot read their mind.
    Quote Originally Posted by binarysoup View Post
    What does this statement relate to?
    That it did happen in the past that MPEG-LA provided royalty free licenses if enough pressure was there.
    Quote Originally Posted by binarysoup View Post
    Again, as we've seen with webp, even a totally free (complete with open source permissive implementation) image codec isn't yet making any waves on the web, so no, I don't see the 'plus web' of your argument as remotely realistic. That includes the potential of hardware image decoding, as I'm certain that the image decoding in current 'devices' is very optimized for the appropriate CPUs.
    The success of a codec depends on many factors, and WebP does not address them: Point 1 is that there are no vendors except google. There are no hardware manufacturers jumping on it, there are only few applications that depend on it, there is no toolchain to work with these images. That's exactly why JPEG XT is designed as it is: Backwards compatibility. For HEVC, we have more than a single player behind it (an industry consortium), and there are hardware manufacturers (camcorders) and a tool chain (video editing). That is, it is a known set of tools with an existing toolchain and existing applications. HEVC and JPEG XT have an ecosystem to work within. WebP, JPEG 2000 and JPEG XR had none, they had to establish one, which is the hardest of all steps. JPEG 2000 has now an ecosystem in DCI and medical. WebP and JPEG XR have nothing at all.
    Quote Originally Posted by binarysoup View Post
    Wait, isn't this what the JPEG group claim it does, that they only accept baseline 'features' for which the patent holder gives a 'free of charge' licence, and if not they will look elsewhere?
    No, not at all. There is a difference between vendors agreeing that they will release their base technology under royalty free terms, and a committee actively working with known patents. If there is a known patent, the ISO policy tells us that we should get such vendors on board rather than to avoid them. Then we can still talk about under which terms and conditions such a tool will be made available in the standard. As image coding is a smaller market, it did happen in the past that vendors considered it better to provide it free of charge and take a share in the market than to insist on RAND conditions. That is, it is a decision of the *vendor* and not of the committee how to release patents.
    Quote Originally Posted by binarysoup View Post
    And how on earth could choosing to not use patented technology for a standard be a clear case for 'anti-trust'?
    You confuse two things: "Patent free" and "Royalty free". One does not exclude the other. In fact, neither JPEG baseline nor JPEG 2000 baseline are patent free (or have been, at the time of writing). They are "royalty free", which is something different.
    Quote Originally Posted by binarysoup View Post
    Actually they seem very interested in standarizing on a royalty free codec (w3c, web rtc), which not surprisingly MPEGLA aren't that keen on as they want to collect royalties.
    If they are interested, why do they not react on JPEG calls? We tried to invite them several times and got no reaction. In fact, google is now like M$ was a couple of years ago: Too big, and not playing nicely.
    Quote Originally Posted by binarysoup View Post
    Interesting, is there any place where I can download the source (.ppm) image files to do my own tests?
    Yes/No/Maybe. The images in this dataset belong to two categories: First of all, popular images everyone knows (boat, lena, cafe, woman, bike...). Unfortunately, even though these images are popular, they are *not* available under a free license. The same goes for example for the Kodak test image set. DO NOT USE THESE IMAGES, AND DO NOT RELEASE THEM, the legal situation is complicated. It is just that rights holders haven't tried to enforce their rights so far, or rather gave up enforcing their rights (like playboy on lena). If you know where to look, you'll find these images, but I cannot provide them for this matter. We must be pretty careful these days what to release and what not. Then, there is a second set from JPEG AIC testing where we *do* have releases from the authors, and those images I can make available. Also, test images for JPEG XT will become available (this time, we *do* get things right). Thus, if you're interested, please drop me a mail and I'll compile you a nice image data test bed from our sources, at least for those where I do have releases. My mail is thor at math dot tu dash berlin dot de.

  30. #27
    Member
    Join Date
    Feb 2012
    Location
    Sweden
    Posts
    59
    Thanks
    4
    Thanked 5 Times in 5 Posts
    Quote Originally Posted by thorfdbg View Post
    It actually doesn't look so bad, but this depends on a lot of players and I cannot read their mind.
    Do you have anything public that you can point me to?

    Quote Originally Posted by thorfdbg View Post
    That it did happen in the past that MPEG-LA provided royalty free licenses if enough pressure was there.
    Are you referring to the 'free to distribute h.264 video on the web if users don't have to pay (which includes being subjected to ads)' terms? Or something else?

    Quote Originally Posted by thorfdbg View Post
    Point 1 is that there are no vendors except google.
    Vendors? It's an open source royalty free image codec, why would you need 'vendors'?

    Quote Originally Posted by thorfdbg View Post
    There are no hardware manufacturers jumping on it
    Again, as I've stated I'm very unconvinced that hardware accelerated image decoding is a factor, and certainly webp could be hardware accelerated in Android devices which have hardware support for VP8 (and later VP9) just as with HEVC.

    Quote Originally Posted by thorfdbg View Post
    there is no toolchain to work with these images
    Are you talking about third party support? As it is a royalty free, fully open source, permissively licenced implementation, it's no problem to include webp support anywhere; it's just that there's no apparent demand. Webp of course does come with tools to encode/decode/convert webp format images.
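    As a small aside on the format itself (not from the post above, and the function name is my own): the WebP file format is a plain RIFF container, with the FourCC "WEBP" following the 4-byte chunk size, so a minimal format sniffer for it needs nothing beyond the standard library:

    ```python
    import struct

    def is_webp(data: bytes) -> bool:
        """Check the 12-byte RIFF header: b'RIFF' + <u32 little-endian size> + b'WEBP'."""
        if len(data) < 12:
            return False
        riff, _size, fourcc = struct.unpack("<4sI4s", data[:12])
        return riff == b"RIFF" and fourcc == b"WEBP"

    # A fabricated header just for illustration (size field zeroed):
    header = b"RIFF" + struct.pack("<I", 0) + b"WEBP"
    print(is_webp(header))                # True
    print(is_webp(b"\x89PNG\r\n\x1a\n"))  # False
    ```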

    Quote Originally Posted by thorfdbg View Post
    For HEVC, we have more than a single player behind it (an industry consortium)
    And they must all agree (at least those whose patents are being used) to licence a potential HEVC based image codec under a royalty free licence; let's just say I'm not holding my breath. Still, it would be great if it actually happened.

    Quote Originally Posted by thorfdbg View Post
    and there are hardware manufacturers (camcorders) and a tool chain (video editing). That is, it is a known set of tools with an existing toolchain and existing applications.
    Camcorders and video editing? Not sure how this paves the way for replacing jpeg on the web and desktop.

    Quote Originally Posted by thorfdbg View Post
    JPEG 2000 has now an ecosystem in DCI and medical. WebP and JPEG XR have nothing at all.
    JPEG 2000 has been around for what, 10-12 years? Webp has been around for 2 years or so, and has at least seen attempts at web utilization through facebook. But yes, I agree that breaking in to the image 'ecosystem' is notoriously difficult, even without the baggage of royalties and/or looming submarine patents.

    Quote Originally Posted by thorfdbg View Post
    One does not exclude the other. In fact, neither JPEG baseline nor JPEG 2000 baseline are patent free (or have been, at the time of writing). They are "royalty free", which is something different.
    Well obviously the problem with b-frames is not that it's patented, but that it's not royalty free. If it was under a royalty free licence then obviously Google would have no problem using that technology, so pedantry aside, how could anyone be charged with 'anti-trust' for deciding not to use royalty-encumbered (there!) patented technology?

    Quote Originally Posted by thorfdbg View Post
    If they are interested, why do they not react on JPEG calls? We tried to invite them several times and got no reaction.
    Invites to what end? Make webp a JPEG ISO standard, or to have them pool their resources into the next of many JPEG group attempts at creating a jpeg successor? Sounds as if your views on webp are somewhat clouded by Google ignoring you?

    I think it's important to focus only on the practical and technical details, like: is it royalty free, is there an open source implementation for easy adoption (not talking reference encoder/decoder here, but production quality code), is it substantially better than its predecessor (as in, is it worth it in quality per bit), is there the potential for uptake (a real demand, interest in supporting it).

    These are in my opinion core points for 'success'. Webp currently does well on the first three points, but as for the fourth (demand, support interest) it's just not there at the moment and I doubt it ever will be (we've seen many 'jpeg successors' come with fanfare and then fade into oblivion). I don't think a HEVC based image codec will do better here either, but that remains to be seen (if one actually materializes).

    Jpeg as I see it is simply 'good enough' in terms of quality / size for the vast amount of use out there, and replacing it will take a much greater effort than introducing a new/updated video format.

    Quote Originally Posted by thorfdbg View Post
    Then, there is a second set from JPEG AIC testing where we *do* have releases from the authors, and those images I can make available. Also, test images for JPEG XT will become available (this time, we *do* get things right). Thus, if you're interested, please drop me a mail and I'll compile you a nice image data test bed from our sources, at least for those where I do have releases. My mail is thor at math dot tu dash berlin dot de.
    Glad to hear that there is an image test suite that is freely distributable, I'll send you and e-mail when I'm at my home machine, thanks.

  31. #28
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    437
    Thanks
    1
    Thanked 96 Times in 57 Posts
    Quote Originally Posted by binarysoup View Post
    Do you have anything public that you can point me to?
    Of course not. That would be a matter for a press release of MPEG, but plans aren't yet advanced enough to allow anyone to make a statement.
    Quote Originally Posted by binarysoup View Post
    Are you referring to the 'free to distribute if h.264 video on web if users don't have to pay (which includes being subjected to ads)' ? Or something else?
    To that, yes.
    Quote Originally Posted by binarysoup View Post
    Vendors? It's an open source royalty free image codec, why would you need 'vendors'?
    Because the world doesn't work like open source. If you want to make a codec a success, you need industry to develop products around it.
    Quote Originally Posted by binarysoup View Post
    Again, as I've stated I'm very unconvinced that hardware accelerated image decoding is a factor, and certainly webp could be hardware accelerated in Android devices which have hardware support for VP8 (and later VP9) just as with HEVC.
    It is not about "hardware accelerating". It is about having products. That is, something an end user can make use of. In the end, the user of a phone, a computer or a web browser does not care which technology is inside. He wants to see a picture of grandmum. Or whatever. Whether that's WebP or JPEG does not matter. Whether "it works" does matter. And "working" requires an existing ecosystem.
    Quote Originally Posted by binarysoup View Post
    Are you talking about third party support? As it is a royalty free, fully open source, permissively licenced implementation, it's no problem to include webp support anywhere; it's just that there's no apparent demand. Webp of course does come with tools to encode/decode/convert webp format images.
    You seem to believe that something just needs to be open source and permissively licenced to be a success. It is certainly a factor of importance (and a reason why I'm pushing so hard to get XT sources freely available on the web), but it is not as important as you would think it is. See above: If there aren't any big players behind it, it can be as "open" as you wish, and things will be forgotten. It requires a critical mass of applications and products to make a codec a success. It requires a transition strategy from today's technology to the new technology to allow customers to adjust.
    Quote Originally Posted by binarysoup View Post
    And they must all agree (at least those whose patents are being used) to licence a potential HEVC based image codec under a royalty free licence; let's just say I'm not holding my breath. Still, it would be great if it actually happened.
    I'm not MPEG. I'm next door. I can only tell them my point, and it's not that these people are ignorant.
    Quote Originally Posted by binarysoup View Post
    JPEG 2000 has been around for what, 10-12 years? Webp has been around for 2 years or so, and has at least seen attempts at web utilization through facebook. But yes, I agree that breaking in to the image 'ecosystem' is notoriously difficult, even without the baggage of royalties and/or looming submarine patents.
    And see how many people complain about that? The problem with WebP is exactly that: it does not offer a smooth transition to a new technology.
    Quote Originally Posted by binarysoup View Post
    Well obviously the problem with b-frames is not that it's patented, but that it's not royalty free. If it was under a royalty free licence then obviously Google would have no problem using that technology, so pedantry aside, how could anyone be charged with 'anti-trust' for deciding not to use royalty-encumbered (there!) patented technology?
    You're going around in circles. The problem for google is that they don't want to pay royalties. This, however, is not an argument for or against a technology in ISO. ISO has rules you need to follow. It's not that the committee isn't open. Everyone can join, including google. They just did not decide to do so. They just want to do their own thing.
    Quote Originally Posted by binarysoup View Post
    Invites to what end? Make webp a JPEG ISO standard, or to have them pool their resources into the next of many JPEG group attempts at creating a jpeg successor? Sounds as if your views on webp are somewhat clouded by Google ignoring you?
    Any of the above; there has been a call out there for years for new image coding technologies. Yes, of course, as standardization for imaging goes, that's the job of SC29WG1 aka JPEG.
    Quote Originally Posted by binarysoup View Post
    I think it's important to focus only on the practical and technical details, like: is it royalty free, is there an open source implementation for easy adoption (not talking reference encoder/decoder here, but production quality code), is it substantially better than its predecessor (as in, is it worth it in quality per bit), is there the potential for uptake (a real demand, interest in supporting it). These are in my opinion core points for 'success'. Webp currently does well on the first three points, but as for the fourth (demand, support interest) it's just not there at the moment and I doubt it ever will be (we've seen many 'jpeg successors' come with fanfare and then fade into oblivion). I don't think a HEVC based image codec will do better here either, but that remains to be seen (if one actually materializes).
    See above. I believe you're wrong. MPEG haven't had free implementations in the past (HEVC was different, the reference implementation is released under the MXM licence, aka BSD license) and, before HEVC times, was not particularly open for the public. Yet, they generated successful standards despite these problems. I don't disagree that you have a point, but it's not as important as you believe it to be. You need to have a critical mass of partners behind an activity to have a success. Whether that's open source or not is a secondary question.
    Quote Originally Posted by binarysoup View Post
    Jpeg as I see it is simply 'good enough' in terms of quality / size for the vast amount of use out there, and replacing it will take a much greater effort than introducing a new/updated video format.
    Yes, certainly, and I certainly don't disagree. There are, however, a couple of applications it cannot address, and that's why we extend it. Carefully. In a backwards compatible way, so nothing breaks. That's of utmost importance. HEVC still image might have other uses, but it's up to the market to decide.

  32. #29
    Member
    Join Date
    Feb 2012
    Location
    Sweden
    Posts
    59
    Thanks
    4
    Thanked 5 Times in 5 Posts
    Quote Originally Posted by thorfdbg View Post
    You seem to believe that something just needs to be open source and permissively licenced to be a success. It is certainly a factor of importance (and a reason why I'm pushing so hard to get XT sources freely available on the web), but it is not as important as you would think it is.
    I think that when replacing a royalty free format with a strong open source implementation (jpeg) that is currently a de facto 'standard', you need to at the very least mirror those qualities. So yes, an image data format which isn't accompanied by a permissive open source implementation and isn't royalty free is dead on arrival, in my opinion.

    I obviously don't think that is all that is needed to be a success, which should be easy to discern given what I've written of webp's poor chances of replacing jpeg as a 'web' image format despite being better than jpeg, royalty free, and coming with a source code implementation.

    Quote Originally Posted by thorfdbg View Post
    Because the world doesn't work like open source. If you want to make a codec a success, you need industry to develop products around it.
    Don't know if I buy into the 'world' versus 'open source' tirade. Open source is a part of the 'world', and what makes a success or a failure is mainly down to the existence of a strong demand (something which is currently lacking when it comes to image formats imo), and support. Support is to a large degree a result of strong demand, but it's also a chicken and egg problem.

    Quote Originally Posted by thorfdbg View Post
    He wants to see a picture of grandmum. Or whatever.
    Exactly, which is why jpeg is practically impossible to replace or supersede: it does that. And furthermore he can save it, edit it, send it to someone else, put it on some other device, basically anything, and it will work because it's supported everywhere. I see that you are arguing that a HEVC based image codec has a better chance of 'pushing' that support into platforms, but I can't see why they (MPEGLA) would go through the effort for an image codec they release royalty free.

    Quote Originally Posted by thorfdbg View Post
    The problem with WebP is exactly that: it does not offer a smooth transition to a new technology.
    It offers as smooth a transition as can be expected, given that they can't force support to be written into applications; they can only offer an easy means of doing so (source code, royalty free).

    Quote Originally Posted by thorfdbg View Post
    The problem for google is that they don't want to pay royalties. This, however, is not an argument for or against a technology in ISO. ISO has rules you need to follow.
    Well, obviously they won't implement something for which they need to pay royalties, as they are releasing the codec royalty FREE. Sorry, but your argument doesn't make any sense to me at all. Can you point me to any of these ISO rules which state that you must implement a royalty-laden technology, if it's available and useful for your standard, even when you are in turn releasing your standard royalty free?

    Quote Originally Posted by thorfdbg View Post
    Everyone can join, including google. They just decided not to do so. They just want to do their own thing.
    I fail to see anything wrong with releasing a royalty-free open image format for anyone to use. Why would you need to apply for an ISO standard in order to do so, especially with the arcane 'rules' you've indicated above?

    Quote Originally Posted by thorfdbg View Post
    See above. I believe you're wrong. MPEG hasn't had free implementations in the past (HEVC was different: the reference implementation is released under the MXM licence, aka BSD license) and, before HEVC times, was not particularly open to the public.
    Well, I think you are wrong: times have changed, and people / professionals have come to expect open technology in terms of implementations, in part led by projects like ffmpeg and x264. As such I fully expect that an open source implementation of HEVC will emerge as the 'golden standard' amongst encoders, just as with x264.

    And I'd say the same open source availability criterion holds true for an image format, if not more so, as I'd imagine people are typically more likely to want to access their images from anywhere, from any program, and at any time in the future than they are to have such needs for video. I certainly (anecdotal evidence again) wouldn't convert any of my data to a format which isn't available in open source form and thus easy to port to / support on any platform.

    Quote Originally Posted by thorfdbg View Post
    but it's up to the market to decide.
    Indeed, and I'd certainly welcome an HEVC-based image codec if it were to arrive, as competition is something I always promote. I just hope it doesn't shoot itself in the foot by coming in a proprietary / royalty-bound form.
    Last edited by binarysoup; 4th August 2013 at 12:49.

  33. #30
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    437
    Thanks
    1
    Thanked 96 Times in 57 Posts
    Quote Originally Posted by binarysoup View Post
    I don't know if I buy into the 'world' versus 'open source' tirade. Open source is part of the 'world', and what makes a success or a failure mainly comes down to the existence of strong demand (something which is currently lacking when it comes to image formats, imo) and support. Support is to a large degree a result of strong demand, but it's also a chicken-and-egg problem.


    Exactly, which is why jpeg is practically impossible to replace or supersede: it does that. And furthermore he can save it, edit it, send it to someone else, put it on some other device, do basically anything, and it will work because it's supported everywhere.
    Something WebP does not offer. There is no tool chain and no ecosystem for it.

    Quote Originally Posted by binarysoup View Post
    It offers as smooth a transition as can be expected given that they can't force support to be written into applications, they can only offer an easy means of doing so (source code, royalty free).
    That is *not* easy. You believe it's easy because you can program. Most people can't, and don't care. Either the OS and the applications they use support it, or it's useless.

    Quote Originally Posted by binarysoup View Post
    Well obviously they won't implement something for which they need to pay royalties as they are releasing the codec royalty FREE.
    You still don't understand how this works, do you? No, if you participate in a standards activity, you do not have to pay for a competitor's technology to be part of the team. You offer your technology, and your technology cannot be refused on the grounds that you request royalties for it. The output of this is a standard document which provides several options of what can be done, and how the format should be interpreted. Up to this point, nobody has to pay anyone anything, and in fact, ISO does not regulate whether the technologies added to their standards are royalty free or not. They do care about whether the technology is available under FRAND conditions, but that's it. That is, ISO does not get access or a license to the technology (and yes, MPEG-LA *is not* part of ISO, but an independent company for exactly that reason; it's not ISO's business to manage patent questions).

    If you implement a technology for use in your products, this is the point where you need to check whether you have to make use of a competitor's technology, or choose not to. Some technology is essential for a standard, other technology is optional. It depends on how the standard is designed whether something is essential or not. For example, in JPEG, arithmetic coding was optional. Use it and pay royalties (at least back then, you had to), or don't use it and be fine. The market has made its decision. That's fine with me; I'm not a friend of patented technology. But it's not ISO's job to care about this.

    It is up to the standards committee then to decide which parts are optional and which are essential. But what you cannot do is refuse a technology simply on the grounds that it is not royalty free.
    Quote Originally Posted by binarysoup View Post
    I fail to see anything wrong with releasing a royalty free open image format for anyone to use, why would you need to apply for an ISO standard in order to do so, especially with the arcane 'rules' you've indicated above.
    There is certainly nothing wrong with it, except that partners probably want to be paid back for their investment. There is no such thing as a free lunch, you know. There is actually nothing arcane about it either: ISO cannot have a say on something that is not part of their business, and patent handling is not their business. If you select technology *for standardization* on grounds other than technological merit, then company A that refuses a technology from company B on this basis blocks B's market access, namely the advantages of standardization. *That* is where anti-trust comes in. Again, I'm not a lawyer, but you cannot block a competitor's access to a market based on rules that are not part of the business of the process. ISO's business is not in granting access to patents.

    As far as google is concerned, they have (hopefully) a strategy, and this strategy does not include standardization. Nobody can force anyone to join ISO. Whether it makes sense to join or not is part of a marketing strategy. Standards help to develop an ecosystem, and it seems google has decided that they don't want or don't need that. What I personally think about this is not relevant for google. But what they certainly can't do is come to a committee and then block competitors with better or equal technology from contributing to the standard simply on the basis that it is not royalty free. That is, if WebM were to be standardized, then people would make tests, and would have to compare the google technology of "golden keyframes" with b-frames, just to make an example. If b-frames were better, and an ISO ballot opted to replace golden frames with b-frames, then there would be nothing google could do about it. And that is why *I believe* google chose not to standardize. But that's only me.

    Standardization has two sides. On the one hand, you get access to a larger market and it is easier to reach a critical mass. On the other hand, the technology you provide will never come out of a standardization process unchanged, and you have to find ways to negotiate with many people and make compromises. This is lengthy and costly and takes a lot of effort.

    Quote Originally Posted by binarysoup View Post
    I think times have changed and people / professionals have come to expect an open technology in terms of implementations, in part led by projects like ffmpeg and x264. As such I fully expect that an open source implementation of HEVC will emerge as the 'golden standard' amongst encoders just as with x264.
    You again confuse two things: just because ffmpeg or x264 is released under an open source license, these programs do not, again DO NOT, provide you access to the MPEG-LA patent pool. (Or to be fair, I don't actually know. But I doubt that any vendor will provide *you* with an MPEG-LA license, which most likely involves some form of payment for a large user base, without requiring you to make a contribution to this payment.)

    A software license regulates access to the source code, but not the technology. Access to technology is controlled by technology licenses, and these are nailed to patents.

    That is, if a professional uses the free x264 implementation, for example to sell videos or DVDs, he's (likely) still in trouble for not holding a valid license. MPEG-LA license regulations are not unreasonable and allow private use: if you are below a certain count of codestreams distributed this way (don't ask me about the details because I'm really next door and not MPEG), you are fine. But again, you as a private user *DO NOT* hold a license on the technology by having the source code. This is exactly the reason why ISO reference code can never be published under GPL terms: the GPL also claims control of the technology (or rather, requires the author to provide access to the patents, to be precise!). The MPEG MXM license does not do that; it would be a violation of ISO regulations to try to control the *technology licenses*.

    Quote Originally Posted by binarysoup View Post
    And I'd say the same open source availability criteria holds true for an image format, if not more so as I'd imagine people are typically more likely to have images they want to be able to access from anywhere and from any program and at any time in the future than they have video for which they have such needs. I certainly (anecdotal evidence again) wouldn't convert any of my data to a format which isn't available in open source form and thus easy to port / support in any platform.
    That might be you, and probably also includes me, but we're in the minority. But then again, you probably *do* use x264 here and there. Either legally, because you obtained it as part of a product (a DVD recorder, a TV, a software suite), or "tunneling" under the MPEG-LA radar because your use count is under the threshold of what MPEG-LA considers relevant. But at least formally, you would need a license, yes.

    Quote Originally Posted by binarysoup View Post
    Indeed, and I'd certainly welcome a HEVC based image codec if it was to arrive, as competition is something I always promote. I just hope it doesn't shoot itself in the foot by coming in a proprietary/royalty bound form.
    The HEVC reference software comes under the MXM license, which is BSD. It provides you the right to do with the source code whatever you want, including developing products. But it does not provide you with rights to the technology. IOW, if you develop a product, be it from the source code or by yourself, you still need to approach MPEG-LA to get a license for your product.

    Yes, it's hard to keep the two aspects (technology and source code) apart, and I'm certainly not a lawyer who could explain it any better. If you want to know better, ask a lawyer, not me. Whether that's good or bad is another question. I'm just telling you how it works.
