Page 6 of 7 FirstFirst ... 4567 LastLast
Results 151 to 180 of 184

Thread: Is Encode.ru community interested in a new free lossy image codec?

  1. #151
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    Hello,

    I was wondering: could image compression evolve the same way video compression has? For video compression, it is generally accepted that we are heading toward a multi-codec world, where H.264, HEVC, VP9, AOM AV1, and VVC will coexist.

    I know that I am dreaming, but would the same be possible for image compression? For example, Google PIK would be used for high quality, AOM AV1 for very high and extreme compression, and the NHW Project could be used for shooting photos on mobile phones because it is very fast and very good at mid and high compression...

    I know there is a long way to go before the industry shares that opinion, but could image compression follow the path of video compression toward a multi-codec world?

    Cheers,
    Raphael

  2. #152
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    432
    Thanks
    1
    Thanked 94 Times in 55 Posts
    Quote Originally Posted by Jyrki Alakuijala View Post
    Has it been confirmed by the ISO JPEG committee or JPEG XL subcommittee?
    No, probably because this is not what happened. AOM withdrew from the initiative. So what we have now is that its competitors teamed up to implement a joint verification model.

  3. #153
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    432
    Thanks
    1
    Thanked 94 Times in 55 Posts
    Quote Originally Posted by Raphael Canut View Post
    I know that I am dreaming, but would the same be possible for image compression? For example, Google PIK would be used for high quality, AOM AV1 for very high and extreme compression, and the NHW Project could be used for shooting photos on mobile phones because it is very fast and very good at mid and high compression...
    This would be a very inelegant design that would be hard to maintain. It is certainly possible that a codec contains multiple technologies to select from, but one has to find a balance between complexity and performance.

  4. #154
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    > AOM withdrew from the initiative

    It's a pity, because AOM guaranteed a royalty-free codec. Among the other alternatives for JPEG XL, are there BPG (x265) or XVC? If so, do you have a strategy to avoid their patented technologies (CABAC is one example)?

    > It is certainly possible that a codec contains multiple technologies to select from

    Some people don't like this solution either...

    > but one has to find a balance between complexity and performance.

    Do you find that BPG (x265) has a good balance between complexity and performance? Or is it too computationally expensive to encode/decode?

    Cheers,
    Raphael

  5. #155
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    432
    Thanks
    1
    Thanked 94 Times in 55 Posts
    Quote Originally Posted by Raphael Canut View Post
    It's a pity, because AOM guaranteed a royalty-free codec. Among the other alternatives for JPEG XL, are there BPG (x265) or XVC? If so, do you have a strategy to avoid their patented technologies (CABAC is one example)?
    The current candidates are from Google and Cloudinary, and they both prefer free technology. However, the ISO process cannot ensure that the resulting standard is free of third-party patents. WG1 (aka JPEG) is not allowed to select technology based on the patent situation. We can only select technology according to its performance, and we cannot evaluate the validity or completeness of the IP situation. That is, none of us is a legal expert; we cannot estimate to what extent the proposed technology is free from third-party patents, nor can we prevent third parties from contributing to the process.
    Quote Originally Posted by Raphael Canut View Post
    Do you find that BPG (x265) has a good balance between complexity and performance? Or is it too computationally expensive to encode/decode?
    My purely personal opinion is that this goes a bit overboard as far as encoder complexity is concerned, but I have no numbers concerning the complexity of the proposed technology, so I currently cannot judge. The problem with HEVC is that there are many block splittings to try, though if the algorithm is implemented in hardware, it probably does not matter in the end.
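
    As a back-of-the-envelope illustration of the point above (assuming a pure quadtree and ignoring HEVC's additional prediction and transform choices, so this undercounts the real search space), the number of ways to partition a single block grows doubly exponentially with the allowed split depth:

    ```python
    # Count the distinct quadtree partitionings of a single block.
    # A block is either left whole, or split into 4 sub-blocks, each of
    # which can itself be partitioned: f(d) = 1 + f(d-1)^4, with f(0) = 1.
    def quadtree_partitions(depth: int) -> int:
        count = 1  # a block that cannot split further has one partitioning
        for _ in range(depth):
            count = 1 + count ** 4
        return count

    # A 64x64 block split down to 8x8 allows 3 split levels:
    print(quadtree_partitions(3))  # 83522
    ```

    An exhaustive encoder would have to rate-distortion-test each of these partitionings, which is why practical encoders prune the search heavily.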

  6. #156
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    I think Google (PIK?) and Cloudinary are very good candidates, but I must say that I am very surprised that BPG and XVC did not submit. Maybe because they were not royalty-free?...

    The decoder complexity is also a "problem" for x265 (optimized HEVC), for example the NHW Project is at least 15x faster to decode...

    The "disturbing question": do you think AOM will now try to impose its AVIF image compression codec without going through JPEG?

    Cheers,
    Raphael

  7. #157
    Member
    Join Date
    Jun 2015
    Location
    Switzerland
    Posts
    639
    Thanks
    183
    Thanked 235 Times in 142 Posts
    Quote Originally Posted by Raphael Canut View Post
    I think Google (PIK?) and Cloudinary are very good candidates, but I must say that I am very surprised that BPG and XVC did not submit
    I don't know their reasons not to participate and my speculation would be as good as anyone else's.

    The author of BPG is active in image compression and might answer if you ask him. He recently built a modern DCT-based lossy image decompressor for the 25th International Obfuscated C Code Contest (and won). https://bellard.org/ The judges commented, hilariously, on the decompressor: "No specific obfuscation was needed as the algorithms already have a significant complexity."

    Quote Originally Posted by Raphael Canut View Post
    The decoder complexity is also a "problem" for x265 (optimized HEVC), for example the NHW Project is at least 15x faster to decode...
    My perception is that the standardization committees do not care much about decoding speed for generic image compression. From what I saw during WebP, WOFF 2.0, and Brotli development, actual people do care, and a global deployment of a slower algorithm can be more difficult than that of a faster one.

    Quote Originally Posted by Raphael Canut View Post
    The "disturbing question": do you think AOM will now try to impose its AVIF image compression codec without going through JPEG?
    The engineers that I know from the AOM effort are of high technical skill and with high ethics. Many of them have dedicated their lives to free software. They would not attempt to impose something that people don't need. However, they would make sure that their solution is available as an option.

    I believe that the inherent goals of AOM and JPEG XL (PIK and FUIF) development are the same: to be the best free codec. Just what best means is different for each group of developers. Different bitrate sweetspot, different amount of flexibility, different amount of hardware support, different complexity, different kind of artefacts, different HDR support, different psychovisual modeling, different amounts of asymmetry in encoding/decoding, different levels of generation loss, different levels of SIMD support, different software only decode speed, etc. etc.

  8. #158
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    432
    Thanks
    1
    Thanked 94 Times in 55 Posts
    Quote Originally Posted by Raphael Canut View Post
    I think Google (PIK?) and Cloudinary are very good candidates, but I must say that I am very surprised that BPG and XVC did not submit. Maybe because they were not royalty-free?...
    Nobody knows. In fact, WG1 cannot stop them from submitting, even though we expressed our "wish" to make this a royalty-free codec in the tradition of JPEG. That is, if they had submitted, and evaluation had shown that their technology was competitive, we would have had to select them.
    Quote Originally Posted by Raphael Canut View Post
    The "disturbing question": do you think AOM will now try to impose its AVIF image compression codec without going through JPEG?
    I would expect so, yes. It is a bit sad, since having them on board would certainly have helped.

  9. #159
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    432
    Thanks
    1
    Thanked 94 Times in 55 Posts
    Quote Originally Posted by Jyrki Alakuijala View Post
    Just what best means is different for each group of developers. Different bitrate sweetspot, different amount of flexibility, different amount of hardware support, different complexity, different kind of artefacts, different HDR support, different psychovisual modeling, different amounts of asymmetry in encoding/decoding, different levels of generation loss, different levels of SIMD support, different software only decode speed, etc. etc.
    Clarifying requirements is still one of the painful exercises we have to go through. It is currently more of a "shopping list" than anything else, i.e. priorities have to be frozen. One of the take-aways from the discussion in Vancouver was that different people really wanted different things. AOM was very much behind a "video coder with a still-image mode", but this is not quite on our agenda as I see it; that would fit WG11 rather than WG1. Ours is more a "still-image codec" with a nice, simple extension for animation (in the sense of "animated GIF").

  10. #160
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    I don't know what happened with JPEG, but it's true that AOM can be tough in their communication.

    For example, I contacted them about the NHW Project, and I also contacted someone at Xiph.org/Mozilla, which is an AOM founding member. Since the NHW Project also belongs to the free open-source community, I asked them: even if the NHW Project cannot interest AOM, would they have advice about it, how to improve it, what it lacks to become of consideration for AOM, or whether they see a very specific niche for it that they don't cover (it is very fast, for example)? Either I never got any answer, or the only answer I got was: "Sorry, all we can say is that we are not interested in your codec (in its current state)."

    That's tough for me, because without the help of AOM or JPEG I don't see how the NHW Project could find an application... so its future is very much compromised.

    Anyway any help is very welcome!

    Cheers,
    Raphael

  11. #161
    Member
    Join Date
    Dec 2011
    Location
    Cambridge, UK
    Posts
    425
    Thanks
    134
    Thanked 147 Times in 95 Posts
    Quote Originally Posted by thorfdbg View Post
    However, the ISO process cannot ensure that the resulting standard is free of third-party patents. WG1 (aka JPEG) is not allowed to select technology based on the patent situation. We can only select technology according to its performance, and we cannot evaluate the validity or completeness of the IP situation. That is, none of us is a legal expert; we cannot estimate to what extent the proposed technology is free from third-party patents, nor can we prevent third parties from contributing to the process.
    Is this really an ISO ruling or just the local WG1 take on it? From my involvement with MPEG, it appears WG1 ploughs a very different furrow to the rest of ISO and in fact disobeys certain ISO requirements (such as timely publishing of patents, or even acknowledging their existence). It appears almost acceptable in that world to launch a submarine patent! Rather ugly.

    Some standards organisations operate under the legal requirement that all submissions must have patents clearly stated up front and include a penalty clause that failure to do so will automatically grant royalty free access. Some go further and require any stated patents must also be royalty free too. ISO's patent policy is a conscious decision of theirs, but they could if they wish have a different one.

  12. #162
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    432
    Thanks
    1
    Thanked 94 Times in 55 Posts
    Quote Originally Posted by JamesB View Post
    Is this really an ISO ruling or just the local WG1 take on it?
    On what precisely? We can only act on ISO guidelines, not change them, and we surely obey ISO requirements. Of course we cannot, as I said, select technology based on the IP situation. In fact, as you say, we do not even collect IP information. It is up to the proponents to submit IP information to ISO (and not to WG1), where it is recorded. We can only evaluate technology on a purely technical basis. The only thing we can do is express a "wish" for a freely accessible standard, but we can neither enforce nor control it.

    WG1 cannot implement any particular policy to enforce timely patent submission; we are "only a group of friends" and not a legal entity. We can, and do, inform participants to report IP, though. What happens if you have a patent, come to a meeting, and do not report it is what happened to "Compression Labs" aka "Forgent", which wanted to collect money for their 2D-VLC patent on JPEG. It was invalidated based on this "business practice". So submarine patenting invalidates the patent. This is also why we record who participates in meetings.

    As far as MPEG is concerned, please do not confuse MPEG with MPEG-LA. MPEG aka WG11 is a working group of SC29 (same as JPEG). MPEG-LA is a company that handles (some) patents of MPEG participants, but whether an MPEG member is willing to contribute to the MPEG-LA patent pool is something MPEG cannot control; it is an offer to MPEG members. Unfortunately, the whole patent situation around HEVC is rather messy: there are at least three patent pools.
    Quote Originally Posted by JamesB View Post
    Some standards organisations operate under the legal requirement that all submissions must have patents clearly stated up front and include a penalty clause that failure to do so will automatically grant royalty free access.
    So does WG1. Just remember what "upfront" means here: it means before publication of the standard, not before deciding on a particular technology. That is, participants must report to the ISO office, not to the working group, and only after most of the standardization process is done.
    Quote Originally Posted by JamesB View Post
    Some go further and require any stated patents must also be royalty free too.
    That is something ISO regulations do not allow. Some other standardization organizations may enforce this, but ISO can't.
    Quote Originally Posted by JamesB View Post
    ISO's patent policy is a conscious decision of theirs, but they could if they wish have a different one.
    I'm afraid it is quite unlikely that ISO will change its policies. It is a very slow-moving institution.

  13. #163
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    Hello,

    For those interested, I have added a new -l17 "extreme" compression quality setting to the NHW Project. I find this setting competitive with x265 (HEVC). Actually, I have tested -l17 on 15 rather good-quality images, and on these images I prefer the results of the NHW Project.

    We can still save 2.5KB per .nhw compressed file.

    More at: http://nhwcodec.blogspot.com/ .

    I am starting to wonder whether, for extreme compression, I should add a post-processing function in the decoder that would remove some aliasing on some images, just as HEVC intra has post-processing functions like the deblocking and SAO filters.

    As detecting aliasing seems quite difficult, would it be possible to use, for example, machine learning trained to detect aliasing patterns?
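
    For illustration, a toy non-ML version of such a post-filter might look like the sketch below. Everything here (the name `deringing_filter`, the gradient threshold, the 3x3 box blur) is made up for the example and has nothing to do with the actual NHW code; a real deblocking/SAO-style filter is far more careful:

    ```python
    import numpy as np

    def deringing_filter(img, grad_thresh=30.0):
        """Toy sketch: smooth only pixels that look like stair-stepped
        (aliased) diagonal edges.  Purely illustrative."""
        img = img.astype(np.float64)
        # central-difference gradients
        gx = np.zeros_like(img)
        gy = np.zeros_like(img)
        gx[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0
        gy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0
        # diagonal edges have comparable horizontal and vertical gradients
        mag = np.hypot(gx, gy)
        diagonal = (mag > grad_thresh) & (np.abs(np.abs(gx) - np.abs(gy)) < 0.5 * mag)
        # crude smoothing stage: 3x3 box blur
        padded = np.pad(img, 1, mode='edge')
        h, w = img.shape
        blurred = sum(padded[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)) / 9.0
        out = np.where(diagonal, blurred, img)
        return np.clip(out, 0, 255).astype(np.uint8)
    ```

    An ML variant would replace the hand-tuned `diagonal` mask with a small classifier trained on patches labeled aliased/clean, which is essentially the idea raised above.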

    But first, do you find the results of the NHW Project at extreme compression good in their current state? Or should they be post-processed for aliasing and other artifacts?

    Any feedback is very welcome!

    Cheers,
    Raphael
    Attached Files

  14. #164
    Member
    Join Date
    Dec 2011
    Location
    Cambridge, UK
    Posts
    425
    Thanks
    134
    Thanked 147 Times in 95 Posts
    Quote Originally Posted by thorfdbg View Post
    Just remember what "upfront" means here: it means before publication of the standard, not before deciding on a particular technology. That is, participants must report to the ISO office, not to the working group, and only after most of the standardization process is done.
    That's not exactly upfront, given that an ISO process can take a couple of years. I hear what you say: you select based on how well it works, not based on IP. However, if you have multiple competing suggestions all much of a muchness, then picking rather arbitrarily, only to find out the one you picked was the only patented one, would be silly.

    My experience of MPEG-G, primarily an MPEG format, is that the patents were most definitely not submitted to ISO until very late in the process, and only then under duress, well past the time anyone could question whether it was a suitable road to go down. In short, an entirely patent-free format became patented (or at least an attempt was made; the patent is rather weak and unlikely to stand) due to one single inventor who didn't inform anyone else involved in the process. It led to some ill will (and not just from me; I was the messenger, but I had withdrawn from participation well before then).

    If the contents of a patent make a significant difference then, while I still don't agree with software patents, I can understand where ISO comes from, and it makes some sense, I guess. If it covers some teeny tiny corner and is unlikely to be beneficial, then something is broken if that's the sole thing that prevents your format from being open, and in doing so prevents a large sector from adopting it. In such situations, it actually takes away from the efforts of others who gave their work freely.

  15. #165
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    432
    Thanks
    1
    Thanked 94 Times in 55 Posts
    Quote Originally Posted by JamesB View Post
    That's not exactly upfront, given that an ISO process can take a couple of years. I hear what you say: you select based on how well it works, not based on IP. However, if you have multiple competing suggestions all much of a muchness, then picking rather arbitrarily, only to find out the one you picked was the only patented one, would be silly.
    No, it would not be "silly". It would be correct. Do you really believe a technical working group has the ability or knowledge to evaluate the correctness of claimed IP? Should engineers really decide upon IP? Would this mean that an ISO process is driven by lawyers, where we first need to check for a legal evaluation on the correctness of IP? You seem to forget who pays us, the engineers.

    There are three possible ways. Either keep your work proprietary, create great products, and generate income from selling your technology. This happens outside ISO.

    Or, second alternative, you go to ISO, hoping that you get better market access by standardizing. In this strategy you sell products, but you have to accept competitors that profit from your technology, because you opened it by creating a standard around it; in return, you generate additional income by requesting licenses from them.

    Or, third alternative, you give your technology away for free, accept that everyone can use it free of charge, and generate income from products around the technology. If this is your strategy, ISO is not your best partner, because ISO does not allow evaluation of the IP situation. Besides, one may wonder to what extent anyone can: even if company A claims that its work is "IP free", you never know whether it really is, and whether company A then has the power to defend this position is another question.

    You should understand that not every company has the ability to operate under the third model. Google is in the comfortable position of having such a big market share that it can, but not everybody else can. Thus, there is more than a single model for earning money from technology, and which one works for your company depends on your strategy and position in the market. ISO is only one model out of the three.

  16. #166
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    > Or, third alternative, you give your technology away for free

    Sometimes you don't really have the choice...

    For example, from September 2008 to March 2012 the NHW Project was closed source; I was young and had big hopes for my codec. In 2011-2013 the NHW Project was not complete (just the -l2 to -h3 quality settings), but to me it was very promising: I actually found it visually better than Google WebP and Xiph Daala (which were the codecs of that time; HEVC did not exist yet). But the industry totally ignored my work and was absolutely not interested...

    So in March 2012 I open-sourced the NHW Project and gave it away for free, because I was told that it was the only way a company could test my work and maybe hire me to work on the NHW Project...

    7 years later, I realize that this strategy did not work either, but I am happy that people interested in image compression can test my work, with the code, for free.

    Maybe with the recent improvements, some of you could reconsider the NHW Project and offer me a contract?

    Do not hesitate to let me know!

    Cheers,
    Raphael

  17. #167
    Member
    Join Date
    Dec 2011
    Location
    Cambridge, UK
    Posts
    425
    Thanks
    134
    Thanked 147 Times in 95 Posts
    Quote Originally Posted by thorfdbg View Post
    No, it would not be "silly". It would be correct. Do you really believe a technical working group has the ability or knowledge to evaluate the correctness of claimed IP? Should engineers really decide upon IP? Would this mean that an ISO process is driven by lawyers, where we first need to check for a legal evaluation on the correctness of IP? You seem to forget who pays us, the engineers.
    That's not what I said. I don't think ISO should judge the validity of a patent. I said that in a close tie, it would be silly to go with the more encumbered option. ISO *should know* which ones are patented, if people play ball, because registering them is a requirement. The patent policy states: "In this context, the words 'from the outset' imply that such information should be disclosed as early as possible during the development". When all is said and done, though, the ISO patent system is a recommendation and not binding. When it comes down to disputes, they devolve it to the working group in question, and that leads to different tolerances and even to completely different interpretations. Hence MPEG is very patent-heavy and accepting of the situation, while in some other committees it would be unheard of. Frankly, ISO feels like a dinosaur when it comes to computing issues, compared to the IETF, W3C, etc.

    Anyway, sorry to hijack this thread a bit. Raphael, at this point, if you can't get in with the formal standards processes (either due to lack of affiliation with a national standards organisation or simply bad timing), then I applaud your strategy of making it open instead. This may still lead to parts of your technology being investigated and maybe ending up in some product down the line. I'd recommend choosing an appropriate license, though, to ensure your work has to be recognised.

  18. #168
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    @JamesB, thank you for your support Sir.

    In fact, I think the main problem of the NHW Project is that it is not adapted to arbitrary image sizes. Recently, JPEG rejected the NHW Project for this reason, and I think AOM did too, because they told me that they were not interested in the NHW Project in its current state.

    When I open-sourced the NHW Project in March 2012, I planned to adapt it to any image size, but I became more and more depressed (due to many problems), and I could not really work on the NHW Project from August 2013 until February 2018, when I was able to start working on it again.

    However, I realize that to adapt the NHW Project to any image size, I would really prefer to do it under a contract with an interested company...

    Cheers,
    Raphael

  19. #169
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    432
    Thanks
    1
    Thanked 94 Times in 55 Posts
    Quote Originally Posted by JamesB View Post
    That's not what I said. I don't think ISO should judge the validity of a patent. I said that in a close tie, it would be silly to go with the more encumbered option.
    ISO does know and record IPs, upfront of publication of the standard; the technical experts don't. How "close" does a tie have to be to allow IP to be considered in the decision? How do we evaluate "closeness"? What happens if members disagree? ISO uses a "consensus" model, that is, all participants need to be fine with a decision. Now consider the situation that we have two almost equal technologies, one with IP and one without. Would you believe that the participant bringing in the IP would agree to any particular definition of "closeness" that would rule out its technology? Look, this opens a can of worms, and ISO does not want to open it. They don't operate this way. Other standards organizations have other rules, and hence enforce other business models on their contributors.

  20. #170
    Member
    Join Date
    Aug 2018
    Location
    India
    Posts
    6
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Have a look at https://miniimagesvideos.com . This is not an entirely new codec; however, we have found the existing SSIM mechanism to be very effective. Psychovisual comparisons are too slow. Are any codecs production-ready for such comparisons?

  21. #171
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    Hello,

    I have taken a look at the demo page. Just out of curiosity, do you do what PackJPG, Lepton, or paq8px do with JPEGs, that is, recompress the DCT coefficients with advanced context modeling and arithmetic coding?

    What do you mean by "production ready" for "psychovisual comparison"?

    Just a remark: the NHW Project has very bad PSNR and SSIM results, but I actually find it can be visually very good, so in a sense it is optimized for psychovisual comparison...
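
    For reference, the two objective metrics under discussion can be sketched as below. Note that `global_ssim` computes SSIM over one whole-image window; the standard metric averages over small local windows, so real SSIM tools will report different numbers. This is an illustrative sketch, not any tool's actual implementation:

    ```python
    import numpy as np

    def psnr(a, b, peak=255.0):
        """Peak signal-to-noise ratio in dB between two images."""
        mse = np.mean((a.astype(np.float64) - b.astype(np.float64)) ** 2)
        return float('inf') if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

    def global_ssim(a, b, peak=255.0):
        """Single-window SSIM over the whole image (simplified)."""
        a = a.astype(np.float64)
        b = b.astype(np.float64)
        c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2  # standard constants
        mu_a, mu_b = a.mean(), b.mean()
        var_a, var_b = a.var(), b.var()
        cov = ((a - mu_a) * (b - mu_b)).mean()
        return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / \
               ((mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2))
    ```

    Both metrics reward pixel-wise fidelity, which is exactly why a codec tuned for visual appearance can score badly on them while still looking good.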

    Cheers,
    Raphael

  22. #172
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    In fact, I guess you're not recompressing the DCT coefficients with advanced context modeling and arithmetic coding, since your .jpg files can be decompressed by a standard JPEG decoder.

    So you're optimizing JPEGs with SSIM? Could you give us more details? Have you compared with Guetzli and MozJPEG?

    Cheers,
    Raphael

  23. #173
    Member
    Join Date
    Jun 2015
    Location
    Switzerland
    Posts
    639
    Thanks
    183
    Thanked 235 Times in 142 Posts
    Quote Originally Posted by Raphael Canut View Post
    So in March 2012 I open-sourced the NHW Project and gave it away for free, because I was told that it was the only way a company could test my work and maybe hire me to work on the NHW Project...

    7 years later, I realize that this strategy did not work either, but I am happy that people interested in image compression can test my work, with the code, for free
    It also took 7 years for WebP to be approved by Mozilla and Microsoft.

    Which license did you opensource with? Perhaps it is a license issue...

  24. The Following User Says Thank You to Jyrki Alakuijala For This Useful Post:

    compgt (10th February 2019)

  25. #174
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    > Also took 7 years for WebP to be approved by Mozilla and Microsoft.

    Yes, that's very long... I realize that if Mozilla and Microsoft were to study the NHW Project in 2019 (which would be so great), it would take until 2026 for them to approve it?

    Also, don't you think the approval of WebP (based on VP8) was slowed down by the development, since 2013, of HEVC and a little later VP9, i.e. the evolution of the classic intra prediction + residual coding scheme with, for example, more intra direction modes, larger block sizes (up to 128x128), much more block splitting/partitioning, and improved context modeling and arithmetic coding?

    Because today AVIF is twice as good as WebP, and I was able to test BPG (x265), and it's true that it is better than WebP...

    > Which license did you opensource with?

    I used a BSD license. Did I do it right?

    Cheers,
    Raphael

  26. #175
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    Hello,

    I haven't worked on the -l18 quality setting this week, but I have retested the -l17 "extreme" compression quality setting of the NHW Project, and in fact I am contacting you because I find it surprisingly good on rather good-quality images!

    I don't know if my eyes play tricks on me, but I have tested the -l17 "extreme" compression quality setting on 30 rather good-quality images, and I prefer the results of the NHW Project compared to BPG (x265 HEVC). I adjusted the .bpg files to be 1.5-1.8KB smaller than the .nhw files, even though in theory we can save 2.5KB on average per .nhw file at the -l17 setting.

    Jyrki, Pascal (WebP), thorfdbg (JPEG), Yann: could you find time to test the NHW Project and share your opinion? It would be really great news if the NHW Project were better than HEVC at extreme compression, as it is furthermore royalty-free and a lot faster to encode/decode. If so, could your company or organization reconsider the NHW Project under these conditions?

    Do you also think it is worth recontacting the Alliance for Open Media, now that I have added efficient very high and extreme compression?

    Any advice is very welcome!

    Cheers,
    Raphael

  27. #176
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    Hello,

    For those interested, I have added a -l18 "extreme" compression quality setting to the NHW Project. Actually, I find this setting very good and extremely competitive with HEVC.

    Compression of the wavelet DC image is really less efficient at extreme compression, and we can really save 2.5KB per .nhw compressed file on average (this is confirmed when recompressing the .nhw files with a compressor such as paq8px or others).
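
    That recompression check can be sketched like this, using Python's built-in lzma as a stand-in for paq8px (a context-mixing compressor would typically find more savings, so this is a lower bound on the redundancy):

    ```python
    import lzma

    def residual_redundancy(path):
        """Return how many bytes a general-purpose compressor can still
        shave off an already-compressed file.  A large positive value
        suggests the format's own entropy coding leaves bits on the table."""
        with open(path, 'rb') as f:
            data = f.read()
        recompressed = lzma.compress(data, preset=9)
        return len(data) - len(recompressed)
    ```

    A well-packed bitstream should come out near zero (or slightly negative, due to container overhead on incompressible data).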

    I have tested the -l18 "extreme" compression quality setting on 30 rather good-quality images, and on these images I clearly prefer the results of the NHW Project compared to x265 (HEVC)!

    More at: http://nhwcodec.blogspot.com/

    Any feedback is very welcome!

    Cheers,
    Raphael
    Attached Files

  28. #177
    Member
    Join Date
    Jun 2015
    Location
    Switzerland
    Posts
    639
    Thanks
    183
    Thanked 235 Times in 142 Posts
    Quote Originally Posted by Raphael Canut View Post
    Any feedback is very welcome!
    When I look at your website, the example images seem to be from 2012. It is not clear which example images, if any, were created with a modern version of the codec.

  29. #178
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    Hello Jyrki,

    Yes, that's totally right! Many thanks for the suggestion!

    So I have updated the example images on my website with ones compressed with the new, latest version 0.1.6 (and not a June 2012 version...) at the -l7 quality setting.

    Cheers,
    Raphael

  30. #179
    Member
    Join Date
    Nov 2011
    Location
    france
    Posts
    38
    Thanks
    2
    Thanked 26 Times in 18 Posts
    Hi,
    some visible artifacts:
    [Attached image: Lighthouse1DEC.png, 380.6 KB]

    hope it helps,
    skal/

  31. #180
    Member
    Join Date
    Aug 2018
    Location
    France
    Posts
    100
    Thanks
    7
    Thanked 5 Times in 4 Posts
    Hello Skal,

    Yes, the NHW Project can have discoloration and aliasing (on edges) as artifacts, and I must say that the NHW Project actually performs very poorly on this image. But despite these artifacts, the NHW Project has good neatness on the other hand, which can be visually more pleasant on many images according to my tests.

    Are these discoloration and aliasing artifacts disqualifying for you? Would it be impossible for you, for example, to release and support a codec with such artifacts?

    Many thanks!
    Cheers,
    Raphael


