
Thread: Precomp 0.4.3

  1. #1
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    539
    Thanks
    192
    Thanked 174 Times in 81 Posts

    Precomp 0.4.3

    Precomp 0.4.3 is out. This release improves stability and makes PCF files fully compatible between Linux and Windows versions.

    List of changes:
    • Linux: Added JPG recompression using PackJPG, full compatibility of Windows and Linux PCF files
    • Linux: zLib routines are now as fast as under Windows
    • Windows: Locked files can be read now, multiple Precomp instances can work on the same file
    • Windows: Static zLib and PackJPG linking (no DLLs needed)
    • Major GIF rewrite, Precomp now recompresses most GIF files completely
    • New switch -n for conversion of PCF files (bZip2 <-> no compression)
    • Compression switch -c uses n (none) instead of - now
    • Improved timing behaviour and activity indicator updates
    • Updated to PackJPG 2.5, this fixes JPG crashes
    • JPG misdetections are now visible in statistics
    • Filenames are stored in their original case now to improve platform compatibility
    • Fixed a bug with GIF files at positions above 2 GB
    • Fixed a bug that led to unnecessary data in PCF files
    • Fixed a bug in Base64 routines that froze Precomp


    Have a look at http://schnaader.info/precomp.php
    http://schnaader.info
    Damn kids. They're all alike.

  2. #2

  3. #3
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    539
    Thanks
    192
    Thanked 174 Times in 81 Posts
    Thanks, fixed. File names are consistent now (precomp_v04.zip, precomp_v041.zip, precomp_v042.zip).
    http://schnaader.info
    Damn kids. They're all alike.

  4. #4
    Member Skymmer's Avatar
    Join Date
    Mar 2009
    Location
    Russia
    Posts
    681
    Thanks
    37
    Thanked 168 Times in 84 Posts
    Aha! Nice to see it. I will test it tonight on one problematic file which previously caused a lot of JPEG-related crashes and incorrect GIF treatment (-1 size reporting, 4 GB temp file creation).
    Schnaader, thanks for update!

  5. #5
    Tester
    Stephan Busch's Avatar
    Join Date
    May 2008
    Location
    Bremen, Germany
    Posts
    872
    Thanks
    457
    Thanked 175 Times in 85 Posts
    Thanks for the new version. It seems to be faster than previous versions, and I like the idea that all libraries are attached to the executable.

    So far no errors occurred during test runs.
    Just wondering why it doesn't see the deflated content of WISE installers.
    Last edited by Stephan Busch; 2nd September 2012 at 01:14.

  6. #6
    Programmer schnaader's Avatar
    Join Date
    May 2008
    Location
    Hessen, Germany
    Posts
    539
    Thanks
    192
    Thanked 174 Times in 81 Posts
    Quote Originally Posted by Stephan Busch View Post
    And I like the idea that all libraries are attached to the executable.
    Yes, it simplifies things. The only downside I see is that the executables get bloated (Windows versions are 1 MB each, though UPX can compress them to 300 KB), but I could start releasing additional "slim" versions (e.g. deflate only, without bZip2 and packJPG) if it gets too extreme.

    A strange thing I noticed is that my (Windows) version of UPX won't compress the Linux executable. It says it needs 6 passes and stops with the message "premature end of file" after the third one. Also, the Linux version is only 430 KB. As JPG recompression works under Linux and the compiler settings are the same, I suspect the Windows version is larger than needed, perhaps containing some unnecessary debug information or the like.

    Quote Originally Posted by Stephan Busch View Post
    Just wondering why it doesn't see the deflated content of WISE installers.
    Looking at one of the WISE installers (WISE copernicagentbasic.exe, 3.546.360 bytes) with rawdet from Shelwien's reflate shows the streams we're looking for. The first one is:

    Code:
    beg=00004D2F last=1 type=2 size=3336 unplen=16824
    end=000060F0 bufbeg=00000004 bufend=00000000
    It detects 46 of those, ranging from 479 to 103680 bytes decompressed. Precomp detects them too, but only in brute mode, and they are difficult to spot among all the additional misdetected streams:

    Code:
    (0.56%) Possible zLib-Stream (brute mode) found at position 19759, windowbits = 15
    Can be decompressed to 16824 bytes
    No matches
    [...]
    New size: 3548304 instead of 3546360
    Time: 10 hour(s), 59 minute(s), 18 second(s)
    [...]
    Brute mode streams: 9/1195
    [...]
    -zl49,53 -d1
    In addition to the fact that I need to catch up with reflate's diff mechanism, which allows recompressing every detected stream (the GIF rewrite was kind of a training for this, though it's almost trivial compared to the complexity of deflate), I see a need for better (and especially faster!) detection of raw streams - rawdet takes some seconds to show the list of streams, while Precomp takes 10 hours for the complete run. A first step will be an "analysis only" brute mode that doesn't try to recompress, but only lists decompressed sizes above a certain threshold. After that, I'll have a look at changing the detection code for intense and brute mode, sorting out many misdetected streams while trying not to lose small valid streams. The goal, of course, is implementing complete deflate recompression similar to reflate.
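    For anyone curious what such a brute-force scan boils down to, here is a rough sketch in Python (the function name and size threshold are my own; real detectors like rawdet are far more optimized than this O(n²) loop):

```python
import zlib

def find_deflate_streams(data, min_size=256):
    """Brute-force scan (rough sketch of what a brute-mode detector
    does): try to inflate a raw deflate stream at every byte offset
    and keep complete streams above a decompressed-size threshold."""
    hits = []
    for pos in range(len(data)):
        d = zlib.decompressobj(-15)  # raw deflate, 32 KB window
        try:
            out = d.decompress(data[pos:])
        except zlib.error:
            continue
        # only count streams that reached their end-of-stream marker
        if d.eof and len(out) >= min_size:
            hits.append((pos, len(out)))
    return hits
```

    The size threshold is exactly the kind of filter an "analysis only" mode would use to hide the sea of tiny misdetections.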
    Last edited by schnaader; 3rd September 2012 at 00:29.
    http://schnaader.info
    Damn kids. They're all alike.

  7. #7
    Administrator Shelwien's Avatar
    Join Date
    May 2008
    Location
    Kharkov, Ukraine
    Posts
    3,134
    Thanks
    179
    Thanked 921 Times in 469 Posts
    There's still more after the current reflate though - it looks like some kind of backward scanning is required as a workaround
    for streams with a weird first block (a too-short type-1 block, or a compressible type-0/1 block), and also a way to handle incomplete
    blocks. Originally I thought it's OK not to recompress broken blocks (because deflate blocks are usually short anyway),
    but there are types of data where broken blocks appear frequently (e.g. VM images with a fragmented filesystem).
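    For reference, the block types mentioned here come from the 3-bit header at the start of every deflate block. A minimal Python sketch (hypothetical helper name) to peek at the first block of a raw deflate stream:

```python
import zlib

def first_block_header(raw):
    """Return (BFINAL, BTYPE) of the first deflate block.
    BTYPE 0 = stored, 1 = fixed Huffman, 2 = dynamic Huffman.
    Deflate packs bits LSB-first, so the header is the lowest
    three bits of the first byte."""
    bfinal = raw[0] & 1
    btype = (raw[0] >> 1) & 3
    return bfinal, btype
```

    A detector that sees an unusual first header (e.g. a short fixed-Huffman block in front of the "real" stream) is exactly the case where scanning backward from a known-good block could help.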

  8. #8
    Member
    Join Date
    Sep 2010
    Location
    Australia
    Posts
    46
    Thanks
    0
    Thanked 0 Times in 0 Posts
    How can I use this with FreeArc, Precomp and SREP in one go?
    The following results in nothing:

    bat script
    del /q packeddata.rar
    arc.exe a -ep1 -dses --dirs -s; -lc- -di -i2 -r -mprecomp+lzma:a1:mfbt4:d16m:fb128:mc10000:lc8 6_pc_shaders.rar packeddata\*
    pause

    arc.ini file

    [External compressor:precomp]
    header = 0
    mem = 10
    packcmd = precomp -slow {options} -o$$arcpackedfile$$.tmp $$arcdatafile$$.tmp


    [External compressor:srep]
    ;options = l%d (minimal match length, default=512)
    header = 0
    packcmd = srep {options} -a1 -m3f $$arcdatafile$$.tmp $$arcpackedfile$$.tmp
    Last edited by Omnikam; 13th October 2012 at 16:31.

  9. #9
    Member Karhunen's Avatar
    Join Date
    Dec 2011
    Location
    USA
    Posts
    91
    Thanks
    2
    Thanked 1 Time in 1 Post
    For the previous point: Precomp does not work on file streams like stdin and stdout. AFAIK, only the programs 7za | lzma | srep | zhuff | zhuffhc | xz do this. Precomp also can't always find a match for the zlib memory model and deflate parameters.
    This brings me to the point I am interested in.

    To make Precomp recognize a PNG (or multi-PNG) as a PNG that can be "matched", I sometimes run a PNG (RGB) through FFmpeg or libav's avconv:

    ffmpeg -f image2 -i noprecomp.png -vcodec huffyuv singleframe.avi

    then

    ffmpeg -i singleframe.avi -f image2 singlepng_now_recognized_by_precomp.png

    Actually, the two separate commands above SHOULD be combinable into one, but my pipes are rusty. In any case, Precomp almost always recognizes the reformed singlepng_now_recognized_by_precomp.png as memory level 8 and zlib level 6 ("you can use -zl6 -m8 to speed up precomp").

    So... is there a tool like optipng or pngwolf that always produces a PNG with these parameters, so that lprepaq, precomp, paq8o8pre etc. can compress it effectively?
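    Conceptually, such a tool would just re-deflate the PNG's IDAT data with those fixed parameters. A rough Python sketch (the function name is made up, and it assumes a well-formed file whose last chunk is IEND; it merges all IDAT chunks into one):

```python
import struct
import zlib

def rewrite_png_idat(png, level=6, memlevel=8):
    """Re-deflate a PNG's image data with fixed zlib parameters
    (level 6, memLevel 8, 32 KB window) so tools that probe for
    these common settings can match the stream."""
    assert png[:8] == b"\x89PNG\r\n\x1a\n"

    def chunk(ctype, data):
        crc = zlib.crc32(ctype + data) & 0xffffffff
        return struct.pack(">I", len(data)) + ctype + data + struct.pack(">I", crc)

    # walk the chunk list: length (4) + type (4) + data + crc (4)
    pos, others, idat = 8, [], b""
    while pos < len(png):
        (length,) = struct.unpack(">I", png[pos:pos + 4])
        ctype = png[pos + 4:pos + 8]
        if ctype == b"IDAT":
            idat += png[pos + 8:pos + 8 + length]
        else:
            others.append((ctype, png[pos + 8:pos + 8 + length]))
        pos += 12 + length

    # inflate the old image data, deflate again with fixed parameters
    raw = zlib.decompress(idat)
    co = zlib.compressobj(level, zlib.DEFLATED, 15, memlevel)
    new_idat = co.compress(raw) + co.flush()

    out = [b"\x89PNG\r\n\x1a\n"]
    for ctype, data in others:
        if ctype == b"IEND":
            out.append(chunk(b"IDAT", new_idat))  # single IDAT before IEND
        out.append(chunk(ctype, data))
    return b"".join(out)
```

    The pixel data is byte-identical afterwards; only the deflate container changes, which is exactly what a parameter-matching precompressor cares about.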

  10. #10
    Member toi007's Avatar
    Join Date
    Jun 2011
    Location
    Lisbon
    Posts
    35
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Are you sure people don't take a look at my version of lossy JPG to ZPAQ?

    My post gets about 10% more compression just by dividing the chrominance coefficients of the JPG by 2.
    It's "jpeg to zpaq lossy compression" - people could compare the resulting files in size and quality...

  11. #11
    Member
    Join Date
    Aug 2012
    Location
    Moscow
    Posts
    5
    Thanks
    0
    Thanked 0 Times in 0 Posts

  12. #12
    Member
    Join Date
    Aug 2012
    Location
    Moscow
    Posts
    5
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Also, Precomp still creates temp files, even with 16 GB of RAM. You said in 2008: "Hopefully, the next version will not use temp files anymore but do everything in memory".

  13. #13
    Member
    Join Date
    Mar 2013
    Location
    Windowless Basement
    Posts
    9
    Thanks
    0
    Thanked 1 Time in 1 Post
    An idea that came to me, which may or may not be useful.
    It might be nice to have a flag where Precomp doesn't bother trying to find the zlib settings used to generate a zlib stream. In this case, when recompressing, Precomp would try to use "efficient" deflate implementations to hopefully bring the size of the new deflated stream to <= the size of the old one. If it happens to be smaller, it gets padded to the original size; if the size bound can't be met, Precomp stores the stream without inflating it.
    I know this breaks the idea of being able to reconstruct the original file, but it could be useful if Precomp can't determine how to reconstruct the deflated streams, or just to speed things up.

    I also have not tested at all how this works with most software - in particular, how zlib handles extra padding being appended to the end of a stream, so this idea might not even be viable. It conceptually seems plausible to me at this stage though. Anyone have any ideas on this?
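    One small data point on the padding question: zlib's streaming API stops at the logical end of a stream and reports any trailing bytes separately, so at least zlib-based readers tolerate appended padding. A minimal Python check (hypothetical helper name; whether other deflate consumers behave the same would still need testing):

```python
import zlib

def inflate_ignoring_padding(blob):
    """Inflate a zlib stream and return (payload, trailing_bytes).
    zlib stops at the end-of-stream marker, so padding appended
    after the stream lands in unused_data instead of raising."""
    d = zlib.decompressobj()
    out = d.decompress(blob)
    return out, d.unused_data
```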

    Oh and thanks for precomp - handy tool indeed!

  14. #14
    Member
    Join Date
    Dec 2010
    Location
    Scythia :)
    Posts
    22
    Thanks
    0
    Thanked 0 Times in 0 Posts
    Hi !

    I think I need a little help.
    A file is given where only brute-mode recompression works, but it's very slow. I would like to use Precomp 0.4.3 with multiple instances, but I don't know how.
    How can I do this?
    Please, I need an exact example.
    Thank you very much!

