Thread: packARC v0.7 releases

  1. #1
    Member packDEV's Avatar
    Join Date
    Oct 2011
    Location
    Germany
    Posts
    37
    Thanks
    10
    Thanked 44 Times in 9 Posts

    packARC v0.7 releases

    I don't want to make a new thread every time I increase the subversion number, so I'll post my future packARC updates here. For those who do not know, packARC is a "media archiver" uniting my packJPG, packMP3, packPNM and packARI compression algorithms under one hood. Although it will compress any file you throw at it, it performs best on MP3 and JPEG and performs well on PNM and BMP. For other file types the result depends heavily on their content; in the worst cases packARC may even be outperformed by classic ZIP.

    packARC includes its own source code and is now completely open source under the GPL.

    You may download the newest release of packARC v0.7 from my developer blog, more specifically from this subpage. The most recent version as of this writing is packARC v0.7RC16.

    packARC download page on packjpg.encode.ru

    As always, I'm thankful for bug reports, if anything should come up.
    Last edited by packDEV; 26th February 2014 at 02:05.

  2. The Following 11 Users Say Thank You to packDEV For This Useful Post:

    avitar (28th January 2014),binarysoup (15th January 2014),Bulat Ziganshin (14th January 2014),Jaff (26th February 2014),load (14th January 2014),Matt Mahoney (15th January 2014),msmaniac (15th January 2014),samsat1024 (14th January 2014),Stephan Busch (14th January 2014),surfersat (23rd September 2014),SvenBent (31st January 2014)

  3. #2
    Member
    Join Date
    Feb 2012
    Location
    Sweden
    Posts
    59
    Thanks
    4
    Thanked 5 Times in 5 Posts
    Great, I'll be sure to test it and report back if I encounter any problems.

  4. The Following User Says Thank You to binarysoup For This Useful Post:

    packDEV (16th January 2014)

  5. #3
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    Testing on the 10GB benchmark system 4, packARC fails to extract 7483 of 83437 files and directories. packARC lists each file name followed by "(error extracting file)". The total output is about 7.9 GB. The missing files are many different types. I compiled for Ubuntu using the supplied build_packarc_linux after converting CRLF to LF. The files that extracted successfully compared OK. Compressed size is 4788875190 in 11243 sec.

    Listing shows 79431 files, 9765625kb compressed to 4676635kb (47.89%). (The number of files is right).

  6. The Following User Says Thank You to Matt Mahoney For This Useful Post:

    packDEV (16th January 2014)

  7. #4
    Member packDEV's Avatar
    Join Date
    Oct 2011
    Location
    Germany
    Posts
    37
    Thanks
    10
    Thanked 44 Times in 9 Posts
    Quote Originally Posted by Matt Mahoney View Post
    Testing on the 10GB benchmark system 4, packARC fails to extract 7483 of 83437 files and directories. packARC lists each file name followed by "(error extracting file)". The total output is about 7.9 GB. The missing files are many different types. I compiled for Ubuntu using the supplied build_packarc_linux after converting CRLF to LF. The files that extracted successfully compared OK. Compressed size is 4788875190 in 11243 sec.

    Listing shows 79431 files, 9765625kb compressed to 4676635kb (47.89%). (The number of files is right).
    Thank you, Matt! One other thing you could still try is extracting with the '-i' parameter (ignore CRC errors), i.e.:
    Code:
    packARC x -i bigarchive.pja
    In any case, I'll have to look into that. I've never had any unextractable files before (in Windows).

  8. #5
    Member packDEV's Avatar
    Join Date
    Oct 2011
    Location
    Germany
    Posts
    37
    Thanks
    10
    Thanked 44 Times in 9 Posts
    Alright, just a small update. I repeated Matt's test with the 10GB corpus in Windows (7), compressing and decompressing without any trouble at all. However, the archive size was different for me, so there might have been some trouble already during compression, leading in turn to trouble during decompression. I guess it might have to do with some of the platform-specific stuff then, meaning either large file support or directory recursion.

    Can someone (maybe Matt?) provide a Linux build for me so I won't have to deal with this when setting up Ubuntu in a virtual machine?

    Also (sorry to have ignored this before), I'll globally change CRLF to LF in the next version, so no one has to go through that trouble again.

  9. #6
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    Attached is the packARC binary produced by my 64-bit Ubuntu compile that I used in the tests above. I will test again with -i.

    Edit: "packarc x -i -np ../usb/10gb.pja" produced no error messages and the output "-> extracted all 79431 files from the archive". However, a comparison shows 7483 of 83437 files and directories missing.
    Attached Files
    Last edited by Matt Mahoney; 28th January 2014 at 23:33.

  10. The Following User Says Thank You to Matt Mahoney For This Useful Post:

    packDEV (5th February 2014)

  11. #7
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    packARC.exe extracted OK under Wine, although the Linux version fails. I used options -i -np for both. The archive was produced with the Linux version.
    http://mattmahoney.net/dc/10gb.html (system 4).

  12. The Following User Says Thank You to Matt Mahoney For This Useful Post:

    packDEV (5th February 2014)

  13. #8
    Member packDEV's Avatar
    Join Date
    Oct 2011
    Location
    Germany
    Posts
    37
    Thanks
    10
    Thanked 44 Times in 9 Posts
    Alright, I released a new version of packARC which may or may not fix the issue Matt pointed out. You may download it from my new developer blog:

    packARC v0.7RC15 download page

    Thanks go to the amazing folks here at encode.ru for letting me host this on their server and also for setting up Wordpress (I'm a complete noob in this stuff).


    EDIT:
    I guess I should elaborate on 'may or may not be fixed'. The thing is, even before fixing anything, the errors were not reproducible. With the earlier version, RC13, I had errors in several trials, but I also had some trials where nothing went wrong at all. I have now tested the new RC15 with the 10GB corpus three times and there were no errors at all. Now, does this mean it is fixed? I'm still unsure whether the stuff that I fixed now could have been the only reason behind the bug Matt Mahoney discovered in the Linux version.

    One thing is for sure, though: the Linux problems are in extracting the archive, not in adding files to it. If there are no errors when compressing files to the archive, you can rest assured that everything went well and you'll be able to extract those files later (at least in Windows).
    Last edited by packDEV; 25th February 2014 at 04:22.

  14. #9
    Member packDEV's Avatar
    Join Date
    Oct 2011
    Location
    Germany
    Posts
    37
    Thanks
    10
    Thanked 44 Times in 9 Posts
    It turns out yesterday's release was too hasty: I finally found the actual reason behind all the trouble for packARC on Linux systems. It lies in temp file handling and my wrong assumption that fclose() works the same in every other OS as it does in Windows. Anyway, I've got to fix this first, and there's no real need to test until the next release. More later!
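
    To give a rough idea of the kind of pattern I mean (a hypothetical sketch, not the actual packARC code, and the names are made up): the portable approach is to finish writing, check the return value of fclose(), and only then rename or re-open the temporary file.
    Code:
    // Hypothetical sketch of portable temp file handling (illustration only).
    #include <cstdio>

    bool finalize_temp_file(const char* tmp_path, const char* final_path) {
        FILE* tmp = std::fopen(tmp_path, "wb");
        if (!tmp) return false;
        // ... write the archive data to tmp here ...
        if (std::fclose(tmp) != 0)    // fclose() flushes the buffers; its return value
            return false;             // is the last chance to catch a write error
        // rename only after the file is fully closed - Windows refuses to rename a
        // file that is still open, while other systems are more permissive, which is
        // the kind of platform difference that can hide a bug like this
        return std::rename(tmp_path, final_path) == 0;
    }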

  15. The Following User Says Thank You to packDEV For This Useful Post:

    samsat1024 (25th February 2014)

  16. #10
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    Glad you found it. I compressed and extracted 10gb with packARC.lxe and got the same error as before (7183 errors extracting files). packARC.exe extracts the same archive OK under Wine.

  17. The Following User Says Thank You to Matt Mahoney For This Useful Post:

    packDEV (26th February 2014)

  18. #11
    Member packDEV's Avatar
    Join Date
    Oct 2011
    Location
    Germany
    Posts
    37
    Thanks
    10
    Thanked 44 Times in 9 Posts
    Alright, I think I finally fixed all remaining known bugs. I couldn't run any bigger tests yet, but I'm very optimistic about this one. Grab packARC v0.7RC16 from the official download page.

    And also, a big thank you to Matt for testing the earlier version.

  19. The Following User Says Thank You to packDEV For This Useful Post:

    load (26th February 2014)

  20. #12
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    The Linux version of packARC 0.7RC16 still doesn't work. I attached the output of "diff -r 10gb tmp/10gb" where tmp/10gb is the decompressed output. The computer is the same as system 4 on the 10gb benchmark.
    Attached Files

  21. The Following User Says Thank You to Matt Mahoney For This Useful Post:

    packDEV (19th March 2014)

  22. #13
    Member
    Join Date
    Sep 2011
    Location
    uk
    Posts
    237
    Thanks
    186
    Thanked 16 Times in 11 Posts
    Downloaded the latest packarc 0.7. Ran it on a 10-year-old X40 laptop with only 1.5 GB of memory (!!) and a single 1.4 GHz processor running XP SP3. I just copied packarc.exe from the zip - I could not see anything else needed, e.g. .dlls. Was this correct?

    Used it to try to compress 369 .jpgs of about 2.9 MB each, family snaps taken with a cheap 10-megapixel Samsung camera. It seemed to work, although it took a long time, e.g. 2 hours or so. It reduced the size from 1.25 GB to 1.03 GB, 82% according to 'packarc l archive'.

    Qs
    1) Is this compression, i.e. 80% or so, roughly what I might expect?
    2) I don't see any options to change stuff, as is quite common with compressors like arc, 7zip etc., e.g. less compression but faster, or use a maximum of xxx MB of memory. Would be useful IMO.
    3) Any other free Windows JPG compressors I could try to get a bit better compression? Or less compression, but faster?
    4) BTW, I just used c:\cam as the archive name - it didn't generate an extension for it and just created the file c:\cam.
    5) I think the output from 'l' should be redirectable to a file as is 'normal', e.g. packarc l archive > tmp.tmp or packarc l archive | more - I can't see the point of the 'press enter' at the end.
    6) I ran it like this: packarc a cam cam\*.* > log.tmp. The progress bar in log.tmp seemed to be on one very long and confusing line - maybe there could be a switch to replace it with one line per file added?

    Thanks for useful utility...

    John

  23. The Following User Says Thank You to avitar For This Useful Post:

    packDEV (19th March 2014)

  24. #14
    Member
    Join Date
    Sep 2011
    Location
    uk
    Posts
    237
    Thanks
    186
    Thanked 16 Times in 11 Posts
    Found another funny one: packarc does not appear to set the cmd `errorlevel` - it always seems to be set to 1. It should be 0 for success, or some other value on error, e.g. cannot find archive, no files processed, or whatever. This is important in a batch file so one can take action if packarc fails.

    John

  25. #15
    Member packDEV's Avatar
    Join Date
    Oct 2011
    Location
    Germany
    Posts
    37
    Thanks
    10
    Thanked 44 Times in 9 Posts
    Sorry for the long radio silence, and thanks, Matt, for testing again. Even now, with your diff.txt file, I can't reproduce that error. Duh... I need to find the cause for this some other way. Does the number of compressed files show as correct? And does the number of extracted files, too?

    Avitar, that's a long list, but I'll try to reply to everything.
    1.) & 3.) Yes, for JPEG 80% is actually a good compression ratio. You may also try the most recent version of Matt Mahoney's excellent PAQ8 or (if you're willing to pay) StuffIt. Other solutions tend to have worse compression ratios or are not truly lossless.
    2.) Well, the problem here is that packARC actually contains various compression libraries, and for most of them no such settings are available (= they wouldn't make sense anyway). Add to this the fact that packARC is intended for archiving media files (JPG/MP3) rather than being a universal compressor. I'm thinking about it, though, but it would only affect non-JPG/MP3/BMP/PNM files.
    4.) You have to add the extension yourself; it is actually intended this way. Extensions are something to be handled by the frontend, and the included frontend is very basic.
    5.) I didn't fully understand this. There's a command-line option to skip the <Press ENTER>, though; that pause is there for when the packARC executable is used via drag and drop.
    6.) Yes, I know, packARC's output is not very well suited for storing in a log file. I'll think about improving this, but that's frontend stuff again and I'm mainly concentrating on the compression library itself.
    7.) Setting errorlevel: Can you show me a source code example? I guess a good way to do this would be errorlevel 0 when all files are processed, and something else when errors occurred.

  26. #16
    Member packDEV's Avatar
    Join Date
    Oct 2011
    Location
    Germany
    Posts
    37
    Thanks
    10
    Thanked 44 Times in 9 Posts
    Quote Originally Posted by Matt Mahoney View Post
    ... packARC lists each file name followed by "(error extracting file)". The total output is about 7.9 GB. The missing files are many different types. ...
    Forgot about this, so my earlier question is already answered, I guess. So the files extract okay in Windows, but not in Linux, from the same archive. That's really strange; I guess I still need to think a little about possible causes for this.

  27. #17
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    I haven't looked at the code, but maybe if extracted files are not closed before opening the next one, the limit on open files is exceeded. I recall that "packarc l archive" shows the correct contents. What Linux system are you using for testing?
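
    To illustrate what I mean (just a sketch of the suspected failure mode, not a claim about packARC's actual code): if each output file were opened but never closed, fopen() would eventually fail with EMFILE once the per-process limit on open files (often 1024 on Linux) is reached, and every file after that point would show up as an extraction error.
    Code:
    // Hypothetical sketch of a handle leak during extraction (not packARC code).
    #include <cerrno>
    #include <cstdio>
    #include <cstring>
    #include <string>

    void extract_all(int nfiles) {
        for (int i = 0; i < nfiles; ++i) {
            std::string name = "out" + std::to_string(i);
            FILE* f = std::fopen(name.c_str(), "wb");
            if (!f) {  // with a leak this starts firing once the fd limit is hit (errno == EMFILE)
                std::fprintf(stderr, "%s (error extracting file): %s\n",
                             name.c_str(), std::strerror(errno));
                continue;
            }
            // ... write the decompressed data for this file ...
            std::fclose(f);  // omitting this close is exactly the kind of leak suspected here
        }
    }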

  28. The Following User Says Thank You to Matt Mahoney For This Useful Post:

    packDEV (20th March 2014)

  29. #18
    Member packDEV's Avatar
    Join Date
    Oct 2011
    Location
    Germany
    Posts
    37
    Thanks
    10
    Thanked 44 Times in 9 Posts
    I'm using Xubuntu 32-bit in VirtualBox - I didn't manage to get any 64-bit Linux to run in the virtual machine. Also, thank you for the idea! Something like unclosed file handles seems plausible, especially if it is not always the same files that generate errors. I guess I'll try to find out with valgrind, as described here:
    http://nirbhay.in/2013/10/how-to-det...-file-handles/

    More later!

  30. #19
    Member
    Join Date
    Sep 2011
    Location
    uk
    Posts
    237
    Thanks
    186
    Thanked 16 Times in 11 Posts
    Quote Originally Posted by packDEV View Post
    7.) Setting errorlevel: Can you show me a source code example? I guess a good way to do this would be errorlevel 0 when all files are processed, and something else when errors occurred.

    Thanks for the answers. A pity, but I can see why there aren't any tuning parameters re memory or speed. I guess some of the libraries do take parameters, but not the ones you use. I've tried packarc on my FX-4100 4 GHz 4-processor system (a lot faster than my old laptop!), but packarc only seems to use 1 (or 2?) processors - I guess this is the same problem re the libraries?

    Matt suggested paq8 - I tried paq8px and even with parameters set for speed it is a lot slower (20% of the speed!) than packarc, though it does give better compression, e.g. about 70%? of the size (I'm still testing). On balance, though, this is not really worth it - for my 400 files packarc takes 1-2 hours, paq 8-10 hours, on my old 1.4 GHz laptop.

    Re errorlevel, the key thing is that at present it always seems to be 1! The 'compression convention' for its use seems to be 0, as you suggest, if the archive is found, the parameters are OK and at least 1 source file is found and processed, else 1. Alternatively 0 for success, 2 for parameter error and/or not enough memory, 1 for missing archive, 3 for no source files. This means one can check for errorlevel not 0 and e.g. send an email saying 'compression failed', or retry, or whatever.
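
    Roughly like this, as a sketch of that convention (illustrative only, not packARC's actual frontend code): whatever main() returns is what cmd.exe exposes as %ERRORLEVEL% (and what a Unix shell sees as $?).
    Code:
    // Sketch of the exit code convention described above (hypothetical values/checks).
    #include <cstdio>

    enum ExitCode { EC_OK = 0, EC_NO_ARCHIVE = 1, EC_BAD_PARAMS = 2, EC_NO_SOURCE = 3 };

    int main(int argc, char** argv) {
        if (argc < 3) {   // e.g. "packarc a <archive> <files...>"
            std::fprintf(stderr, "usage: packarc <command> <archive> [files]\n");
            return EC_BAD_PARAMS;
        }
        // ... open the archive (return EC_NO_ARCHIVE if it is missing), process the
        // source files (return EC_NO_SOURCE if none were found) ...
        return EC_OK;
    }
    A batch file can then check the result with something like 'if errorlevel 1 echo compression failed'.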

    John

  31. #20
    Member
    Join Date
    Apr 2009
    Location
    here
    Posts
    202
    Thanks
    165
    Thanked 109 Times in 65 Posts
    I made an x64 build of this; it runs slightly faster (Core i5-3230M).

    128 jpg files (~ 540mb):
    official build: 612s
    x64 build: 535s

    19 mp3 files (~170mb)
    official build: 102s
    x64 build: 92s

    I haven't tested any other files.

    It is built with a GCC 7 dev build and the newest packarc sources I could find.

    /edit: I'm doing this just for fun, nothing more behind it. I know it doesn't offer anything special.
    If someone can use it, fine. If not, fine too.
    Attached Files

