
Thread: NanoZip - a new archiver, using bwt, lz, cm, etc...

  1. #61
    Programmer
    Join Date
    Jul 2008
    Location
    Finland
    Posts
    102
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by Zonder View Post
    Tested -cO with double the RAM (-m1250m) and got much faster compression/decompression speed than with the -m625m switch. Is that normal, or did I mix something up? Maybe I'll rerun the -m625m test.

    Code:
    33.251% //   214kb/s //  1628kb/s(no/o) // *NanoZip v0.0a -cO -m1250m -r -nt -np
    33.466% //   146kb/s //  1102kb/s(no/o) // NanoZip v0.0a -cO -m625m -r -nt -np
    It's possible that such stuff happens sometimes.

    Quote Originally Posted by Bulat
    Knowing you, I hope that you will make a practical archiver.
    Sure. NZ's goal is to be practical, but the practicality must wait a bit until the compressors are more mature.

  2. #62
    Programmer
    Join Date
    Jul 2008
    Location
    Finland
    Posts
    102
    Thanks
    0
    Thanked 1 Time in 1 Post

    NanoZip 0.01 alpha

    A new version is at www.nanozip.net

    Changes:

    0.01 alpha, 14 July, 2008
    - various compression related fixes and tweaks mainly concerning
    multifile compression with nz_optimum1, nz_optimum2, nz_cm
    - fixed a bug (thanks to pat357 for reporting it)
    - fixed 2 bugs which may have caused crashes
    - all compressors should now report memory usage accurately
    and custom memory settings should be approximately followed
    - default/standard memory usage is now set to 400-450 MB
    - memory usage will be automatically reduced if no such amount
    of memory is available (one possible fallback scheme is sketched after this list)
    - fixed large file support (technically NZ format supports
    infinite file size) (thanks to Zonder for reporting the bug)
    - fixed '//' which sometimes appeared when using the gui
    - the help command option shows advanced options; some of these
    may help benchmarkers with non-configurable automated tools
    - gui will (by default) sort files by name
    - gui uses distinguishable color for nanozip archive files
    - many thanks to Stephan Busch for extensive testing
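
    For illustration, the automatic memory reduction mentioned above could work by shrinking the requested budget until an allocation succeeds. A minimal sketch in Python, assuming a simple halving strategy (this is not NanoZip's actual code):

    Code:
    def pick_memory_budget(requested_mb, floor_mb=64):
        """Return the largest budget <= requested_mb that can actually be reserved."""
        mb = requested_mb
        while mb >= floor_mb:
            try:
                trial = bytearray(mb * 1024 * 1024)  # trial allocation of the full budget
                del trial                            # release it again immediately
                return mb
            except MemoryError:
                mb //= 2                             # not available: halve and retry
        return floor_mb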

  3. #63
    Moderator

    Join Date
    May 2008
    Location
    Tristan da Cunha
    Posts
    2,034
    Thanks
    0
    Thanked 4 Times in 4 Posts


    Thanks Sami!

  4. #64
    Member
    Join Date
    May 2008
    Location
    Antwerp , country:Belgium , W.Europe
    Posts
    487
    Thanks
    1
    Thanked 3 Times in 3 Posts

    Nanozip v0.01: "Archive corrupted. Error decoding (code 109)"

    It seems the error I reported last time still occurs on some files:
    I tried to compress my mailbox (TheBat .TBI file, in a TAR, about 275 MB).

    Code:
    G:\test\thebat>nz a -cO -m512m tbtbar_nz001_m512m_cOgr tb.tar
    NanoZip 0.01 alpha - Copyright (C) 2008 Sami Runsas - www.nanozip.net
    CPU: Intel(R) Core(TM)2 Quad CPU           @ 2.40GHz, cores # 4, memory: 2048/2048 MB
     *** THIS IS AN EARLY ALPHA VERSION *** USE ONLY FOR TESTING ***
    Archive file: tbtbar_nz001_m512m_cOgr.nz
    Compressor: nz_optimum2, using 543 MB memory.
    Compressed 275 472 384 into 170 166 622 in 2m 49.96s, 1 583 KB/s
    IO-in: 0.29s, 900 MB/s. IO-out: 1.44s, 112 MB/s
    
    G:\test\thebat>nz t tbtbar_nz001_m512m_cOgr
    NanoZip 0.01 alpha - Copyright (C) 2008 Sami Runsas - www.nanozip.net
    CPU: Intel(R) Core(TM)2 Quad CPU           @ 2.40GHz, cores # 4, memory: 2048/2048 MB
     *** THIS IS AN EARLY ALPHA VERSION *** USE ONLY FOR TESTING ***
    Archive file: tbtbar_nz001_m512m_cOgr.nz
    Compressor: nz_optimum2, using 347 MB memory.
    tb.tar 60 MB
    Archive corrupted. Error decoding (code 109)
    As before, I'd really like to provide you with this file, but because it contains all my mail from 2008,
    we have to find another way for you to look at why this error still shows up.
    Last edited by pat357; 15th July 2008 at 00:58.

  5. #65
    Programmer
    Join Date
    Jul 2008
    Location
    Finland
    Posts
    102
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by pat357 View Post
    It seems the error I reported last time still occurs on some files:
    I tried to compress my mailbox (TheBat .TBI file, in a TAR, about 275 MB).

    Code:
    G:\test\thebat>nz a -cO -m512m tbtbar_nz001_m512m_cOgr tb.tar
    ...
    Archive corrupted. Error decoding (code 109)
    That's depressing. It's not the same bug (as with Shelwien's data), since that one should be fixed. I guess you cannot send your mailbox to me, so I'll have to wait until other such files are found.

    --edit: forgot to say thanks. You've already found two bugs. Many others have compressed gigabytes of data and found nothing.
    Last edited by Sami; 15th July 2008 at 00:50.

  6. #66
    Tester
    Black_Fox's Avatar
    Join Date
    May 2008
    Location
    [CZE] Czechia
    Posts
    471
    Thanks
    26
    Thanked 9 Times in 8 Posts
    Tried compressing TortoiseSVN-managed ReactOS source code:
    Code:
    C:\>nz a -cO -m1200m -r reactos-cO-m1200m-r.nz001 C:\__Reactos
    NanoZip 0.01 alpha - Copyright (C) 2008 Sami Runsas - www.nanozip.net
    CPU: AMD Athlon(tm) 64 X2 Dual Core Processor 3800+, cores # 2, memory: 1499/2047 MB
     *** THIS IS AN EARLY ALPHA VERSION *** USE ONLY FOR TESTING ***
    Archive file: reactos-cO-m1200m-r.nz001.nz
    Overwrite reactos-cO-m1200m-r.nz001.nz (y/n)? Yes
    Compressor: nz_optimum2, using 1413 MB memory.
    ...ls/3rdparty/icu/source/data/mappings/ibm-9067_X100-2005.ucm 550 MB, 96%
    Then it crashed with the generic "There has been a problem with application nz.exe which will be terminated" dialog. At that point Task Manager showed 1354 MB memory usage. There are almost 50,000 files in there - could that be the reason?
    Last edited by Black_Fox; 15th July 2008 at 03:40.
    I am... Black_Fox... my discontinued benchmark
    "No one involved in computers would ever say that a certain amount of memory is enough for all time? I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There's never a citation; the quotation just floats like a rumor, repeated again and again." -- Bill Gates

  7. #67
    Programmer
    Join Date
    Jul 2008
    Location
    Finland
    Posts
    102
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by Black_Fox View Post
    Tried compressing TortoiseSVN-managed ReactOS source code:
    Code:
    C:\>nz a -cO -m1200m -r reactos-cO-m1200m-r.nz001 C:\__Reactos
    NanoZip 0.01 alpha - Copyright (C) 2008 Sami Runsas - www.nanozip.net
    CPU: AMD Athlon(tm) 64 X2 Dual Core Processor 3800+, cores # 2, memory: 1499/2047 MB
     *** THIS IS AN EARLY ALPHA VERSION *** USE ONLY FOR TESTING ***
    Archive file: reactos-cO-m1200m-r.nz001.nz
    Overwrite reactos-cO-m1200m-r.nz001.nz (y/n)? Yes
    Compressor: nz_optimum2, using 1413 MB memory.
    ...ls/3rdparty/icu/source/data/mappings/ibm-9067_X100-2005.ucm 550 MB, 96%
    Then it crashed with the generic "There has been a problem with application nz.exe which will be terminated" dialog. At that point Task Manager showed 1354 MB memory usage. There are almost 50,000 files in there - could that be the reason?
    Thanks for testing. The number of files doesn't matter. NZ is very aggressive with memory, so could you please repeat this test to see if it crashes again? If it does crash, can you tell me which ReactOS package I should download to test this myself? Also, can you tell me whether your computer or memory is overclocked?

  8. #68
    Member
    Join Date
    May 2008
    Location
    Antwerp , country:Belgium , W.Europe
    Posts
    487
    Thanks
    1
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by Sami View Post
    That's depressing. It's not the same bug (as with Shelwien's data), since that one should be fixed. I guess you cannot send your mailbox to me, so I'll have to wait until other such files are found.

    --edit: forgot to say thanks. You've already found two bugs. Many others have compressed gigabytes of data and found nothing.
    Sami, I've cut the file down into 7 MB parts: on one part I got the same error when testing.
    The file compresses down to 4.5 MB, so that's a lot easier to handle than the complete 270 MB.
    If you're interested, I can mail this file to you or upload it to something like Rapidshare. Just let me know...

    --edit: forgot to ask, but does NZ try to decode UUE/MIME parts before compression? I ask because this file contains mostly UUE/MIME encoded attachments.
    Last edited by pat357; 15th July 2008 at 16:49.

  9. #69
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    856
    Thanks
    45
    Thanked 104 Times in 82 Posts
    Oh, I see that my favorite cue/bin-with-audio-tracks compression program has been updated.

    Is there some kind of special testing that would be of interest to you, Sami?

  10. #70
    Programmer
    Join Date
    Jul 2008
    Location
    Finland
    Posts
    102
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by pat357 View Post
    Sami, I've cut the file down into 7 MB parts: on one part I got the same error when testing.
    The file compresses down to 4.5 MB, so that's a lot easier to handle than the complete 270 MB.
    If you're interested, I can mail this file to you or upload it to something like Rapidshare. Just let me know...
    Nice. Thanks a lot. Please email me at sami at nanozip dot net and I will give you FTP access so you can upload the file, if that's OK with you.

    --edit: forgot to ask, but does NZ try to decode UUE/MIME parts before compression? I ask because this file contains mostly UUE/MIME encoded attachments.
    No. I think it wouldn't help much, since the decoded content is probably zip or some other compressed format.
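
    For illustration: a general-purpose compressor already models the 64-symbol Base64 alphabet fairly well, so explicitly decoding MIME parts whose payload is itself compressed gains little. A minimal sketch in Python, using random bytes as a stand-in for an already-compressed attachment:

    Code:
    import base64, os, zlib
    
    raw = os.urandom(1_000_000)        # stand-in for an already-compressed attachment
    enc = base64.b64encode(raw)        # MIME-style Base64 inflates it to ~4/3 the size
    
    print(len(zlib.compress(enc, 9)))  # ~1,000,000: the coder recovers the 6-bit alphabet
    print(len(zlib.compress(raw, 9)))  # ~1,000,000: decoding first gains almost nothing here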

  11. #71
    Programmer
    Join Date
    Jul 2008
    Location
    Finland
    Posts
    102
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by SvenBent View Post
    Is there some kind of special testing that would be of interest to you, Sami?
    Any kind of testing is appreciated. Simply compressing everything you can think of (preferably not all at once, but in reasonable sessions of, say, <100 MB) and then testing the archive is useful for checking stability. Compression tests, like finding specific files where NZ does well or badly, are useful too. Of course, compression comparisons with other programs are very interesting as well, since there are currently very few tests establishing how NZ compares to other compressors.
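
    Such session-by-session stability testing is easy to script. A minimal sketch in Python, assuming nz is on the PATH and returns a non-zero exit code when a test fails (the a/add and t/test commands appear elsewhere in this thread; the exit-code convention is an assumption):

    Code:
    import subprocess, sys
    from pathlib import Path
    
    def stress_test(root):
        # Compress each subfolder into its own archive, then test that archive.
        for folder in Path(root).iterdir():
            if not folder.is_dir():
                continue
            archive = folder.name + "_test"
            subprocess.run(["nz", "a", "-cO", "-r", archive, str(folder)], check=True)
            if subprocess.run(["nz", "t", archive]).returncode != 0:
                print("FAILED:", folder)  # candidate material for a bug report
    
    if __name__ == "__main__":
        stress_test(sys.argv[1])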

  12. #72
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    856
    Thanks
    45
    Thanked 104 Times in 82 Posts
    Well, no answer. Then I'll just go around lobbying for Sami to implement ECM prefiltering on cue/bin, CloneCD, and mdf/mds files.

    Hexen II.bin 712,434 KB
    Hexen II.bin.ecm 687,299 KB
    That's a saving of 25,135 KB.

    Hexen2.nz 322,561 KB
    Hexen2noecm.nz 348,812 KB
    The end file is 26,251 KB smaller.

  13. #73
    Programmer
    Join Date
    Jul 2008
    Location
    Finland
    Posts
    102
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by SvenBent View Post
    Well, no answer. Then I'll just go around lobbying for Sami to implement ECM prefiltering on cue/bin, CloneCD, and mdf/mds files.
    Hexen2.nz 322,561 KB
    Hexen2noecm.nz 348,812 KB
    The end file is 26,251 KB smaller.
    I like compression only. I would prefer not to even have a user interface for NZ, but then nobody else would even know about it. I see from your results that ECM works as it is with NZ, and we don't need it integrated. I think the time is better spent on thinking about compression than integrating other software.

  14. #74
    Member
    Join Date
    May 2008
    Location
    Antwerp , country:Belgium , W.Europe
    Posts
    487
    Thanks
    1
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by Sami View Post
    Nice. Thanks a lot. Please email me at sami at nanozip dot net and I will give you FTP access so you can upload the file, if that's OK with you.
    Check your mail, please!

    These are the results from the 7 MB chunk:
    Code:
    G:\test\thebat\test nz\crash>nz a -co -m512m pat357i_co_m512m pat357i.crash
    NanoZip 0.01 alpha - Copyright (C) 2008 Sami Runsas - www.nanozip.net
    CPU: Intel(R) Core(TM)2 Quad CPU           @ 2.40GHz, cores # 4, memory: 2048/2048 MB
     *** THIS IS AN EARLY ALPHA VERSION *** USE ONLY FOR TESTING ***
    Archive file: pat357i_co_m512m.nz
    Compressor: nz_optimum1, using 543 MB memory.
    Compressed 7 340 032 into 4 823 392 in 3.83s, 1 872 KB/s
    IO-in: 0.01s, 636 MB/s. IO-out: 0.03s, 135 MB/s
    
    G:\test\thebat\test nz\crash>nz t pat357i_co_m512m
    NanoZip 0.01 alpha - Copyright (C) 2008 Sami Runsas - www.nanozip.net
    CPU: Intel(R) Core(TM)2 Quad CPU           @ 2.40GHz, cores # 4, memory: 2048/2048 MB
     *** THIS IS AN EARLY ALPHA VERSION *** USE ONLY FOR TESTING ***
    Archive file: pat357i_co_m512m.nz
    Compressor: nz_optimum1, using 347 MB memory.
    pat357i.crash 6 820 KB
    Archive corrupted. Error decoding (code 109)
    
    G:\test\thebat\test nz\crash>nz a -cO -m512m pat357i_cOgr_m512m pat357i.crash
    NanoZip 0.01 alpha - Copyright (C) 2008 Sami Runsas - www.nanozip.net
    CPU: Intel(R) Core(TM)2 Quad CPU           @ 2.40GHz, cores # 4, memory: 2048/2048 MB
     *** THIS IS AN EARLY ALPHA VERSION *** USE ONLY FOR TESTING ***
    Archive file: pat357i_cOgr_m512m.nz
    Compressor: nz_optimum2, using 543 MB memory.
    Compressed 7 340 032 into 4 786 583 in 4.72s, 1 518 KB/s
    IO-in: 0.01s, 700 MB/s. IO-out: 0.03s, 117 MB/s
    
    G:\test\thebat\test nz\crash>nz t pat357i_cOgr_m512m
    NanoZip 0.01 alpha - Copyright (C) 2008 Sami Runsas - www.nanozip.net
    CPU: Intel(R) Core(TM)2 Quad CPU           @ 2.40GHz, cores # 4, memory: 2048/2048 MB
     *** THIS IS AN EARLY ALPHA VERSION *** USE ONLY FOR TESTING ***
    Archive file: pat357i_cOgr_m512m.nz
    Compressor: nz_optimum2, using 347 MB memory.
    pat357i.crash 6 820 KB
    Archive corrupted. Error decoding (code 109)
    
    Last edited by pat357; 15th July 2008 at 19:30.

  15. #75
    Tester
    Black_Fox's Avatar
    Join Date
    May 2008
    Location
    [CZE] Czechia
    Posts
    471
    Thanks
    26
    Thanked 9 Times in 8 Posts
    Quote Originally Posted by Sami View Post
    Thanks for testing. The number of files doesn't matter. NZ is very aggressive with memory, so could you please repeat this test to see if it crashes again? If it does crash, can you tell me which ReactOS package I should download to test this myself? Also, can you tell me whether your computer or memory is overclocked?
    It crashed two more times on the same file, so I guess it's fully repeatable. It's SVN revision 34338 (svn://svn.reactos.org/reactos/trunk/reactos), but I don't know what data TortoiseSVN stores alongside the source code. My system runs entirely at factory clocks, so no overclocking (some basic information about it is here)
    Last edited by Black_Fox; 15th July 2008 at 21:00.
    I am... Black_Fox... my discontinued benchmark
    "No one involved in computers would ever say that a certain amount of memory is enough for all time? I keep bumping into that silly quotation attributed to me that says 640K of memory is enough. There's never a citation; the quotation just floats like a rumor, repeated again and again." -- Bill Gates

  16. #76
    Programmer
    Join Date
    Jul 2008
    Location
    Finland
    Posts
    102
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by Black_Fox View Post
    It crashed two more times on the same file, so I guess it's fully repeatable.
    ... here)
    Since it's an SVN checkout, I doubt I could reproduce the error. If you don't mind uploading the files for me, write to me at sami at nanozip dot net. If you don't, I understand, since this is quite a large piece of data.

  17. #77
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    856
    Thanks
    45
    Thanked 104 Times in 82 Posts
    It seems 0.01 has some compression improvements in -co (optimum1?)

    Hexen2 NZ 0.00 = 322,561 KB
    Hexen2 NZ 0.01 = 319,399 KB

    This is a cue/bin image with audio tracks.
    The .bin file is pre-filtered with ECM (oh, I wish NZ did that itself).

    After decompression all files were identical to the originals,
    tested with both MD5 and CRC32 checksums.

    The Hexen2 mission pack was also compressed, decompressed and certified OK by MD5 and CRC32.
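
    A round-trip check like this is easy to automate. A minimal sketch in Python that compares the MD5 and CRC32 of every file in the original tree against the restored tree (the two-directory layout is an assumption for illustration):

    Code:
    import hashlib, zlib
    from pathlib import Path
    
    def digests(path):
        # Stream the file once, computing MD5 and CRC32 together.
        md5, crc = hashlib.md5(), 0
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                md5.update(chunk)
                crc = zlib.crc32(chunk, crc)
        return md5.hexdigest(), crc
    
    def verify(original_dir, restored_dir):
        ok = True
        for orig in Path(original_dir).rglob("*"):
            if orig.is_file():
                restored = Path(restored_dir) / orig.relative_to(original_dir)
                if digests(orig) != digests(restored):
                    print("MISMATCH:", orig)
                    ok = False
        return ok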

  18. #78
    Programmer
    Join Date
    Jul 2008
    Location
    Finland
    Posts
    102
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by SvenBent View Post
    It seems 0.01 has some compression improvements in -co (optimum1?)

    Hexen2 NZ 0.00 = 322,561 KB
    Hexen2 NZ 0.01 = 319,399 KB

    The Hexen2 mission pack was also compressed, decompressed and certified OK by MD5 and CRC32.
    Yes, for some files this version may compress a bit better. Do you happen to have numbers for FreeArc/RAR/etc. for those files, for comparison?

  19. #79
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    856
    Thanks
    45
    Thanked 104 Times in 82 Posts
    Quote Originally Posted by Sami View Post
    Yes, for some files this version may compress a bit better. Do you happen to have numbers for FreeArc/RAR/etc. for those files, for comparison?
    No, I never seem to figure out how to use FreeArc.

    But I will have RZM/CCM(x) and Blizzard numbers for you tomorrow, with and without BWT, delta and REP pre-filtering.

  20. #80
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    856
    Thanks
    45
    Thanked 104 Times in 82 Posts
    Some figures for the blood2.iso file size and decompression times

    Code:
    Blood2.iso		623,204kb - 30s
    Blood2.nz		342,939kb - 110s
    
    Blood2.iso.rzm		353,034kb - 125s
    Blood2.iso.del.rzm	348,929kb - 130s
    Blood2.iso.del.rep.rzm	347,824kb - 135s
    Decompressed file verified against the original by MD5 and CRC32.

    Please note that the 30 s decompression time for the .iso file is actually the time for my Extract.bat batch file to run through with nothing to do.
    The 30-second delay is due to multithread synchronization in my batch file,

    so you have to subtract around 30 s from the posted decompression times to get the real decompression times.


    After this test I fine-tuned my Extract.bat with finer time resolution in the multithreaded synchronization checks.
    Now the extra delay is down to around 3-4 seconds. If I don't use any delays, my Extract.bat locks up one core with continuous synchronization checks, which again results in slower decompression times.
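
    Rather than subtracting batch-file overhead afterwards, each decompression can be timed directly. A minimal sketch in Python, where the command line passed in is whichever extractor is being measured:

    Code:
    import subprocess, sys, time
    
    # Usage: python time_extract.py <extractor command and its arguments>
    start = time.perf_counter()
    subprocess.run(sys.argv[1:], check=True)   # wall-clock time of the process alone
    print(f"{time.perf_counter() - start:.2f}s")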

  21. #81
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    856
    Thanks
    45
    Thanked 104 Times in 82 Posts
    Another ISO image, this time of the game Unreal.

    Code:
    980517_1247.iso			519,083kb
    
    unreal.nz (0.01 -co)		272,428kb
    unreal.nz (0.00 -co)		274,163kb
    
    980517_1247.iso.rzm		252,280kb
    980517_1247.iso.del.rep.rzm	248,980kb
    
    980517_1247.iso.blz 		288,726kb
    980517_1247.iso.ccmx (6)	251,016kb
    
    980517_1247.rar (best)		280,430kb
    980517_1247.7z (ultra)		258,544kb
    This time RZM with the delta and REP prefilters is the winner; it even beats CCMx.

    But it's still nice to see that 0.01 has some compression improvements over 0.00.

    Both NanoZip archives were decompressed and the files certified OK by MD5 checking.
    Last edited by SvenBent; 16th July 2008 at 14:22.

  22. #82
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    856
    Thanks
    45
    Thanked 104 Times in 82 Posts
    An Unreal expansion ISO file


    Code:
    UnrealXP.iso			602,361kb
    
    UnrealXP.nz (0.01)		377,807kb
    UnrealXP.nz (0.00)		380,014kb
    
    UnrealXP.iso.rzm		358,339kb
    UnrealXP.iso.del.rep.rzm	355,577kb
    
    UnrealXP.iso.blz		394,554kb
    UnrealXP.iso.ccmx (6)		354,849kb
    
    UnrealXP.rar			391,101kb
    UnrealXP.7z			372,751kb
    Same procedure as the last test.

  23. #83
    Programmer
    Join Date
    Jul 2008
    Location
    Finland
    Posts
    102
    Thanks
    0
    Thanked 1 Time in 1 Post
    Quote Originally Posted by SvenBent View Post
    Code:
    980517_1247.iso			519,083kb
    unreal.nz (0.01 -co)		272,428kb
    unreal.nz (0.00 -co)		274,163kb
    ...
    980517_1247.iso.del.rep.rzm	248,980kb
    ...
    980517_1247.7z (ultra)		258,544kb
    This time RZM with the delta and REP prefilters is the winner; it even beats CCMx.
    The -co mode may give such poor results because it's unfinished. Can you try the -cO mode? It doesn't lose to rzm/7z in my tests (comparing size only).

  24. #84
    Member
    Join Date
    May 2008
    Location
    Antwerp , country:Belgium , W.Europe
    Posts
    487
    Thanks
    1
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by SvenBent View Post
    No, I never seem to figure out how to use FreeArc.
    Does something like "arc a -mx tst files" not work for you?
    (It's best to copy the 3 files arc.exe, arc.ini and arc.groups to your Windows dir.)

  25. #85
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    856
    Thanks
    45
    Thanked 104 Times in 82 Posts
    Quote Originally Posted by pat357 View Post
    Does something like "arc a -mx tst files" not work for you?
    (It's best to copy the 3 files arc.exe, arc.ini and arc.groups to your Windows dir.)
    Well, I never tried. The list of commands and options scared me away,
    but since it only needs those two I'll try looking into Arc again.

    BTW, the reason I'm using -co is decompression speed:
    -cO gives files that are too slow to decompress. Anything slower than RZM starts to be too slow for my liking.
    Last edited by SvenBent; 16th July 2008 at 20:10.

  26. #86
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,497
    Thanks
    733
    Thanked 659 Times in 354 Posts
    How about reading the docs? FreeArc is also highly compatible with rar.

  27. #87
    Member
    Join Date
    Sep 2007
    Location
    Denmark
    Posts
    856
    Thanks
    45
    Thanked 104 Times in 82 Posts
    BTW, does FreeArc have some kind of brute-force option?

    Right now I'm brute-forcing with RZM and the different combos of the delta and REP
    prefilters, and also NanoZip, 7-Zip and WinRAR of course.

    If FreeArc had some kind of brute-forcing option it would be nice, as I don't care about compression time.
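
    Such a brute-force search is easy to script externally. A minimal sketch in Python that tries every combination of optional prefilters (in a fixed order) before the final compressor and keeps the smallest result; the filter and compressor command lines are placeholders, not real invocations:

    Code:
    import itertools, os, subprocess
    
    PREFILTERS = [["delta_filter.exe"], ["rep_filter.exe"]]  # hypothetical filter commands
    COMPRESS = ["rzm_pack.exe"]                              # hypothetical compressor command
    SOURCE = "image.iso"
    
    best = (float("inf"), None)
    for n in range(len(PREFILTERS) + 1):
        for combo in itertools.combinations(PREFILTERS, n):
            data = SOURCE
            for step, filt in enumerate(combo):
                out = f"{SOURCE}.f{step}"
                subprocess.run(filt + [data, out], check=True)  # apply one prefilter
                data = out
            packed = data + ".packed"
            subprocess.run(COMPRESS + [data, packed], check=True)
            size = os.path.getsize(packed)
            if size < best[0]:
                best = (size, combo)
    print("smallest result:", best)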

  28. #88
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,497
    Thanks
    733
    Thanked 659 Times in 354 Posts
    For compressing large ISO images, if you have at least 2 GB, I recommend trying -m=precomp+exe+delta+lzma:1g:128:mc256 instead of -mx. REP isn't required in this case. This option works 5-10 times slower than -mx. You can try other values of :mc (512, 1024), which may improve compression.

  29. #89
    Member
    Join Date
    May 2008
    Location
    Antwerp , country:Belgium , W.Europe
    Posts
    487
    Thanks
    1
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by Bulat Ziganshin View Post
    For compressing large ISO images, if you have at least 2 GB, I recommend trying -m=precomp+exe+delta+lzma:1g:128:mc256 instead of -mx. REP isn't required in this case. This option works 5-10 times slower than -mx. You can try other values of :mc (512, 1024), which may improve compression.
    You didn't mention ECM; will it not bring anything together with these options?

    @SvenBent
    Anyway, it could be worth a try.

    arc a -mecm+precomp+exe+delta+lzma:1g:128:mc256 test youriso.iso
    Let us know your findings!

  30. #90
    Programmer Bulat Ziganshin's Avatar
    Join Date
    Mar 2007
    Location
    Uzbekistan
    Posts
    4,497
    Thanks
    733
    Thanked 659 Times in 354 Posts
    ok, your variant is better


