
Thread: mankind's century-old hurdle overcome: it was no illusion, but indeed the most difficult of all hurdles to solve

  1. #1
    Member
    Join Date
    Apr 2012
    Location
    London
    Posts
    239
    Thanks
    12
    Thanked 0 Times in 0 Posts

    mankind's century-old hurdle overcome: it was no illusion, but indeed the most difficult of all hurdles to solve

    ==================================================================================
    mankind's century-old hurdle overcome: it was no illusion, but indeed the most difficult of all hurdles to solve
    ==================================================================================

    URL : https://groups.google.com/forum/#!to...on/hsuH-I6Pjm4

    Gentlemen :

    I now have the world's very first tested, working random file compressor: it compresses files smaller and reconstructs them back exactly the same.

    This is a very, very slow first version, so you would need something like a 16-core (or more) Windows machine and let it run for around 48 hours!

    Post your interest here OR to LawCounsels aol com with brief background details, including a one-line statement that you agree to keep ALL disclosures in commercial confidence. The .exe will then be forwarded to your email address, and you may generate your own random inputs to test and verify.

    I am also looking to recruit a few experts in this area to complete Prototype 2, starting at US$30/hour and increasing; preferably with C# skills and conversant with lexicographic index ranking/unranking.

    Serious enquiries only : )

    Warm Regards,
    LawCounsels
    Last edited by LawCounsels; 6th June 2014 at 15:13.

  2. #2
    Member Bloax's Avatar
    Join Date
    Feb 2013
    Location
    Dreamland
    Posts
    52
    Thanks
    11
    Thanked 2 Times in 2 Posts
    It seems like a recurring pattern, doesn't it? Secrecy, weird writing - and magical compression.

  3. #3
    Member
    Join Date
    May 2012
    Location
    United States
    Posts
    323
    Thanks
    174
    Thanked 51 Times in 37 Posts
    Quote Originally Posted by Bloax View Post
    It seems like a recurring pattern, doesn't it? Secrecy, weird writing - and magical compression.
    My thoughts exactly.

  4. #4
    Member Bloax's Avatar
    Join Date
    Feb 2013
    Location
    Dreamland
    Posts
    52
    Thanks
    11
    Thanked 2 Times in 2 Posts
    Besides, if you've managed to make ~the ultimate compressor~ that is so glacially slow that it wouldn't be worthwhile to use it over the traditional - and much more polished - alternatives, all while kicking the door in and shouting "A CENTURY OLD DIGITAL RIDDLE HAS BEEN SOLVED!" - why wouldn't you share the executable?
    (By the way, couldn't BARF be modified to 'compress' down to a checksum of the original file and then reassemble the original file byte by byte? Sure, decompression would take centuries - but think of the bit savings!! :^) )

  5. #5
    Member ivan2k2's Avatar
    Join Date
    Nov 2012
    Location
    Russia
    Posts
    35
    Thanks
    13
    Thanked 6 Times in 3 Posts
    Quote Originally Posted by LawCounsels View Post
    a 16-core (or more) Windows machine and let it run for around 48 hours!
    Bitcoin mining?

    Yet another ultimate compressor the world will never see.

  6. #6
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    But actually BARF is very fast. It compresses the Calgary corpus files to 1 byte each in less than 1 second, then 0 bytes on the second pass. Decompression is bit exact and just as fast.
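
    For anyone wondering how a "compressor" can do that without violating any counting argument, here is a minimal sketch of the BARF-style trick, under the openly-cheating assumption that the decompressor may carry the corpus inside itself (the string contents below are placeholders, not the real files). The information doesn't vanish; it just moves from the archive into the program:

    Code:
    // barf_trick.cpp - sketch of how a "1-byte compressor" can cheat: the
    // decompressor ships with the known files built in, so the 1-byte
    // "archive" is merely an index into that table.
    #include <cstdio>
    #include <string>
    #include <vector>

    // Hypothetical stand-ins for the 14 Calgary corpus files.
    static const std::vector<std::string> corpus = {
        "contents of bib", "contents of book1", "contents of trans"};

    bool compress(const std::string& data, FILE* out) {
      for (size_t i = 0; i < corpus.size(); ++i)
        if (corpus[i] == data) {   // recognize a known file...
          fputc(int(i), out);      // ...and emit its index: 1 byte
          return true;
        }
      return false;                // any other input: no luck at all
    }

    void decompress(FILE* in, FILE* out) {
      int i = fgetc(in);           // read back the 1-byte index
      if (i >= 0 && i < (int)corpus.size())
        fwrite(corpus[i].data(), 1, corpus[i].size(), out);  // bit exact
    }

    int main() {
      FILE* f = fopen("archive", "wb");
      if (!f) return 1;
      compress(corpus[1], f);      // "compress" book1 to one byte
      fclose(f);
      f = fopen("archive", "rb");
      if (!f) return 1;
      decompress(f, stdout);       // reconstructs it perfectly
      fclose(f);
    }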

    I would have already put the whole internet on a thumb drive if it weren't for the fact that random data compression is already patented. http://gailly.net/05533051.html

    Is comp.compression still around? I stopped reading it years ago because of all the nonsense...

    Oh yeah, I did once post on comp.compression my proof that if random compression is possible then P = NP. I was going to use it to claim the Clay Millennium Prize, but of course bitcoin mining would be another application. The proof goes something like this: recursively compress any input in O(n) time, look up the compressed output in a fixed table of all possible computations, then decompress the output recursively in O(n) time.
    Last edited by Matt Mahoney; 6th June 2014 at 19:39.

  7. #7
    Programmer michael maniscalco's Avatar
    Join Date
    Apr 2007
    Location
    Boston, Massachusetts, USA
    Posts
    109
    Thanks
    7
    Thanked 80 Times in 25 Posts
    Quote Originally Posted by Matt Mahoney View Post
    Is comp.compression still around? I stopped reading it years ago because of all the nonsense...
    Sad thing too. If it weren't for encode.ru there would be no refuge from the nonsense either. And it's always the same pattern: long-winded, utterly nonsensical gibberish, with the OP replying to their own posts (often to the exclusion of all others), and typically sounding as if they have just gone off their medications and are heading into some manic phase.

    The theme is so common that I often wonder if it's the same person under different personas. Let's hope this one goes away and leaves encode.ru in peace.

  8. #8
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    I have several terabytes of random noise that I'd really love to compress so I can get it off my RAID. I wish these people wouldn't joke around.

  9. #9
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    There's a simple algorithm, actually. Interpret the input as a binary number and subtract one. You can compress recursively until the input is 0 (an empty file). To decompress, just add 1.

    The only problem is it's kind of slow. I'm working on a GPU accelerated version...

  10. #10
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    Here's my single-threaded version. LawCounsels, you owe me $5 for 10 minutes' work

    Code:
    // rcomp.cpp - random file compressor. Public domain.
    // The file is treated as one huge little-endian number (buf[0] is the
    // least significant byte): compressing subtracts 1, decompressing
    // adds 1. The length changes at the boundaries keep the mapping a
    // bijection between all files and all natural numbers.
    #include <stdio.h>
    int main(int argc, char** argv) {
      if (argc!=4)
        return printf("To compress|decompress: rcomp c|d input output\n"), 1;
      FILE* in=fopen(argv[2], "rb");
      if (!in) return perror(argv[2]), 1;
      FILE* out=fopen(argv[3], "wb");
      if (!out) return perror(argv[3]), 1;
      fseek(in, 0, SEEK_END);
      int i, n=ftell(in);  // input size
      rewind(in);
      unsigned char* buf=new unsigned char[n+1];  // data to be compressed
      fread(buf, 1, n, in);
      if (argv[1][0]=='c') {  // compress: subtract 1 with borrow
        if (n==0) return printf("Can't compress empty file\n"), 0;
        for (i=0; i<n; ++i) {
          if (buf[i]>0) {--buf[i]; break;}
          else buf[i]=255;  // borrow ripples into the next byte
        }
        if (i==n) --n;  // all zeros: n zero bytes -> n-1 bytes of 0xFF
      }
      else if (argv[1][0]=='d') {  // decompress: add 1 with carry
        for (i=0; i<n; ++i) {
          if (buf[i]<255) {++buf[i]; break;}
          else buf[i]=0;  // carry ripples into the next byte
        }
        if (i==n) buf[n++]=0;  // all 0xFFs: grow to n+1 zero bytes
      }
      return fwrite(buf, 1, n, out), 0;
    }
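
    For the record, a hypothetical session (file names are only examples). One pass subtracts just 1 from the file-as-number, so fully compressing an n-byte file down to nothing takes on the order of 256^n passes - which is the joke:

    Code:
    g++ rcomp.cpp -o rcomp
    ./rcomp c random.bin step1.bin    # file-as-number minus 1
    ./rcomp d step1.bin restored.bin  # restores random.bin exactly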

  11. #11
    Member
    Join Date
    Apr 2012
    Location
    London
    Posts
    239
    Thanks
    12
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by Bloax View Post
    Besides, if you've managed to make ~the ultimate compressor~ that is so glacially slow that it wouldn't be worthwhile to use it over the traditional - and much more polished - alternatives, all while kicking the door in and shouting "A CENTURY OLD DIGITAL RIDDLE HAS BEEN SOLVED!" - why wouldn't you share the executable?
    (By the way, couldn't BARF be modified to 'compress' down to a checksum of the original file and then reassemble the original file byte by byte? Sure, decompression would take centuries - but think of the bit savings!! :^) )
    OF IMPORTANCE IS THE SCIENTIFIC, THEORETICAL BREAKTHROUGH PROVEN HERE. This is only the tip of the iceberg (glacially slow, as you said), but it opens up entire new fields, and lightning-fast Prototype 2 / Prototype 3 will not be far off now. You need only forward your one-line sentence agreeing to confidentiality to verify the .exe yourself, yes, with any random input file of your very own. Warm Regards, LawCounsels

  12. #12
    Member
    Join Date
    Apr 2012
    Location
    London
    Posts
    239
    Thanks
    12
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by Matt Mahoney View Post
    There's a simple algorithm, actually. Interpret the input as a binary number and subtract one. You can compress recursively until the input is 0 (an empty file). To decompress, just add 1.

    The only problem is it's kind of slow. I'm working on a GPU accelerated version...
    OBVIOUSLY THIS WON'T WORK: you end up needing the same number of bits to record the total number of iterations; the result is likely always expanded by at least 1 bit. Like the many alchemists (charlatans?) throughout the ages who preceded the very recent table-top cold fusion breakthrough by Andrea Rossi and the verified accompanying 'transmutations' into valuable elements (vouched for by the US Navy, which purchased one of his prototypes, and by a few other corporate entities as well), the history of random compression has likewise been plagued by many, many people who put forth claims without putting up an .exe to prove them. THIS DIDN'T STOP ANDREA ROSSI from even starting his successful self-funded investigations, though through no fault of his own he was immediately tarnished and treated as one more in a long line of charlatan alchemists. Actually, some 7 years ago a group of Nobel prize winners came together on the board of a US corporation, with US$100M of publicly raised funds, to attempt this 'nut-cracking', and, not unexpectedly, they did not succeed. ALL that is required of you now is simply to forward a one-line sentence agreeing to confidentiality to have this first-ever verifiable .exe forwarded to you; you may verify it with any random input file of your own generation and THEN announce the success or failure to this forum. Warm regards, LawCounsels

  13. #13
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by Matt Mahoney View Post
    There's a simple algorithm, actually. Interpret the input as a binary number and subtract one. You can compress recursively until the input is 0 (an empty file). To decompress, just add 1.

    The only problem is it's kind of slow. I'm working on a GPU accelerated version...
    It's not hard to compress white noise arbitrarily small if done lossy:

    Throw away the original. To decompress, generate white noise.
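
    Since the algorithm fits in two sentences, here is the complete and equally serious implementation, assuming rand() counts as white noise for your purposes:

    Code:
    // noise.cpp - "lossy" white-noise codec: compress by discarding the
    // input entirely, decompress by synthesizing fresh noise on demand.
    #include <cstdio>
    #include <cstdlib>

    void compress(FILE*, FILE*) {}  // output: 0 bytes, every time

    void decompress(FILE* out, long n) {
      while (n-- > 0) fputc(rand() & 255, out);  // n bytes of "original"
    }

    int main() { decompress(stdout, 16); }  // "restore" a 16-byte file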
    Last edited by nburns; 7th June 2014 at 06:19.

  14. #14
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by Matt Mahoney View Post
    But actually BARF is very fast. It compresses the Calgary corpus files to 1 byte each in less than 1 second, then 0 bytes on the second pass. Decompression is bit exact and just as fast.

    I would have already put the whole internet on a thumb drive if it weren't for the fact that random data compression is already patented. http://gailly.net/05533051.html

    Is comp.compression still around? I stopped reading it years ago because of all the nonsense...

    Oh yeah, I did once post on comp.compression my proof that if random compression is possible then P = NP.
    Since the existence of random compression would invalidate the pigeonhole principle, I assume that by P = NP you mean Pigeon = No Pigeon. In other words, pigeons do not take up any space, and therefore they do not need holes.

    By my analysis, if Pigeon = No Pigeon, then the other famous P = NP is trivially true, because P = NP = O(1).
    Last edited by nburns; 7th June 2014 at 07:40.

  15. The Following User Says Thank You to nburns For This Useful Post:

    Matt Mahoney (7th June 2014)

  16. #15
    Member
    Join Date
    Apr 2012
    Location
    London
    Posts
    239
    Thanks
    12
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by LawCounsels View Post
    OBVIOUSLY THIS WON'T WORK: you end up needing the same number of bits to record the total number of iterations; the result is likely always expanded by at least 1 bit. Like the many alchemists (charlatans?) throughout the ages who preceded the very recent table-top cold fusion breakthrough by Andrea Rossi and the verified accompanying 'transmutations' into valuable elements (vouched for by the US Navy, which purchased one of his prototypes, and by a few other corporate entities as well), the history of random compression has likewise been plagued by many, many people who put forth claims without putting up an .exe to prove them. THIS DIDN'T STOP ANDREA ROSSI from even starting his successful self-funded investigations, though through no fault of his own he was immediately tarnished and treated as one more in a long line of charlatan alchemists. Actually, some 7 years ago a group of Nobel prize winners came together on the board of a US corporation, with US$100M of publicly raised funds, to attempt this 'nut-cracking', and, not unexpectedly, they did not succeed. ALL that is required of you now is simply to forward a one-line sentence agreeing to confidentiality to have this first-ever verifiable .exe forwarded to you; you may verify it with any random input file of your own generation and THEN announce the success or failure to this forum. Warm regards, LawCounsels
    THERE IS AN INTERESTING HISTORICAL ANECDOTE: some 30-40 years ago, Pons & Fleischmann observed and reported finding cold fusion, but had difficulties reliably repeating their tests (which found tremendously more excess heat produced than was put in), WHEREUPON they were hounded and marginalised by the entire establishment. Testing of Rossi's recent successful work fully explained how very small, unsuspected nano-scale imperfections in the electrolyte plate had caused Pons & Fleischmann's puzzling earlier inability to reliably replicate their results! At least Dr Fleischmann had the very late satisfaction of seeing Rossi's widely acclaimed success before he himself passed away recently. Warm Regards, LawCounsels

  17. #16
    Member
    Join Date
    Jan 2014
    Location
    Bothell, Washington, USA
    Posts
    685
    Thanks
    153
    Thanked 177 Times in 105 Posts
    Quote Originally Posted by LawCounsels View Post
    THERE IS AN INTERESTING HISTORICAL ANECDOTE: some 30-40 years ago, Pons & Fleischmann observed and reported finding cold fusion, but had difficulties reliably repeating their tests (which found tremendously more excess heat produced than was put in), WHEREUPON they were hounded and marginalised by the entire establishment. Testing of Rossi's recent successful work fully explained how very small, unsuspected nano-scale imperfections in the electrolyte plate had caused Pons & Fleischmann's puzzling earlier inability to reliably replicate their results! At least Dr Fleischmann had the very late satisfaction of seeing Rossi's widely acclaimed success before he himself passed away recently. Warm Regards, LawCounsels
    Dear LawCounsels,

    To prove your compressor works, I suggest you try it on your posts and then post only the result.

    We all know an ideal lossy compressor would output 0 bytes for your posts, so see if you can beat that.

    Have a nice day.

    Best Regards,

    KC
    Last edited by Kennon Conrad; 7th June 2014 at 08:10.

  18. The Following User Says Thank You to Kennon Conrad For This Useful Post:

    Bloax (7th June 2014)

  19. #17
    Member
    Join Date
    Apr 2012
    Location
    London
    Posts
    239
    Thanks
    12
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by Kennon Conrad View Post

    If we applied an ideal lossy compressor to your posts, the output would be empty. To prove your compressor, I suggest you try it on your posts. Have a nice day.


    KC
    .... You may envisage only a lossy solution as possible in your own private worldview; that is your private right .... THIS POST began with "... compresses smaller and reconstructs them back exactly the same ..." and has no place for unrelated 'lossy' techniques discussions.

  20. #18
    Member
    Join Date
    Jan 2014
    Location
    Bothell, Washington, USA
    Posts
    685
    Thanks
    153
    Thanked 177 Times in 105 Posts
    Quote Originally Posted by LawCounsels View Post
    .... You may envisage only a lossy solution as possible in your own private worldview; that is your private right .... THIS POST began with "... compresses smaller and reconstructs them back exactly the same ..." and has no place for unrelated 'lossy' techniques discussions.
    No, I generally look at what is best for the specific situation. In this case, lossy compression would be ideal.

  21. #19
    Member biject.bwts's Avatar
    Join Date
    Jun 2008
    Location
    texas
    Posts
    449
    Thanks
    23
    Thanked 14 Times in 10 Posts
    I did not want to reply to this thread, but I worked with Dr Miles at China Lake. He explained why the elite, stuck-up universities failed to get the results. Even at China Lake we failed at first. But Miles looked at why the poor Southern universities got clear results. When Miles got the results he was ordered by higher-ups to stop. But so-called cold fusion does produce excess energy. The US is stupid; I suspect the Chinese will corner the market in this field, but it is real.

    I thought a few years ago a prominent physicist, I think from MIT, was trying to get the mainstream physics community to get the government to invest some money into the research, but he was killed. There is some physics guy from somewhere talking about it on YouTube, so maybe there are some physicists not trying to suck on the so-called global-warming feeding trough who may reconsider it. But since they are afraid to be honest about global warming, it's unlikely they gave it a good look. Other than that I have nothing to say about this thread.

    https://www.youtube.com/watch?v=97ps7fTWOA8

    take care

  22. #20
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    The difference between cold fusion and random compression is that one has a nonzero probability of succeeding.

  23. #21
    Member
    Join Date
    Aug 2008
    Location
    Planet Earth
    Posts
    772
    Thanks
    63
    Thanked 270 Times in 190 Posts
    Congratulations after all those years of trying; I can't wait to test it, see my private forum message.

  24. #22
    Member
    Join Date
    Apr 2012
    Location
    London
    Posts
    239
    Thanks
    12
    Thanked 0 Times in 0 Posts
    Quote Originally Posted by Sportman View Post
    Congratulations after all those years of trying; I can't wait to test it, see my private forum message.
    Cheers! Good to hear from you again : ) I await your private message with the one-line confidentiality agreement, and will then send the .exe to your private email. Warm Regards, LawCounsels

  25. #23
    Member
    Join Date
    May 2008
    Location
    brazil
    Posts
    163
    Thanks
    0
    Thanked 3 Times in 3 Posts
    Is it possible to mix deterministic compression with random compression and data analysis?

  26. #24
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,471
    Thanks
    26
    Thanked 120 Times in 94 Posts
    You can throw away data at random and check if that satisfies you.

    PS:
    Or you can reorder letters in words and check if that improves compression: http://www.douglastwitchell.com/scrambled_words.php
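
    For anyone who wants to try it on real data, a quick sketch of the usual scrambling rule (first and last letter fixed, interior letters shuffled); whether that helps or hurts a given compressor is exactly what you'd have to measure:

    Code:
    // scramble.cpp - shuffle the interior letters of each word while
    // keeping the first and last letters in place, as in the linked demo.
    #include <algorithm>
    #include <cctype>
    #include <iostream>
    #include <random>
    #include <string>

    int main() {
      std::mt19937 rng(12345);  // fixed seed, so runs are repeatable
      std::string text = "compression is the art of modelling probabilities";
      size_t i = 0;
      while (i < text.size()) {
        if (!std::isalpha((unsigned char)text[i])) { ++i; continue; }
        size_t j = i;                        // [i, j) spans one word
        while (j < text.size() && std::isalpha((unsigned char)text[j])) ++j;
        if (j - i > 3)                       // need 2+ interior letters
          std::shuffle(text.begin() + i + 1, text.begin() + j - 1, rng);
        i = j;
      }
      std::cout << text << '\n';  // each word scrambled, ends intact
    }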
    Last edited by Piotr Tarsa; 9th June 2014 at 17:05.

  27. #25
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by Piotr Tarsa View Post
    Or you can reorder letters in words and check if that improves compression: http://www.douglastwitchell.com/scrambled_words.php
    I have seen that before. It raises the possibility of lossy text compression.

    Do you think people would mind reading Wikipedia with the letters scrambled?
    Last edited by nburns; 10th June 2014 at 03:20.

  28. #26
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by Matt Mahoney View Post
    The difference between cold fusion and random compression is that one has a nonzero probability of succeeding.
    I have a feeling that there's a nonzero chance that a glass of water at room temperature will spontaneously fuse. The probability is probably similar to the probability that a random file will consist entirely of zeroes, and therefore would be very easy to compress.
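
    For scale: the chance that an n-byte file comes out all zeroes is 256^-n, so for a 1 MB file that is roughly 10^-2,525,223. Back-of-the-envelope, but it makes "nonzero" do a lot of heavy lifting.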

  29. #27
    Member
    Join Date
    May 2008
    Location
    brazil
    Posts
    163
    Thanks
    0
    Thanked 3 Times in 3 Posts
    Quote Originally Posted by Piotr Tarsa View Post
    You can throw away data at random and check if that satisfies you.

    PS:
    Or you can reorder letters in words and check if that improves compression: http://www.douglastwitchell.com/scrambled_words.php
    Random algos?

    Is zpaq + barf a good way to compress better than cmix?

  31. #29
    Member
    Join Date
    Apr 2012
    Location
    Stuttgart
    Posts
    437
    Thanks
    1
    Thanked 96 Times in 57 Posts
    Folks,

    don't feed the troll. Whoever claims that "random data compression" is possible uses the words "random" or "compression" in a non-standard way.

    Yes, a binary iid process with p(0) != p(1) is compressible, but that's typically not what is meant by "random data". Or rather, had the OP been able to provide a scientific solution to the problem, he would have mentioned so. He did not, so all I can assume is a flat PDF, which is not compressible.

    Of course, in every random source you'll find subsequences or regions that look regular, just by pure chance. Can one compress just such subsets of data? Sure, but on average, the amount of data you need to annotate which regions are compressible and which are not eats up the compression gain. There's nothing new to prove here, it's all part of the Shannon result.

    Compression: Here, in this group, this typically means "lossless compression", thus the domain of the first Shannon theorem. There is also lossy compression, for which a rate-distortion bound exists. Again, as the OP did not say "lossy", one can only assume that this was not imposed.

    After all, this is a clear indication that even the simplest and most primitive aspects of "compression" and "randomness" have not been understood. There is really nothing else to say or to argue; the Shannon results are proven true, and thus apply, no matter what. If something else is meant (not "compression", not "random") then the OP is unable or unwilling to specify.

    Now think why.
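
    The bookkeeping behind that pigeonhole/Shannon argument is small enough to check by machine. A sketch that just counts: for every length n there are 2^n distinct inputs but only 2^n - 1 distinct strings of any shorter length, so no lossless coder can shrink them all:

    Code:
    // counting.cpp - the pigeonhole arithmetic behind the argument above:
    // 2^n strings have exactly n bits, but only 2^n - 1 strings have
    // length 0..n-1, so some n-bit input must map to >= n output bits.
    #include <cstdio>
    #include <cstdint>

    int main() {
      for (int n = 1; n <= 62; ++n) {
        uint64_t inputs  = 1ULL << n;   // strings of exactly n bits
        uint64_t shorter = inputs - 1;  // 2^0 + 2^1 + ... + 2^(n-1)
        // shorter < inputs for every n: any lossless "random compressor"
        // must send at least two inputs to the same output.
        printf("n=%2d  inputs=%llu  shorter strings=%llu\n",
               n, (unsigned long long)inputs, (unsigned long long)shorter);
      }
    }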

  32. #30
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by thorfdbg View Post
    Folks,

    don't feed the troll. Whoever claims that "random data compression" is possible uses the words "random" or "compression" in a non-standard way.
    I like how you started off with "don't feed the troll," then proceeded to feed the troll.
