
Thread: How do you backup?

  1. #1
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,471
    Thanks
    26
    Thanked 120 Times in 94 Posts

    How do you backup?

    How do you back up your precious data?

    Recently I damaged my 1 TB hard disk, so it's now unusable (I have yet to hand it to a data recovery lab to check whether the data can be recovered). Unfortunately, I had made almost no backups.

    I've had some slightly bad experiences with backup software. I tried using Ubuntu Cloud to host my data, but I soon discovered that when I update files many times in a row, the client starts introducing very long delays before uploading. The data I tried to keep in the cloud included, among other things, some of my projects that I was actively working on and frequently recompiling. I was also using some software that was supposed to keep a mirror on another disk. As far as I remember it didn't work automatically, and I wanted it to be bidirectional, i.e. the program should update the older copy automatically. Another problem was that I was getting weird read errors from the disk under some circumstances (e.g. enabling AHCI mode in the BIOS resulted in read errors). In the end the program required too much oversight and manual action for my lazy nature.

    A good backup program for me would:
    - be automated and easy to use,
    - be able to add Reed-Solomon codes to the backed-up data,
    - checksum the data thoroughly,
    - keep old copies of files in selected folders,
    - ignore selected folders, files, or particular filename/path patterns,
    - detect that I have moved some folders (e.g. by comparing checksums and file attributes),
    - work with different types of media and services at once, e.g. disk drives, pendrives, cloud storage, DVD-ROMs, etc.,
    - work on Ubuntu, Windows 8, Android, etc.,
    - be free or cheap.

    Probably no such program exists, and people use different backup strategies for different types of data.


    The question is:
    How do you back up your data? Do you use different strategies for different types of data (e.g. depending on how frequently they change)? Which programs and services do you use, and how? What do you find convenient and inconvenient about them?

  2. #2
    Member FatBit's Avatar
    Join Date
    Jan 2012
    Location
    Prague, CZ
    Posts
    189
    Thanks
    0
    Thanked 36 Times in 27 Posts
    I treat some of my PCs/drives as masters, and from those I copy all files to an external drive via tuned batch files based on Microsoft Robocopy. I prefer buying a bigger drive to compressing files. I do not use the cloud, etc.
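
    A minimal sketch of the kind of batch line I mean (the drive letters and switches here are only an illustration, not my actual setup):

    robocopy C:\Data X:\Backup\Data /MIR /COPY:DAT /R:2 /W:5 /LOG+:X:\Backup\robocopy.log

    /MIR mirrors the source (including deletions), so the external copy always matches the master.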

  3. #3
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    I use zpaq to back up to an external USB hard drive. I don't back up the OS, since zpaq only backs up regular files, not ACLs, the registry, devices, links, boot sectors, etc. To exclude directories like Windows and temp files that change frequently, I have a .bat file like this:

    zpaq add e:\backup c:\ -not c:\Users\Matt\AppData c:\Windows c:\ProgramData c:\tmp %*

    but you could do something similar in Linux. %* is so I can pass other options. It doesn't do error correction but I am thinking about adding this. I haven't tried Android but I suppose I can copy files from my phone to my PC using a micro SD card reader before doing the backup.

    I like that when I change a file, the backup will save the new version but also keep the old version. Also when I rename a directory, it uses deduplication so it will not take up any more space.
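
    As for the error correction, a simple external option (just a sketch, assuming the par2 command-line tool is installed; the archive name matches the .bat above) would be to rebuild recovery data for the archive after each run:

    rem regenerate ~10% recovery data for the whole archive after the backup
    del e:\backup*.par2
    par2 create -r10 e:\backup.par2 e:\backup.zpaq

    Since zpaq appends to the same archive, the recovery files have to be recreated from scratch each time, which is the main cost of doing it this way.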

  4. #4
    Member
    Join Date
    Aug 2013
    Location
    Greece
    Posts
    55
    Thanks
    37
    Thanked 21 Times in 15 Posts
    I do not use any "easy" backup software or cloud solutions for personal use. Since I am using Linux, I just use some bash scripting, crontab, rsync, zpaq and par2, and so far it works reasonably well for my data and requirements. It sometimes also needs some manual labor, which gets frustrating depending on the situation...

    Specifically:

    I have two computers in two locations that run almost 24/7 and are the main repositories of my personal data (OS: Ubuntu Linux; storage: zfsonlinux, 4-disk RAIDZ2). Both locations used to be connected (using pfSense) via site-to-site IPsec over the Internet, but since I no longer have static IPs I switched to OpenVPN and dyndns.org.

    I have two types of personal data:

    Low-volume, frequently changing files (mail, project repositories, personal documents, knowledge-base files, documentation, etc.):
    I use a nightly rsync script to sync files between both locations. Total size: ~50 GB; synced data per day: 10 MB - 300 MB, depending on how many changes I made.
    If during the day I stupidly lose some files, I can grab them from the remote computer (before it rsyncs).
    Also, after the rsync finishes, the script executes zpaq on both computers (via ssh) in order to archive the current version of the files, for historical reference or to restore to a previous version after an unpredictable/human error. After that it runs par2 on the resulting zpaq file to create recovery files (overkill, since the files are already on RAIDZ2). Before this I was using single disks with ext4, and the script would do a par2 check before adding to zpaq and report/log/email any error.
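
    In outline, the nightly script does something like this (a simplified sketch; host names, paths, and the redundancy level are placeholders, and logging/error handling is left out):

    #!/bin/bash
    # push today's changes in the low-volume set to the other site
    rsync -az /tank/data/ otherhost:/tank/data/
    # archive the current state on both machines for historical versions
    zpaq add /tank/archive/data.zpaq /tank/data
    ssh otherhost "zpaq add /tank/archive/data.zpaq /tank/data"
    # recreate ~10% par2 recovery data for the local archive
    rm -f /tank/archive/data*.par2
    par2 create -r10 /tank/archive/data.par2 /tank/archive/data.zpaq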

    High-volume personal data (photos, videos, audio, ...): total >1 TB. After I add new content or change/move files and/or folders, I run a manual script that rsyncs with an external 2 TB 2.5" USB ext4 HDD, and whenever I go to the remote location I carry it with me and rsync it with the computer there. I do no compression or integrity checking on these because it would take too much time. The script also copies/overwrites onto the USB disk the current zpaq+par2 files, in an alternate folder excluded from the automated rsyncs.

    Also, every month or two (if I remember), I sync an external disk that I keep at my parents' house, in case I catastrophically lose all data in both locations simultaneously.

    You could just use two external disks, rotating them day by day (in case one fails), doing a par2 check first, then zpaq add, and immediately afterwards par2 create.

    Since zpaq does not support inclusion/exclusion list files, I have used something like this (I cannot think of a .cmd equivalent for Windows right now):
    bash -c "zpaq -add archive.zpaq $(cat "include.txt" | tr "\\r\n" "\\n" | tr "\\n" " ") -not $(cat "exclude.txt" | tr "\\r\n" "\\n" | tr "\\n" " ")"
    (bash -c is required in order to maintain double quotes inside the txt files)

  5. #5
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    I've gone through a lot of hard disks since I started using computers, and sometimes I've gone to the trouble of making DVDs and trying to preserve things. But I never end up looking at what's on them. Most data is not worth the trouble of keeping. For the data that is, I make sure to spread it around to more than one place. At least one of those copies should be on a server that will stick around.

    For code, you can create a repo on github or google code. I copy things to Dropbox, and since I develop in a VM, I frequently make a copy outside the VM. Most of these copies become obsolete quickly. I've posted some stuff on here. Occasionally, I'll email things to myself, and then there's a copy on the mail server. In the end, all I care about is the stuff that works, and I only want one copy -- the latest. If I end up losing that, hopefully I'll at least find one that's close. The amount of code that I write that I consider truly precious is always just a small amount of data, so it's cheap to make copies.

    So my backup strategy doesn't look like much in action; it's more like a randomized algorithm. But randomized algorithms work well in practice. Pay attention to which tiny fraction of data is the really important part, and copy it everywhere.

    You might generate more copious amounts of code than I do, particularly if you have production-level projects that are ongoing. git performs checksums and tracks history, so having a git repo on something like github would accomplish a lot of your goals. Then, periodically, you might want to generate tarballs and copy them to your email account and maybe Dropbox. Use zpaq, too. If it's an important milestone, the more copies, the better.
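
    For the periodic snapshots, even a one-liner (a sketch; the path names are just an example) covers the tarball-to-Dropbox part:

    # date-stamped tarball of the project, dropped where the Dropbox client will pick it up
    tar czf ~/Dropbox/myproject-$(date +%Y%m%d).tar.gz -C ~/src myproject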
    Last edited by nburns; 5th September 2013 at 06:19.

  6. #6
    Member biject.bwts's Avatar
    Join Date
    Jun 2008
    Location
    texas
    Posts
    449
    Thanks
    23
    Thanked 14 Times in 10 Posts
    The best backups I had were when I worked for the government. They made someone else back up my stuff because I was incapable of saving it. I used to write code fast for many machines, but once I did it, it was someone else's problem. Since leaving government I have tried over the years to back stuff up, and I always seem to fail. In the old days I could just rewrite it a second time. My recent stuff is either here on this site or on Mark Nelson's old site. When I get the urge to write I sometimes download my old stuff and edit it. I don't plan to write code for more than 2 or 3 more years. I am looking for an interesting problem to think about, like bijective compression with a different method or another BWTS. Lately I think more about code than actually sitting down and writing it. I was hoping mankind would start moving to the far reaches of space before I pass on. But the US can't even launch its big Atlas rockets without Russian engines. The new middle ages are rapidly approaching; thank you, liberals, for destroying the future.

  7. #7
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by biject.bwts View Post
    I don't plan to write code for more than 2 or 3 more years. I am looking for an interesting problem to think about, like bijective compression with a different method or another BWTS. Lately I think more about code than actually sitting down and writing it.
    What else takes up your time that's better than working on code? I keep returning to the thought that there must be other bijective BWTs, maybe ones that are simpler and faster than the BWTS, but they always seem just out of reach. It would be useful if you found one that can be computed as quickly as the regular BWT.

    I was hoping mankind would start moving to the far reaches of space before I pass on. But the US can't even launch its big Atlas rockets without Russian engines. The new middle ages are rapidly approaching; thank you, liberals, for destroying the future.
    I'm afraid that there just aren't that many promising destinations for space travel after the moon. Space is just too big. I think we could probably get astronauts to Mars, but I'm less confident about the return trip. I think the astronauts would have to manufacture their own rocket fuel on Mars, which just sounds kind of crazy. After Mars, it only gets harder to find plausible targets for exploration.

    NASA doesn't have the budgets it once had (although I really appreciate the unmanned science missions they're doing), but other countries have space programs, and the private sector is getting involved. Some people think the private sector will far exceed NASA's accomplishments and take space exploration to the next level, so depending on how you look at it, these could be exciting times. Buzz Aldrin feels that way.

    The Hubble telescope is probably at least as significant as was the moon landing. I'll be happy as long as they keep doing science missions.
    Last edited by nburns; 5th September 2013 at 08:37.

  8. #8
    Member biject.bwts's Avatar
    Join Date
    Jun 2008
    Location
    texas
    Posts
    449
    Thanks
    23
    Thanked 14 Times in 10 Posts
    You must be a glass-half-full kind of guy, and it's good to have some of your kind around.

    I take care of my niece's infant; I change the diapers and watch Barney. My niece is in school and not married, which is common around here. I guess my view of the world has been colored in the last few years by how under-educated our children are, and by the fact that I see our country racing off a cliff. In some ways it's good that NASA does not have the same budgets it had before, since, like everything else, the government spoiled it with too many bureaucrats for it to function the way it should.

    The Hubble Telescope is a good example of the glass being half full and half empty. It cost way more than it should have. The mirror was ground to the wrong shape. People knew it was the wrong shape, but because of the bureaucrats in charge we launched crap. And then it was such an embarrassment that we had to spend millions to somewhat correct it, so that it was not just floating space junk. It would have been much, much better if the damn mirror had been made right the first time. Yes, it did advance science; I am just saying it would have been much better, since the corrective optics did not make it function nearly as well as it would have if it had been done right in the first place.

    I think my distaste for NASA peaked during the shuttle flights. Do you really think the bureaucrats cared about trying to prevent the accident where we burned up a few people on reentry? I think the glass is actually 99% empty.

  9. #9
    Member
    Join Date
    Jun 2009
    Location
    Kraków, Poland
    Posts
    1,471
    Thanks
    26
    Thanked 120 Times in 94 Posts
    Getting back on topic:
    ZPAQ indeed seems an interesting option. Shame on me for not having focused on it. I would need to pair it with MultiPar or something like that to suit my needs, however.

    One of the problems with backups is that backing up and working can't be done simultaneously without some sort of snapshot. File systems like ZFS and Btrfs support snapshots, and I probably need to look into them (unfortunately Btrfs isn't yet fully functional and stable, especially the RAID5/RAID6 support).
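
    A rough sketch of what I imagine that would look like with ZFS (the dataset and archive names are made up):

    # take a consistent snapshot, back it up from the snapshot directory, then drop it
    zfs snapshot tank/data@backup
    zpaq add /backups/data.zpaq /tank/data/.zfs/snapshot/backup
    zfs destroy tank/data@backup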

    Anyway, thanks for the opinions!

    BTW:
    I've recently watched a video that gives even more wisdom about how to secure our data: http://www.youtube.com/watch?v=U4oB28ksiIo (DEF CON)

  10. #10
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by Piotr Tarsa View Post
    One of the problems with backups is that backing up and working can't be done simultaneously without some sort of snapshot. File systems like ZFS and Btrfs support snapshots, and I probably need to look into them (unfortunately Btrfs isn't yet fully functional and stable, especially the RAID5/RAID6 support).
    Have you ever used git? git is basically how Linus solved 75% of these problems, in addition to the problems of having multiple people working on the same code. I don't use it much at home, but I used it at work for my last two jobs. The reason I don't use it much at home is that I start lots of little projects and abandon most of them, so I don't want to add any friction to the process of trial and error when something's just an experiment. But it's good once a project becomes viable and worth sharing.

    With git you would probably start a repository on something like github, and then work in your local copy and push changes whenever you're done or want a backup. git is really fastidious about tracking every change and making sure everything has a SHA-1 hash. The SHA-1 hash serves as the checksum. Linus wanted to make sure his code would be error-free at least until the end of the universe. Every cloned repo is a complete backup of the project, and whatever service you use, like github, will back up that repo for you.
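
    The day-to-day loop is tiny (a sketch; the remote URL is a placeholder):

    git init                                        # start tracking the project
    git remote add origin git@github.com:you/project.git
    git add -A && git commit -m "checkpoint"
    git push -u origin master                       # the remote clone is now a full backup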
    Last edited by nburns; 5th September 2013 at 21:30.

  11. #11
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by biject.bwts View Post
    You must be a glass-half-full kind of guy, and it's good to have some of your kind around.
    It depends.

    In some ways it's good that NASA does not have the same budgets it had before, since, like everything else, the government spoiled it with too many bureaucrats for it to function the way it should.
    NASA seems to be one of the most well-run government agencies ever. It got people safely to the moon multiple times, and even lately it pulled off an amazing feat by landing a science instrument as heavy as a truck on Mars in full working order. The Beagle 2 was only 33 kg, and it crashed on the way down, which just illustrates that landing on Mars isn't simple. NASA still gets a lot done, even with a budget that's half what it was in the 60s and most recently peaked 20 years ago. (http://en.wikipedia.org/wiki/Budget_of_NASA) Compare that to the DoD.

    The Hubble Telescope is a good example of the glass being half full and half empty. It cost way more than it should have. The mirror was ground to the wrong shape. People knew it was the wrong shape, but because of the bureaucrats in charge we launched crap. And then it was such an embarrassment that we had to spend millions to somewhat correct it, so that it was not just floating space junk. It would have been much, much better if the damn mirror had been made right the first time. Yes, it did advance science; I am just saying it would have been much better, since the corrective optics did not make it function nearly as well as it would have if it had been done right in the first place.
    Whatever it took to get it up and running, it's been spectacular for astronomy. They had to fix it in space, which is amazing, and after that, it worked great.

    I think my distaste for NASA peaked during the shuttle flights. Do you really think the bureaucrats cared about trying to prevent the accident where we burned up a few people on reentry? I think the glass is actually 99% empty.
    Launching rockets is kind of risky any way you look at it. The shuttle is said to have been a dangerous design from the beginning. You would like to have all the explosive rocket fuel behind you, and not have essential delicate parts like wings in the back where they can get hit with debris. Plus, the belief was that it would be cheap to reuse the vehicle, instead of building a new one each time. It didn't turn out that way. The fact that just getting into orbit requires such a delicate contraption, whose problems were never solved, tells me that colonizing the galaxy is probably a long way off, infinitely long.

  12. #12
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    Robots are getting smarter as their computer brains become more powerful. There is no reason to send humans into space any more. We sent humans to the moon in 1969 because computers were too primitive then. Now, all of the best space science comes from unmanned missions like the Hubble telescope and Mars landers. The international space station is a huge waste of money that has resulted in very little useful science, just a tourist destination if you have $35 million to spare.

  13. #13
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by Matt Mahoney View Post
    Robots are getting smarter as their computer brains become more powerful. There is no reason to send humans into space any more. We sent humans to the moon in 1969 because computers were too primitive then. Now, all of the best space science comes from unmanned missions like the Hubble telescope and Mars landers. The international space station is a huge waste of money that has resulted in very little useful science, just a tourist destination if you have $35 million to spare.
    I think the moon mission had more to do with symbolism than science. It was an inspiring achievement. I generally agree, except that we might need manned missions someday for something like the missions to fix the Hubble. If we stopped doing manned missions now, it would get harder to do one when we need to.

  14. #14
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    Fixing Hubble was probably the only time that a space shuttle mission was worthwhile. Not sure how long it will be before we can do it with robots.

    I think if people want to fly into space, they should pay their own way.

  15. #15
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by Matt Mahoney View Post
    Fixing Hubble was probably the only time that a space shuttle mission was worthwhile. Not sure how long it will be before we can do it with robots.

    I think if people want to fly into space, they should pay their own way.
    We also needed the shuttle to launch Hubble in the first place, IIRC; I have read that the shuttle had the greatest payload capacity of any vehicle, and Hubble was large and heavy.

  16. #16
    Member
    Join Date
    Jun 2013
    Location
    USA
    Posts
    98
    Thanks
    4
    Thanked 14 Times in 12 Posts
    I back up mainly to the cloud, since it tends to be more organized that way. There are security implications to this (the cloud provider has access to the plaintext), but it's fine for most of my things. In cases where I want or require ciphertext, TrueCrypt tends to work well.

    I also back up all over the place: GitHub, Google Music, Dropbox, MEGA, and a few others. I like MEGA a lot for their security, but their lack of syncing kind of sucks.

    As far as local backups are concerned, I use 7-Zip to compress all of the needed data and then store it on a hard drive. I'll probably switch to using zpaq for the journaling aspect of it. Well, when I need to...
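
    For reference, the 7-Zip step is basically a one-liner (the paths here are just an example):

    7z a -mx=9 E:\backups\docs.7z C:\Users\me\Documents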

  17. #17
    Member biject.bwts's Avatar
    Join Date
    Jun 2008
    Location
    texas
    Posts
    449
    Thanks
    23
    Thanked 14 Times in 10 Posts
    Quote Originally Posted by Mangix View Post
    I back up mainly to the cloud, since it tends to be more organized that way. There are security implications to this (the cloud provider has access to the plaintext), but it's fine for most of my things. In cases where I want or require ciphertext, TrueCrypt tends to work well.

    I also back up all over the place: GitHub, Google Music, Dropbox, MEGA, and a few others. I like MEGA a lot for their security, but their lack of syncing kind of sucks.

    As far as local backups are concerned, I use 7-Zip to compress all of the needed data and then store it on a hard drive. I'll probably switch to using zpaq for the journaling aspect of it. Well, when I need to...
    I don't believe that any commercial encryption program is secure from the three-letter agencies. The guy hiding in Russia leaked enough proof to show that. The government has a heavy behind-the-scenes hand in the encryption that is available, and it likely has software installed on most people's computers to keep track of such things. If you need to encrypt something, I would do it on another computer that is not connected to the net: create a DVD full of random data. That's the hard part. Then, when you want to encrypt, take the file to this other computer and XOR it with a portion of your random data (only you know the starting point), then do a BWTS, followed by another pass of XOR with the random data starting at a second spot only you know of. Repeat this as often as you like, but remember: you have to remember the starting offsets and the number of times you did a BWTS. This method will not change the length of the encrypted file, so if you want more security you could add parts of the random data to your file; just remember the offsets and starting points you chose during the encryption on the offline computer.
    At least this is my view of how to encrypt data securely in the modern world.

  18. #18
    Expert
    Matt Mahoney's Avatar
    Join Date
    May 2008
    Location
    Melbourne, Florida, USA
    Posts
    3,255
    Thanks
    306
    Thanked 778 Times in 485 Posts
    It's not that encryption is broken; otherwise the NSA wouldn't need to use backdoors and secret court orders to read encrypted internet traffic.
    http://www.newyorker.com/online/blog...ncryption.html
    http://www.newyorker.com/online/blog...r-secrets.html

  19. #19
    Member biject.bwts's Avatar
    Join Date
    Jun 2008
    Location
    texas
    Posts
    449
    Thanks
    23
    Thanked 14 Times in 10 Posts
    Quote Originally Posted by Matt Mahoney View Post
    It's not that encryption is broken; otherwise the NSA wouldn't need to use backdoors and secret court orders to read encrypted internet traffic.
    http://www.newyorker.com/online/blog...ncryption.html
    http://www.newyorker.com/online/blog...r-secrets.html
    The problem is that one does not know which encryption is broken and which is not. You can only trust yourself, and then only on a machine that is not connected to the net, and even then you can't be 100% safe.

    They use backdoors and such for instant access. But even if, say, PGP is broken and it takes a few hours to get at a message on a supercomputer, there are far too many messages on the net to look at them all. That's why they use backdoors: it's just too much work and time to attack every encrypted message, even if a known break exists.

  20. #20
    Member
    Join Date
    Jun 2013
    Location
    USA
    Posts
    98
    Thanks
    4
    Thanked 14 Times in 12 Posts
    I don't really believe that most open source stuff is backdoored in some way to give the NSA access. Certainly the NSA has been trying to get several bad algorithms standardized (Dual_EC_DRBG comes to mind), but I think most algorithms are fine. Certainly there are implementation issues in a lot of software (timing leaks in AES as a result of S-box use) and backdoors in several commercial applications, but the open source stuff looks fine for the most part.

    My post was more about scenarios in which encryption is not necessary (music or code); for cases where it is, MEGA seems to be the thing to use. It's open source (JavaScript) and content gets encrypted client-side instead of server-side, so no plaintext leaks.

    As far as client-side encryption goes, many things can be done to thoroughly scramble the plaintext into ciphertext. My main problem is that it's way too cumbersome, especially when you have to manage multiple keys for multiple things. That's one reason why I like approaches like MEGA's: the only way plaintext can be leaked is if they send backdoored JavaScript, which is somewhat difficult to do if you use the browser extension.

    One problem I have with the leaks is that the information so far has been pretty generic. Sure, some companies have been named (Microsoft, Google), but the stories are not specific about how the backdoors are implemented.

  21. #21
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by biject.bwts View Post
    I don't believe that any commercial encryption program is secure from the three-letter agencies. The guy hiding in Russia leaked enough proof to show that. The government has a heavy behind-the-scenes hand in the encryption that is available, and it likely has software installed on most people's computers to keep track of such things. If you need to encrypt something, I would do it on another computer that is not connected to the net: create a DVD full of random data. That's the hard part. Then, when you want to encrypt, take the file to this other computer and XOR it with a portion of your random data (only you know the starting point), then do a BWTS, followed by another pass of XOR with the random data starting at a second spot only you know of. Repeat this as often as you like, but remember: you have to remember the starting offsets and the number of times you did a BWTS. This method will not change the length of the encrypted file, so if you want more security you could add parts of the random data to your file; just remember the offsets and starting points you chose during the encryption on the offline computer.
    At least this is my view of how to encrypt data securely in the modern world.
    That's called a one-time pad. You generate random bits and then XOR your message with the random bits. You only need to do this once (assuming the random bits are 100% secure, such as from a natural source), and your data is impossible to retrieve without the one-time pad. The problem is that the key (the random bits) is as large as the data, and you need to send it securely to the party you're communicating with (such as by a trusted courier). It's perfect, but inefficient.

    You don't have to fear that any algorithm has backdoors in it, as long as you find the algorithm in the cryptology literature and implement it yourself. For instance, you could read about RSA and check the math yourself (in principle) and do a clean implementation of it. I haven't kept up on the developments, so maybe someone found a weakness in RSA, but the government doesn't have the ability to control all the academic cryptologists, so you can trust the literature.

    Blum-Blum-Shub is a secure random number generator that could be used for secret-key encryption. I had Lenore Blum as a professor for a class, and I think she's trustworthy. I didn't see her being followed by NSA agents.

  22. #22
    Member
    Join Date
    Jun 2013
    Location
    USA
    Posts
    98
    Thanks
    4
    Thanked 14 Times in 12 Posts
    Quote Originally Posted by nburns View Post
    That's called a one-time pad. You generate random bits and then XOR your message with the random bits. You only need to do this once (assuming the random bits are 100% secure, such as from a natural source), and your data is impossible to retrieve without the one-time pad. The problem is that the key (the random bits) is as large as the data, and you need to send it securely to the party you're communicating with (such as by a trusted courier). It's perfect, but inefficient.
    Or a stream cipher. The key difference is that a stream cipher is much easier to use as the pseudorandom stream is determined by the key and usually a nonce.

  23. #23
    Member
    Join Date
    Feb 2013
    Location
    San Diego
    Posts
    1,057
    Thanks
    54
    Thanked 71 Times in 55 Posts
    Quote Originally Posted by Mangix View Post
    Or a stream cipher. The key difference is that a stream cipher is much easier to use as the pseudorandom stream is determined by the key and usually a nonce.
    I think you're talking about a secret-key cipher. The key is smaller than the data and comes from a secret source. Then you compute some cryptographically-hard trapdoor function of that key to generate enough pseudorandomness to encrypt the whole message.

    David was talking about generating a whole DVD full of random bits and XORing it 1-to-1 with the data. That's known to give you perfect encryption, at the cost of being impractical.

    Edit: Ok, I see that stream cipher is a type of secret-key (or symmetric-key) encryption that sounds a lot like what David was describing. But using one of those algorithms requires trusting the algorithm.
    Last edited by nburns; 10th September 2013 at 04:05.

  24. #24
    Member
    Join Date
    Jun 2013
    Location
    USA
    Posts
    98
    Thanks
    4
    Thanked 14 Times in 12 Posts
    Even if you generate random data, you have to trust your source of randomness.

    It's all a matter of trust. Luckily, some algorithms are really simple to understand (like ChaCha's core permutation).

  25. #25
    Member
    Join Date
    Feb 2013
    Location
    ARGENTINA
    Posts
    81
    Thanks
    220
    Thanked 26 Times in 18 Posts
    I use Zpaq for my WIPs (to keep old versions) and WinRAR for individual folders (light software and final project versions).
    Sometimes I use Norton Ghost to back up entire drives and partitions.

    Backups are stored on DVD.

    Greetings.

  26. #26
    Member Karhunen's Avatar
    Join Date
    Dec 2011
    Location
    USA
    Posts
    91
    Thanks
    2
    Thanked 1 Time in 1 Post
    Does anybody use strarc from http://www.ltr-data.se ? I usually just use CloneZilla to back up the disk with zero compression, and use FreeArc.
