
Scripting compression (ZIP or TAR) to be done in chunks

I'm trying to find a utility or approach that will let me compress an entire directory into chunks. I know it's easy to specify, for example, that the archive files created should be exactly X size or smaller, but archive utilities usually split a single archive so that you need every piece in order to open it, and that's what I'm trying to avoid. I want to specify a maximum size for each archive file; files get added until the next one would exceed that size, at which point a new archive file is started. That way each archive file is independent of the others.

Asked by: Guest | Views: 169
Total answers/comments: 1
Guest [Entry]

"I'm assuming you are backing up a big directory onto a stack of CDs, and you want to be able to pull a file off a CD by sticking in 1 CD (rather than needing to put in 2 or more CDs out of the multi-CD archive).

Perhaps the simplest way to meet your requirement is to compress each file individually into its own little ".zip" file, and then copy those compressed files to the CDs.
(There's a way to store the sub-directory path that the original file came from inside the ".zip" file, so when you restore that file it gets put back into its proper location, even though all the ".zip" files sit in one long list in a single directory.)
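For example, a small Python sketch of that per-file approach might look like the following. The directory names "backup_src" and "zipped" are just placeholders; the point is that each file's relative path is stored inside its own ".zip" so extraction puts it back in the right place:

    import os
    import zipfile

    src_root = "backup_src"   # directory being backed up (placeholder name)
    out_dir = "zipped"        # where the one-file .zip archives go
    os.makedirs(out_dir, exist_ok=True)

    for dirpath, _dirnames, filenames in os.walk(src_root):
        for name in filenames:
            full_path = os.path.join(dirpath, name)
            # Keep the path relative to the source root; zipfile stores it
            # as the archive member name, so extraction recreates the tree.
            rel_path = os.path.relpath(full_path, src_root)
            # Flatten the path to get a unique archive name in one flat list.
            zip_name = rel_path.replace(os.sep, "_") + ".zip"
            with zipfile.ZipFile(os.path.join(out_dir, zip_name), "w",
                                 compression=zipfile.ZIP_DEFLATED) as zf:
                zf.write(full_path, arcname=rel_path)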

Once you have a list of zip files, you could start copying from the top of the list, and when the CD is full, eject and resume copying from that point on the list with the next CD.
That leaves a little "wasted space" at the end of each CD.
Some people, if they find a small file further down the list that fits in that space, will go back and slot it into that otherwise wasted space.
A few people obsessively attempt to re-arrange which file goes on which disk in order to pack them all as full as possible.
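A rough sketch of that "go back and fill the gaps" idea is first-fit packing. This assumes the one-file archives from the snippet above sit in "zipped" and a 700 MB CD-R; both are assumptions for illustration:

    import os

    CD_CAPACITY = 700 * 1024 * 1024   # assumed CD-R capacity in bytes
    out_dir = "zipped"

    # List each archive with its size, in directory order.
    archives = [(os.path.getsize(os.path.join(out_dir, f)), f)
                for f in sorted(os.listdir(out_dir))]

    discs = []  # each entry: [bytes_remaining, [archive names]]
    for size, name in archives:
        # First fit: place the archive on the first disc that still has
        # room for it, rather than always starting a new disc.
        for disc in discs:
            if size <= disc[0]:
                disc[0] -= size
                disc[1].append(name)
                break
        else:
            discs.append([CD_CAPACITY - size, [name]])

    for i, (remaining, names) in enumerate(discs, start=1):
        print(f"Disc {i}: {len(names)} archives, "
              f"{remaining // (1024 * 1024)} MB free")

Each printed "Disc" line is then one burn job, and every disc remains readable on its own.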

This approach -- independently compressing each file -- sacrifices some disk space in order to gain a little convenience.