Random-access archive for Unix use


I'm looking for a good format for archiving entire file-systems of old Linux computers.

TAR.GZ
The tar.gz format is great for archiving files with UNIX-style attributes, but because the compression is applied across the entire archive, the design precludes random access. If you want a file near the end of the archive, you have to start at the beginning and decompress everything (which could be several hundred GB) up to the point where the entry you're looking for begins.
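To make that concrete, here's a minimal sketch using Python's tarfile module; the archive path and member name are just placeholders:

```python
import tarfile

# Finding one member of a .tar.gz means decompressing the stream from the
# start: there is no index to jump to a given entry directly.
# "backup.tar.gz" and "home/user/notes.txt" are hypothetical names.
with tarfile.open("backup.tar.gz", "r:gz") as tar:
    member = tar.next()
    while member is not None and member.name != "home/user/notes.txt":
        member = tar.next()   # each call decompresses further into the stream

    if member is not None:
        data = tar.extractfile(member).read()
        print(len(data), "bytes, after scanning everything before this entry")
```

Even tarfile.getmember() does the same linear scan under the hood, so the cost is inherent to the format, not the tool.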

ZIP
Conversely, one selling point of the ZIP format is that it stores an index of the archive: filenames are stored separately, with pointers to where each entry's data is located within the archive. If I want to extract a file at the end, I look up the position of that file by name, seek to that location, and extract the data. However, ZIP doesn't store UNIX file attributes such as ownership, permissions, symbolic links, etc.
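For comparison, a sketch of the same single-file extraction with Python's zipfile module (again with placeholder names), which can seek straight to the entry via the index:

```python
import zipfile

# ZIP keeps a central directory at the end of the file, so one entry can be
# read without touching the rest of the archive.
# "backup.zip" and "home/user/notes.txt" are hypothetical names.
with zipfile.ZipFile("backup.zip") as zf:
    info = zf.getinfo("home/user/notes.txt")   # index lookup, no scanning
    data = zf.read(info)                        # seeks directly to the entry
    print(len(data), "bytes")

    # Roughly all the Unix metadata that survives: mode bits that the
    # archiver may have stashed in external_attr, plus a DOS-style timestamp.
    # No owner/group, and symlinks don't round-trip portably.
    mode = (info.external_attr >> 16) & 0o7777
    print(oct(mode), info.date_time)
```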

Other options?
I've tried using squashfs, but it's not really designed for this purpose. The file format is not consistent between versions, and building the archive takes a lot of time and space.

What other options might suit this purpose better?
