Concurrent modification during backup: rsync vs dump vs tar vs ?
Posted by pehrs on Server Fault on 2010-04-17
I have a Linux log server where multiple applications write data. Data is written in bursts, to a lot of different files. I need to make a backup of this mess, preferably preserving as much coherence between the file versions as possible and avoiding truncated files. The total amount of data on the server is about 100 GB. What I would really want (but can't do) is shut the system down, back it up cold, and then start it up again.
What kind of guarantees against concurrent modification do the various backup tools give? When do they "freeze" the file versions? I am looking at rsync, dump and tar at the moment, but I am open to other (open source) alternatives.
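For reference, something like the following is roughly what I have been experimenting with (paths and hostnames are placeholders). As far as I understand, neither tool freezes anything: rsync copies each file in a single pass, and GNU tar just warns if it notices a file changing while it reads it.

    # Placeholder source path and backup host; not a working setup, just what I have been trying.
    # rsync reads each file once during the transfer, so a file is only internally
    # consistent if nothing writes to it while that particular file is being copied:
    rsync -a --partial /var/log/apps/ backuphost:/backups/logserver/

    # GNU tar also reads each file once; it prints "file changed as we read it"
    # and exits non-zero if a file is modified under it, but the archive member
    # may still be truncated or mixed-version:
    tar -czf /backups/logserver-$(date +%F).tar.gz /var/log/apps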
Changing the applications or blocking writes during backups is sadly not an option. The system is not running LVM (yet), but I have considered rebuilding it on LVM and then using snapshots, roughly as sketched below.
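If I do rebuild on LVM, my (untested) plan would look something like this; the volume group and logical volume names (vg0, logs) are made up:

    # Hypothetical names; untested sketch of a snapshot-based backup.
    # Create a snapshot so the backup sees a single point-in-time view:
    lvcreate --snapshot --size 5G --name logs-snap /dev/vg0/logs

    # Mount the snapshot read-only and back up from the frozen view:
    mkdir -p /mnt/logs-snap
    mount -o ro /dev/vg0/logs-snap /mnt/logs-snap
    rsync -a /mnt/logs-snap/ backuphost:/backups/logserver/

    # Drop the snapshot once the backup has finished:
    umount /mnt/logs-snap
    lvremove -f /dev/vg0/logs-snap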