Remote file copy util (like rsync) but that will take account of data already copied (in this session)

Posted by Rory McCann on Server Fault
Published on 2010-01-21T16:40:55Z

Let's say I have a directory with two files that are identical and quite large (e.g. 2 GB each), and I want to rsync that directory to a remote host. As I understand it (and I could be wrong), rsync calculates checksums of files. Surely if it sees two files with the same checksum, it could copy the first file over the network and then do a local copy on the remote host for the second one? That would make it faster, no?
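To make it concrete, here is a rough sketch (in Python, with a placeholder host and paths) of the behaviour I mean: hash the local files, transfer each unique file once, and have the remote host create the duplicates with a plain local cp:

    #!/usr/bin/env python3
    # Rough sketch only: "user@remotehost" and the /data/big paths are
    # placeholders, not a real setup.
    import hashlib
    import subprocess
    from pathlib import Path

    SRC = Path("/data/big")        # local source directory (placeholder)
    REMOTE = "user@remotehost"     # placeholder remote host
    DEST = "/data/big"             # destination directory on the remote

    def sha256(path):
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Group the local files by content hash.
    by_hash = {}
    for p in SRC.iterdir():
        if p.is_file():
            by_hash.setdefault(sha256(p), []).append(p)

    for files in by_hash.values():
        first, *dupes = files
        # Send one copy of this content over the network...
        subprocess.run(["rsync", "-av", str(first), f"{REMOTE}:{DEST}/"], check=True)
        # ...and create the duplicates with a cheap local copy on the remote.
        for d in dupes:
            subprocess.run(
                ["ssh", REMOTE, "cp", f"{DEST}/{first.name}", f"{DEST}/{d.name}"],
                check=True,
            )

Obviously I'd rather not script this by hand for a whole tree; I'm asking whether rsync (or something like it) already does the equivalent.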

On a similar note, doesn't rsync hash all the remote files before copying? If it saw a different remote file with the same hash as a file that was about to be transferred, it could do a local copy on the remote host instead.
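Again only a sketch (same placeholder host and paths), the other half of the idea would be something like: ask the remote end for hashes of what it already has, and only go over the network when nothing matches:

    # Sketch only: satisfy a file from an existing remote copy when the
    # content matches. Host and paths are placeholders.
    import subprocess

    REMOTE = "user@remotehost"
    DEST = "/data/big"

    # Hash everything already on the remote (sha256sum is part of coreutils).
    out = subprocess.run(
        ["ssh", REMOTE, f"sha256sum {DEST}/* 2>/dev/null"],
        capture_output=True, text=True,
    ).stdout

    remote_by_hash = {}
    for line in out.splitlines():
        digest, path = line.split(maxsplit=1)
        remote_by_hash.setdefault(digest, path)

    def place(local_path, local_digest, name):
        """Local cp on the remote if the content is already there,
        otherwise fall back to a normal network transfer."""
        if local_digest in remote_by_hash:
            subprocess.run(
                ["ssh", REMOTE, "cp", remote_by_hash[local_digest], f"{DEST}/{name}"],
                check=True,
            )
        else:
            subprocess.run(
                ["rsync", "-av", local_path, f"{REMOTE}:{DEST}/{name}"],
                check=True,
            )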

Does rsync support this sort of thing? Is there some way to turn it on? Is there a tool similar to rsync that will do this sort of 'hash-based' local copy?
