Remote file copy util (like rsync) but that will take account of data already copied (in this session)
- by Rory McCann
Let's say I have a directory with 2 files that are identical and quite large (e.g. 2 GB each), and I want to rsync that directory to a remote host. As I understand it (and I could be wrong), rsync calculates checksums of files. Surely if it sees 2 files with the same checksum, it could just copy the first file and then do a local copy on the remote host for the 2nd file? That would make it faster, no?
On a similar note, doesn't rsync hash all the remote files before copying? If it saw a different file with the same hash as a file that was to be transferred, it could do a local copy on the remote host instead of transferring it.
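For clarity, here is roughly what I have in mind, as a minimal Python sketch. Everything in it is just an illustration (the `bigdir` paths, the SHA-256 choice, and the fact that it copies locally instead of over SSH are my own assumptions), not a claim about how rsync actually works:

```python
#!/usr/bin/env python3
"""Minimal sketch of the 'hash based' copy idea described above.

Assumption: this runs entirely on one machine and copies into a local
destination directory. A real tool would transfer the first instance of
each hash over the network and do the duplicate copies on the remote side.
"""
import hashlib
import shutil
from pathlib import Path


def file_digest(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def dedup_copy(src_dir: Path, dst_dir: Path) -> None:
    """Copy src_dir to dst_dir, 'transferring' each unique content only once.

    The first file with a given digest is copied from the source (standing in
    for the expensive network transfer); later files with the same digest are
    cloned from the already-written destination file (the cheap local copy).
    """
    dst_dir.mkdir(parents=True, exist_ok=True)
    seen: dict[str, Path] = {}  # digest -> destination path already written
    for src in sorted(p for p in src_dir.rglob("*") if p.is_file()):
        dst = dst_dir / src.relative_to(src_dir)
        dst.parent.mkdir(parents=True, exist_ok=True)
        digest = file_digest(src)
        if digest in seen:
            shutil.copyfile(seen[digest], dst)  # local copy on the "remote" side
        else:
            shutil.copyfile(src, dst)  # the one real transfer
            seen[digest] = dst


if __name__ == "__main__":
    dedup_copy(Path("bigdir"), Path("bigdir-copy"))  # hypothetical paths
```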
Does rsync support this sort of thing? Is there some way to turn it on? Or is there a tool similar to rsync that will do this sort of 'hash based' local copy?