I have a file server that is in charge of pulling a folder tree from multiple workstations on a daily basis. My current method is rsync, which works well as long as directory and file names stay the same. However, when files are renamed or moved between subdirectories, rsync copies them to the server under their new paths, leaving the old copies behind as duplicates.
I have to manually find and delete the extraneous files/folders left on the server by previous syncs. Note that I cannot use rsync's --delete flag, because any sync from a workstation would then mirror that workstation's folder tree onto the server instead of merging it.
Visual diagram:

Server:            Workstation1:     Workstation2:     Workstation(n):
Folder*            Folder*           Folder*           Folder*
 -subdir1           -subdir1          -subdir1          -subdir(n)
  -file1             -file1            -file2            -file(n)
  -file2
  -file(n)
Is there a simple script (preferably in bash, nothing fancy) that can delete the extraneous files/folders on the server when a file has been renamed or moved to a different subdirectory on a workstation?
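For reference, the closest I've gotten is a rough cleanup sketch like the one below. The function name `prune_extraneous` and both paths are placeholders I made up, and it only makes sense if each workstation owns its own disjoint subtree on the server:

```shell
# Rough sketch: after an rsync pull for one workstation, remove files under
# that workstation's server-side copy that no longer exist on the workstation.
# prune_extraneous and the directory layout are hypothetical.
prune_extraneous() {
    src=$1   # local view of the workstation tree (e.g. a mount or staging copy)
    srv=$2   # the server-side copy for that same workstation

    # Delete server files that have no counterpart on the workstation side.
    ( cd "$srv" && find . -type f ) | while IFS= read -r f; do
        [ -e "$src/$f" ] || rm -f -- "$srv/$f"
    done

    # Prune directories left empty by the deletions (keep the top level).
    find "$srv" -mindepth 1 -depth -type d -empty -delete
}
```

It would be called right after the rsync for that workstation, e.g. `prune_extraneous /mnt/ws1/Folder /srv/Folder/ws1` (paths invented). But this is exactly the kind of manual bookkeeping I was hoping to avoid.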
Is there a different program, much like rsync, that can accomplish this task autonomously and more simply? I have looked at unison, but I did not like that it keeps a local database for the sync info.
Any tips at all as to how I am supposed to tackle this?
Thank you in advance for your help.
EDIT:
I have now tried unison and I can safely say it is out of the question.
unison is a bi-directional synchronization tool, and in my testing it mirrored the files existing on the server back to all workstations. This is unwanted.
Preferably, I want files/folders to stay on their respective workstations and just merge onto the server. In other words, a uni-directional sync, but with renames/moves propagated to the server.
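As a half-measure while I look for a proper tool, I've considered spotting rename leftovers on the server by grouping files by content checksum: the old and new names of a renamed file show up as content-identical pairs. A sketch, where `list_dups` is a name I made up; it assumes GNU md5sum and filenames without spaces or newlines:

```shell
# List files whose content is identical to an earlier file in the sorted
# checksum listing; on my server these are usually rename leftovers.
# list_dups is hypothetical; assumes md5sum and simple filenames.
list_dups() {
    find "$1" -type f -exec md5sum {} + \
        | sort \
        | awk 'h == $1 { print $2 } { h = $1 }'
}
```

That still leaves me deciding by hand which copy is the stale one, so it doesn't really solve the problem.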
I might have to look into Git/Mercurial/Bazaar, as mentioned by kyle, but I am still unsure whether they are fit for the job.