How to avoid duplicates when copying files that have been renamed at the destination

Posted by Benoitt on Super User
Published on 2011-11-14T16:05:41Z

I have to fetch pictures (selected by extension) from a folder whose subfolders are updated automatically.

These files have to be copied into a folder where a PHP-based website will edit them (renaming them and creating an XML file) so they can be downloaded and included in an XML feed.

Because of the rename step in that script, when I run the copy again every file gets duplicated: the originals at the destination have already been renamed.

I've tried a few things with rsync, but I'm looking for something more powerful, because rsync alone can't compare the files against an external "history" of what has already been copied.

#!/bin/bash
# Copy every .jpg from the source tree to the web folder, keeping a backup
# of anything that would be overwritten, then trigger the PHP script.
find '/home/name/picture' -name '*.jpg' | while IFS= read -r FILE ; do
    rsync --backup --backup-dir=incremental --suffix=.old "$FILE" /var/www/media
done
wget --spider 'http://myscript.php'
#exit 0
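
To illustrate what I mean by an external "history": roughly something like the sketch below, where a list of checksums records what has already been copied, so a file that was renamed at the destination is not copied a second time. The manifest path and the use of md5sum are only assumptions for the example.

#!/bin/bash
# Sketch only: keep a manifest of checksums of already-copied files,
# so renames on the destination side don't cause duplicates.
MANIFEST=/var/www/media/.copied.list    # hypothetical location for the manifest
touch "$MANIFEST"
find '/home/name/picture' -name '*.jpg' | while IFS= read -r FILE ; do
    SUM=$(md5sum "$FILE" | cut -d' ' -f1)        # checksum survives a rename
    if ! grep -qx "$SUM" "$MANIFEST" ; then      # skip files copied on an earlier run
        cp "$FILE" /var/www/media/ && echo "$SUM" >> "$MANIFEST"
    fi
done
wget --spider 'http://myscript.php'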

PS: As a small addition, I'd like to replace each '.' with a space right after the *.jpg files are copied. My PHP script has trouble determining the extension when a filename contains extra dots. I'm thinking about a find command – like the one above – combined with sed. Is that a good idea? Something like the sketch below is what I have in mind.
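
A rough sketch only, assuming the renaming happens in /var/www/media and the files keep a plain .jpg extension:

find /var/www/media -maxdepth 1 -name '*.jpg' | while IFS= read -r FILE ; do
    dir=$(dirname "$FILE")
    base=$(basename "$FILE" .jpg)                             # name without the trailing .jpg
    mv "$FILE" "$dir/$(echo "$base" | sed 's/\./ /g').jpg"    # dots -> spaces, extension kept
done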
