Problem with script that excludes large files using Duplicity and Amazon S3

Posted by Jason on Super User
Published on 2011-01-13T21:48:49Z


I'm trying to write a backup script that will exclude files over a certain size.

If I run the script, duplicity gives an error. However, if I copy and paste the same command generated by the script, everything works.

Here is the script:


#!/bin/bash
# Export some ENV variables so you don't have to type anything
export AWS_ACCESS_KEY_ID="accesskey"
export AWS_SECRET_ACCESS_KEY="secretaccesskey"
export PASSPHRASE="password"

SOURCE=/home/
DEST=s3+http://s3bucket

GPG_KEY="gpgkey"

# exclude files over 100MB
exclude ()
{
 find /home/jason -size +100M \
 | while read FILE; do 
  echo -n " --exclude "
  echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g' #Replace whitespace with "\ "
 done
}

echo "Using Command"
echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

# Reset the ENV variables.
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export PASSPHRASE=

When the script is run I get this error:


Command line error: Expected 2 args, got 6
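For context, this "got 6" symptom is what unquoted command substitution produces: the quote characters that `exclude` prints are not re-parsed by the shell, and whitespace in the output splits into extra words. A minimal sketch (with a hypothetical pattern, not the original filenames) showing the mechanism:

```shell
#!/bin/bash
# Demonstration (hypothetical pattern text): quote characters emitted by a
# command substitution are NOT re-parsed by the shell -- they become literal
# characters in the arguments, and unquoted whitespace still splits words.
args () {
  echo -n " --exclude "
  echo -n \''**some file'\'
}

# Word splitting turns the output into three words:
#   --exclude   '**some   file'
set -- `args`
echo "argc=$#"   # prints argc=3 (not 2)
echo "arg2=$2"   # prints arg2='**some -- note the literal leading quote
```

Pasting the same text into an interactive shell behaves differently because there the shell *does* parse the quotes before building the argument list.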

Where am I going wrong?
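For comparison, one common way to build per-file options without this class of quoting problem is a bash array, where each `--exclude` and each pattern stays a single word even if the filename contains spaces. A sketch with hypothetical demo files (substitute the real `find /home/jason -size +100M`):

```shell
#!/bin/bash
# Sketch: collect duplicity arguments in an array so each element remains
# exactly one word. Hypothetical demo files stand in for the real find output.
tmp=$(mktemp -d)
touch "$tmp/big file.iso" "$tmp/huge.bin"

EXCLUDES=()
while IFS= read -r FILE; do
  # ${FILE##*/} strips the directory part, leaving the basename
  EXCLUDES+=( --exclude "**${FILE##*/}" )
done < <(find "$tmp" -type f)

# "${EXCLUDES[@]}" expands to one word per element, whitespace intact, e.g.:
#   duplicity "${EXCLUDES[@]}" "$SOURCE" "$DEST"
printf '%s\n' "${EXCLUDES[@]}"
rm -rf "$tmp"
```

No `sed` escaping is needed here, because the array expansion never goes through word splitting.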

© Super User or respective owner
