Search Results

Search found 698 results on 28 pages for 'rsync'.


  • Synology DS210j & online backup: what do people recommend?

    - by Dean
    I've just purchased a Synology DS210j for my home network and would like to back up this NAS online. I noticed that DiskStation Manager v2.3 provides various options, including Amazon S3 and rsync: Does anybody have real usage-versus-cost statistics for Amazon's S3 service? How is sensitive data protected on Amazon S3? Are there any rsync-based online backup options? If so, what do people recommend? UPDATE: I am still unable to find any decent answers to the above questions; can anybody help me out?
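
    For the rsync option, any host that accepts SSH logins can serve as the target; a minimal sketch (hostname, port and share paths are illustrative, not from the question):

        # push a NAS share to a remote rsync-over-SSH backup host
        rsync -az --delete -e "ssh -p 22" \
            /volume1/photo/ backupuser@backup.example.com:/backups/ds210j/photo/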

    Read the article

  • SSH - using keys works, but not in a script

    - by Garfonzo
    I'm kind of confused: I have set up public keys between two servers and it works great, sort of. It only works if I ssh manually from a terminal. When I put the ssh command into a Python script, it asks me for a password to log in. The script uses rsync to sync a directory from one server to the other. The manual ssh command that works (no password prompt, automatic login):

        ssh -p 1234 [email protected]

    In the Python script:

        rsync --ignore-existing --delete --stats --progress -rp -e "ssh -p 1234" [email protected]:/directory/ /other/directory/

    What gives? (Obviously, the ssh details are fake.)
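
    A common cause (an assumption, not confirmed in the thread): the script runs as a different user than the terminal session, so ssh looks in the wrong home directory for the key. Pointing ssh at the key explicitly rules that out; the key path is illustrative:

        rsync --ignore-existing --delete --stats --progress -rp \
            -e "ssh -p 1234 -i /home/youruser/.ssh/id_rsa" \
            [email protected]:/directory/ /other/directory/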

    Read the article

  • How to backup in Fedora-13?

    - by Ramy
    I just bought a 1.5 TB HDD and a disk enclosure. I connected the disk to my laptop via the provided USB cable, and then ran the following command:

        rsync -r -t -v --progress --delete -c -l -z / /media/C4E41A11E41A0678/Moonface_BKP/

    I let this run for a long (long, long) time, then noticed that the data already backed up to the HDD was itself being backed up. In other words, because the destination lives under /media on the source filesystem, rsync kept recursively copying its own output (since, I suppose, I was backing up the backup drive itself, too). So... what's the proper way to use rsync?
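
    A sketch of the usual fix: either tell rsync not to cross filesystem boundaries, or exclude the destination explicitly:

        # -x (--one-file-system) stays on the root filesystem, so /media,
        # /proc and /sys (separate mounts) are skipped automatically
        rsync -a -x -v --progress --delete / /media/C4E41A11E41A0678/Moonface_BKP/

        # or keep the original flags and fence off the backup target
        rsync -a -v --progress --delete --exclude=/media / /media/C4E41A11E41A0678/Moonface_BKP/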

    Read the article

  • secure synchronization of large amount of data

    - by goncalopp
    I need to automatically mirror a large amount (terabytes) of files between two Unix machines over a slow link (1 Mbps). This needs to be done frequently, but the data doesn't change too much (delta transmission doesn't saturate the link). The usual solution would be rsync, but there's an additional requirement: it's undesirable, from a security standpoint, for either the source or destination machine to hold (keyless) ssh keys to the other, or any kind of filesystem access. All communication between the two machines should thus be initiated (and mediated) through a third machine. I've asked a separate question about rsync in particular here. Are there other obvious solutions I'm missing?
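
    One shape such a relay can take (a sketch under the question's constraints; hostnames are illustrative): the third machine holds the only keys, pulls from the source, then pushes to the destination, so source and destination never hold credentials for each other:

        # on the relay machine, run periodically (cron):
        rsync -az --delete source.example.com:/data/ /relay-cache/data/
        rsync -az --delete /relay-cache/data/ dest.example.com:/data/

    The trade-offs: the relay needs enough disk to stage a full copy, and each change crosses the slow link twice unless the relay sits on the fast side of it.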

    Read the article

  • Best way to use my windows box as a backup?

    - by DFischer
    I put a 1.5 TB HD in my Windows 7 box, and my main computer is a MBP. I have a lot of professional files/folders on a FireWire 800 external HD connected to the MBP, and I want to use the 1.5 TB HD in the Windows 7 box as a backup for both the external HD and the MBP. Right now I am just copying files manually to the HD over the network, which is slow and open to failure (not rsync'd). Can anyone suggest some appropriate solutions? Should I just figure out how to set up rsync on the Windows box, or is there a better alternative? Thanks!
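
    One way to keep rsync end to end (a sketch; cwRsync is one of several rsync-for-Windows packagings, and the module name and paths below are made up): run an rsync daemon on the Windows box and push to it from the MBP:

        # rsyncd.conf on the Windows box (cwRsync service); the module maps
        # to the 1.5 TB drive:
        [backup]
            path = /cygdrive/e/backup
            read only = false

        # from the MBP, push both source trees at the daemon:
        rsync -az --delete /Volumes/FW800Drive/ windowsbox::backup/fw800/
        rsync -az --delete ~/Documents/ windowsbox::backup/mbp-docs/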

    Read the article

  • Crontab -e gives me error messages

    - by DNA
    I get a bunch of error messages when I run crontab -e. Here are the error messages. And here is my crontab file, under `/usr/bin/':

        # /etc/crontab: system-wide crontab
        # Unlike any other crontab you don't have to run the `crontab'
        # command to install the new version when you edit this file
        # and files in /etc/cron.d. These files also have username fields,
        # that none of the other crontabs do.
        SHELL=/bin/sh
        PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
        # m h dom mon dow user  command
        17 * * * *  root  cd / && run-parts --report /etc/cron.hourly
        25 6 * * *  root  test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily )
        47 6 * * 7  root  test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.weekly )
        52 6 1 * *  root  test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.monthly )
        30 * * * *  root  rsync /home/dnaneet/Downloads/*.pdf /home/dnaneet/Downloads/pdfs/
        #

    I notice that the last task ('rsync') NEVER RUNS! Why is this happening? What did I do wrong? Running Ubuntu 11.10/Bash. I have read this... Am I missing a shebang? And I don't know if my anacron jobs run.

    Edit 1: In light of Masi's comment, I commented out lines 17 through 25 of my crontab file with #. Now when I run sudo crontab -e, all I get is:

        /usr/bin/crontab: 11: 17: not found
        /usr/bin/crontab: 12: 25: not found
        (gedit:4301): Gtk-WARNING **: Attempting to store changes into `/root/.local/share/recently-used.xbel', but failed: Failed to create file '/root/.local/share/recently-used.xbel.GOHVBW': No such file or directory
        (gedit:4301): Gtk-WARNING **: Attempting to set the permissions of `/root/.local/share/recently-used.xbel', but failed: No such file or directory

    What in the world?
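
    A detail worth noting (a fact about cron, not an answer from the thread): the user field only exists in /etc/crontab and in files under /etc/cron.d. A per-user crontab installed with crontab -e takes no user column, so the rsync line there would be:

        # per-user crontab syntax: m h dom mon dow command (no user field)
        30 * * * * rsync /home/dnaneet/Downloads/*.pdf /home/dnaneet/Downloads/pdfs/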

    Read the article

  • Bash script for mysql backup - error handling

    - by Jure1873
    I'm trying to back up a bunch of MyISAM tables in a way that would allow me to rsync/rdiff the backup directory to a remote location. I've come up with a script that dumps only the recently changed tables and sets the date on each file so that rsync can pick up only the changed ones, but now I don't know how to do the error handling: I would like the script to exit with a non-zero value if there are errors. How could I do that?

        #!/bin/bash
        BKPDIR="/var/backups/db-mysql"
        mkdir -p $BKPDIR
        ERRORS=0
        FIELDS="TABLE_SCHEMA, TABLE_NAME, UPDATE_TIME"
        W_COND="UPDATE_TIME >= DATE_ADD(CURDATE(), INTERVAL -2 DAY) AND TABLE_SCHEMA<>'information_schema'"
        mysql --skip-column-names -e "SELECT $FIELDS FROM information_schema.tables WHERE $W_COND;" |
        while read db table tstamp; do
            echo "DB: $db: TABLE: $table: ($tstamp)"
            mysqldump $db $table | gzip > $BKPDIR/$db-$table.sql.gz
            touch -d "$tstamp" $BKPDIR/$db-$table.sql.gz
        done
        exit $ERRORS
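
    One sketch of the error handling (not from the thread): feed the loop through process substitution so it runs in the current shell, and read PIPESTATUS so a failing mysqldump isn't masked by a succeeding gzip. With the original pipe-into-while form, ERRORS lives in a subshell and its value is lost before the exit line:

        #!/bin/bash
        BKPDIR="/var/backups/db-mysql"
        mkdir -p "$BKPDIR"
        ERRORS=0
        FIELDS="TABLE_SCHEMA, TABLE_NAME, UPDATE_TIME"
        W_COND="UPDATE_TIME >= DATE_ADD(CURDATE(), INTERVAL -2 DAY) AND TABLE_SCHEMA<>'information_schema'"
        while read db table tstamp; do
            echo "DB: $db: TABLE: $table: ($tstamp)"
            mysqldump "$db" "$table" | gzip > "$BKPDIR/$db-$table.sql.gz"
            rc=( "${PIPESTATUS[@]}" )   # capture both exit codes before the next command resets them
            if [ "${rc[0]}" -ne 0 ] || [ "${rc[1]}" -ne 0 ]; then
                ERRORS=$((ERRORS + 1))
            fi
            touch -d "$tstamp" "$BKPDIR/$db-$table.sql.gz"
        done < <(mysql --skip-column-names -e "SELECT $FIELDS FROM information_schema.tables WHERE $W_COND;")
        exit $ERRORS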

    Read the article

  • Empty /var/log after running cron bash script

    - by Ortix92
    I wrote a little bash script, and all of a sudden my /var/log folder is completely empty except for the log I created for the bash script. This is the script I'm running every hour with cron:

        #!/bin/bash
        STL_DIR=/path/to/some/folder/i/hid
        LOGFILE=/var/log/stl_upload.log
        now=`date`
        echo "----------Start of Transmission----------" 2>&1 | tee -a $LOGFILE
        echo "Starting transfer at $now" 2>&1 | tee -a $LOGFILE
        rsync -av -e ssh $STL_DIR [email protected]:/users/path/folder 2>&1 | tee -a $LOGFILE
        echo "----------End of transmission----------" 2>&1 | tee -a $LOGFILE
        printf "\n" 2>&1 | tee -a $LOGFILE

    To be clear, I'm not 100% certain this is related to the empty logs folder. But if anyone could give me a pointer as to why /var/log might have been emptied, that'd be great.

    Read the article

  • HP-UX -> Linux incremental remote backup

    - by stack_zen
    Hi. I need to set up a differential backup process from a range of remote HP-UX machines to a central RHEL5 server. I'd happily go with rsync; the problem is that stock HP-UX 11.11 has no built-in rsync, and I don't have permission to install any software on the remote stock HP-UXes. How should I approach this? HP-UX provides:

        fbackup (HP-UX exclusive)
        cpio (available in RHEL5; backs up only the files which changed, but always grabs each changed file in its entirety)

    For example:

        ssh remote_user@remote_host 'find /u01/engine/logs/ -type f -name "*.log" | cpio -o | gzip -' | gunzip - | cpio -idmv

    These solutions don't really answer my incremental (bandwidth-efficiency) problem, do they?
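
    Since nothing can be installed on the HP-UX side, one sketch keeps cpio but narrows the file list with a timestamp marker, so each run only ships files changed since the previous one (marker path is illustrative; this is file-level incremental, not rsync-style delta transfer):

        # on the RHEL5 server: pull files newer than the marker left by the last run
        ssh remote_user@remote_host \
            'find /u01/engine/logs/ -type f -newer /var/tmp/.lastbkp | cpio -o | gzip -' \
            | gunzip - | cpio -idmv

        # advance the marker only after the copy succeeded
        ssh remote_user@remote_host 'touch /var/tmp/.lastbkp'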

    Read the article

  • How to auto-mount a copied encrypted home

    - by LedZ
    How can I auto-mount and use my encrypted home that I copied to another partition on the same hard disk? I'm running Ubuntu 11.10. My encrypted home is on sda1, where I have 2 users: userA and userB. Another partition, sda3, holds some other data. BTW, sda1 is formatted as EXT4 and sda3 as EXT3. I did the following: I logged out of the GUI (GNOME), switched to a console (Ctrl+Alt+F1), logged in, and became root (sudo -s). Then I:

        created a new mountpoint:      mkdir /mnt/tmp
        mounted /dev/sda3 on it:       mount /dev/sda3 /mnt/tmp
        copied my encrypted /home:     rsync -acvxASXH --progress --stats /home/ /mnt/tmp/

    After the copy I looked at my "new home" in /mnt/tmp and found the following 3 folders: userA, userB, .ecryptfs. The structure of /dev/sda3 mounted on /mnt/tmp looks like this (userB inside .ecryptfs not listed):

        userA
        userB
        .ecryptfs
          userA
            auto-mount
            auto-umount
            Private.mnt
            Private.sig
            wrapped-passphrase
            .wrapped-passphrase.recorded
            .Private
              (encrypted file_1)
              (encrypted file_2)
              (encrypted file_n)

    Now I would like this copy of the home directory to behave exactly like the original: auto-mounted at reboot, giving me access to my unencrypted files, with everything encrypted again after logout. Any suggestions?
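
    Two pieces that may help (a sketch, not a verified recipe): ecryptfs-utils ships ecryptfs-recover-private for mounting a copied .Private tree manually, which at least verifies the copy is intact; and for login-time auto-mounting to work from the copy, the partition holding it has to be mounted as /home so pam_ecryptfs finds ~/.ecryptfs where it expects it:

        # verify the copied encrypted data can still be mounted and read
        sudo ecryptfs-recover-private /mnt/tmp/.ecryptfs/userA/.Private

        # /etc/fstab entry making sda3 the new /home (a UUID is safer than the device name)
        /dev/sda3  /home  ext3  defaults  0  2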

    Read the article

  • Pervasive database backup

    - by Steven
    I'm looking for the best way to back up my Pervasive database. I've read the documentation but still have a few questions. It appears that the Continuous Operations method only allows me to back up the entire database? So I'd run butil -startbu @filelist, then back up the entire database (copy, rsync, etc.), then run butil -endbu @filelist. Looking through the documentation I don't see a way to get transaction logs out of this method, like I would do for MSSQL (BACKUP LOG ACCT TO DISK) or Postgres (archive_command). With rsync, it might still be feasible to do this every 15 minutes. The Archival Logging method means I would have to occasionally stop the database to get a full backup, which is acceptable for me. But can I copy the log files off of the server every 15 minutes, i.e. log shipping? Thank you.
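
    For the Continuous Operations route, the whole cycle can be wrapped in one script (a sketch built from the commands the question itself cites; the paths and the @filelist contents are assumptions):

        #!/bin/sh
        # put the listed data files into continuous-operations mode, so Pervasive
        # writes changes to delta files while we copy the stable originals
        butil -startbu @filelist || exit 1
        rsync -a /data/pervasive/ /backups/pervasive/
        # end continuous ops; Pervasive rolls the deltas back into the data files
        butil -endbu @filelist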

    Read the article

  • Best Practice: Migrating Email Boxes (maildir format)

    - by GruffTech
    So here's the situation. I've got about 20,000 maildir email accounts chewing up several hundred GB of space on our email server. Maildir by nature keeps thousands of tiny a** little files, instead of one .mbox file or the like... So I need to migrate several million files from one server to the other, for both space and life-cycle reasons. The conventional methods I would use all work just fine; rsync is the option that comes immediately to mind. However, I wanted to see if there are any other "better" options out there. Rsync not handling multi-threaded transfers sucks in this situation, because it never actually gets up to speed and saturates my network connection; because of this, the transfer from one server to another will take hours beyond hours, when it shouldn't really take more than one or two. I know this is highly opinionated and subjective and will therefore be marked community wiki.
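
    One way to get the concurrency rsync lacks (a sketch; it assumes one subdirectory per account, and the paths are illustrative): run several rsyncs at once, each taking a slice of the accounts:

        # four parallel rsync streams, one per top-level account directory;
        # -P4 is a starting guess, tune it until the link saturates
        ls /var/vmail/ | xargs -P4 -I{} \
            rsync -a /var/vmail/{}/ newserver:/var/vmail/{}/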

    Read the article

  • backuppc - how to backup remote (over the internet) clients?

    - by Scott
    I am testing out BackupPC, which works great so far backing up Windows clients on a LAN via SMB (no backup client/agent required). However, I have quite a few laptops and desktops in various remote locations, some of which move around. I need some way to have those remote computers create an outgoing connection for backup purposes (Windows XP/7). I know BackupPC supports SMB, rsync and tar, but I believe these are all connections going from the server TO the client. So, I either need a way to VPN the client on a timed basis, or, better still, the client could somehow connect to the server (ssh?) and initiate its own backup (rsync?). Of course this all needs to be pre-installed by me and require no maintenance by the end user, with no dialogs on their side. What do you think?
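
    One pattern that keeps the connection outbound while still letting the server drive the backup (a sketch; hostnames and ports are illustrative): the client opens a reverse SSH tunnel at boot, and the server backs the client up through it:

        # on the client (scheduled task at startup): forward server port 9873
        # back to the client's local rsync daemon on 873
        ssh -N -R 9873:localhost:873 tunnel@backuppc.example.com

        # the server then reaches this client's rsyncd via localhost:9873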

    Read the article

  • Backing up data (including mysqldumps) to S3

    - by seengee
    We have a web app on a number of servers, and we want to add an additional layer of redundancy by backing up the key data to S3. The key data is the MySQL database and a folder containing dynamically created site assets, predominantly images. Some kind of rsync-based solution would initially seem the best plan. A couple of years ago we played with s3cmd (in particular s3cmd sync) with some success, but we didn't find it particularly reliable, although this may have changed since. It's occurred to me, though, that an rsync solution might not work particularly well with a single db.sql file created with mysqldump, and I assume this means the whole database gets transferred each time; with multiple databases of over 1 GB this is going to add up to a lot of traffic (and $s) very quickly. With the image files I could simply transfer files modified within the last day, which would be far simpler. What approach should I look at?
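
    If the dumps are compressed, one mitigation (a sketch; note that --rsyncable is a Debian/Ubuntu patch to gzip, not universal) is to compress each database separately so that small row changes don't churn the whole archive:

        # one dump per database, compressed so local changes stay local in the output
        for db in $(mysql -N -e 'SHOW DATABASES' | grep -v information_schema); do
            mysqldump "$db" | gzip --rsyncable > "/backups/$db.sql.gz"
        done

    This only pays off when syncing to an rsync-capable staging host; uploads straight to S3 (s3cmd, etc.) always resend the whole changed file regardless.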

    Read the article

  • How to automatically take daily HDD backup?

    - by user13107
    I think my HDD might have crashed. I don't want to lose data if this happens again. I have a dual-boot Windows/Ubuntu system. What is the best way to back up data and other software settings in Ubuntu? I don't care much about the Windows partition; the important stuff is in Ubuntu. I have a 1 TB external HDD (the laptop HDD is 500 GB total). One way would be to run rsync every day (via cron job) to back up everything to the external HDD. What might be better ways of achieving this backup? Also, are instant backup programs recommended? Are there any disadvantages of instant backup as opposed to a daily rsync?
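
    The daily-rsync idea in concrete form (a sketch; the mountpoint and source list are assumptions), dropped into /etc/cron.daily so it runs once a day:

        #!/bin/sh
        # /etc/cron.daily/home-backup -- skip silently if the external disk
        # isn't mounted, otherwise mirror /home and /etc onto it
        mountpoint -q /media/external || exit 0
        rsync -a --delete /home/ /media/external/backup/home/
        rsync -a --delete /etc/  /media/external/backup/etc/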

    Read the article

  • Synchronizing 3 servers over IP

    - by user93078
    I'm setting up a medical server for a hospital that has doctors in 3 different locations, meaning there would be 3 servers (1 in each location). All 3 servers would run just the following software:

        Ubuntu Server 12.04 minimal
        MySQL, PHP 5, Apache
        the medical software, which reads/writes the MySQL database
        remote admin apps like Nagios & Webmin
        rsync for backup (rsync-over-ssh) as a cron job

    The doctors at each location would access patient & billing data from their respective servers. What I'd like is for all of these servers to hold synchronized data (especially the MySQL databases): say, on an hourly basis, each server synchronizes its data to a common remote server, and the data is then brought down to each of the others. I know an easier way would be to run the medical app on a remote web server, but since this is medical data we're talking about, and knowing how common it is in our area for the net to go down, I wouldn't like a web-based scenario. Is such a setup possible? Would this be the right way to do things, or is there a better way? I would really appreciate views and comments (or how to set this up).
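
    Rsyncing live MySQL data files between servers risks corruption and lost writes; the database-level tool for this job is replication. A sketch of the circular (multi-master) replication settings, with the auto-increment spacing that keeps three writers from generating colliding keys (values are illustrative; servers 2 and 3 would get offsets 2 and 3):

        # /etc/mysql/my.cnf on server 1 of 3
        [mysqld]
        server-id                = 1
        log-bin                  = mysql-bin
        auto_increment_increment = 3   # step by the number of masters
        auto_increment_offset    = 1   # unique per server: 1, 2, 3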

    Read the article

  • I cannot change the grub Default item from OS-1, but I can from OS-2 (dual-boot 10.04 on both)

    - by fred.bear
    My 10.04 system (OS-1) got into a tangle the other day, so I installed a second, dual-boot 10.04 (OS-2) so that I could troubleshoot the hung system... In case it is relevant to my question, I'll mention that since I got OS-1 working again, it has shown a few battle wounds from its ordeal (.. actually the ordeal was mine ... trying to figure it all out ;) ... I lost some custom settings, but not all. (For the curious: the hangup was caused by rsync writing 600 GB to OS-1's 320 GB drive. The destination drive was unmounted at the time, and rsync dutifully wrote directly to /media/usb_back, filling it to capacity... I have since amended my script :) Because the dual-boot MBR was prepared by OS-2, it is first on the grub list. However, I want OS-1 to be the default OS to boot. From OS-1, I tried two methods to change the grub menu's default OS: directly editing /etc/default/grub (then update-grub), and running 'Startup Manager' (then update-grub). Neither method had any effect... so I started OS-2 and tried the first method... It worked! Why can I not change the grub menu from OS-1? .. or if it can be done, how?
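
    The reason (a fact about grub2, not from the thread): update-grub rewrites /boot/grub/grub.cfg on the system it runs on, but the MBR here points at OS-2's grub, which only ever reads OS-2's grub.cfg; edits made on OS-1 land in a file nothing boots from. So the change belongs on OS-2 (the menu index below is illustrative and 0-based):

        # on OS-2, whose grub owns the MBR
        sudo sed -i 's/^GRUB_DEFAULT=.*/GRUB_DEFAULT=4/' /etc/default/grub
        sudo update-grub    # regenerates OS-2's /boot/grub/grub.cfg

    Alternatively, running sudo grub-install /dev/sda from OS-1 hands the MBR to OS-1's grub, after which OS-1's own settings take effect.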

    Read the article

  • how to compare files/directories of 2 separate solaris boxes ?

    - by chz
    Hi friends, I have 2 Solaris boxes, and I need to check certain directories (on the local filesystem and on mounted NFS) to make sure they match up on both boxes, and to delete or move the mismatches elsewhere on the local filesystem. I looked for Unix commands like rsync and tree, but it appears these are not supported on my Solaris boxes. What is the least painful approach to this problem? Using find, then diffing the outputs? I have trouble limiting the find command to certain directories, as there are mounted folders containing too many XML files that I don't care about. What's the find command to search multiple directory paths in a single find command? Thanks. Sincerely
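
    find takes any number of starting paths before the expression, and -prune can fence off the noisy directories (all names below are illustrative); combined with cksum, that produces a comparable listing from each box:

        # several start paths in one command, skipping the XML-heavy mount
        find /u01/app /u02/data /etc/conf -name xml_dump -prune -o -type f -print

        # checksum the survivors, sort, and diff the two boxes' lists
        find /u01/app /u02/data -type f -exec cksum {} \; | sort > /tmp/box1.sums
        # (run the same on box2, copy its /tmp/box2.sums over, then:)
        diff /tmp/box1.sums /tmp/box2.sums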

    Read the article

  • Any need to make backup of data on Amazon S3?

    - by Chrille
    I'm hosting 200 GB of product images on S3 (this is my primary file host). Do I need to back that data up somewhere else, or is S3 safe as it is? I have been experimenting with mounting the S3 bucket on an EC2 instance and then making a nightly rsync backup. The problem is that it's about 3 million files, so it takes a while to generate the file list that rsync needs; the backup actually takes about 3 days to complete. Any ideas how to do this better? (If it's even necessary?)
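
    If the goal is protection against accidental deletion or overwrites rather than S3 losing data, a bucket-to-bucket copy stays inside AWS and skips the slow filesystem walk entirely (a sketch; bucket names are illustrative, and it assumes the AWS CLI or a recent s3cmd that supports S3-to-S3 sync):

        # server-side sync into a second bucket; only changed keys are copied
        aws s3 sync s3://product-images s3://product-images-backup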

    Read the article

  • Windows file Sync

    - by Deane Venske
    So I have a big problem at the moment: finding a reliable solution for syncing 2 Windows IIS servers. I need to keep the web content mirrored on both. I have been trying to use rsync up to this point, but unfortunately file-permission errors are a nightmare to manage this way. I'm testing out Dropbox, but the performance sucks. I'm more familiar with Linux, and I've used rsync in the past, but isn't there a native Windows solution that will work?
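
    Windows does ship a native tool for this: robocopy can copy NTFS ACLs along with the data, which sidesteps the rsync permission problem (a sketch; paths and share names are illustrative):

        :: scheduled task on web1, mirroring content plus security info to web2
        :: /COPY:DATS = Data, Attributes, Timestamps, Security (ACLs)
        robocopy C:\inetpub\wwwroot \\web2\c$\inetpub\wwwroot /MIR /COPY:DATS /R:2 /W:5

    For continuous two-way sync, DFS Replication (built into Windows Server) is the other native option.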

    Read the article

  • SSH logins failing before success

    - by Vincent
    I am running Ubuntu 12.04 Server, updated, to run a webserver on Tomcat 7. I have about 1000 clients that very frequently use an rsync program to sync files with this server. Those rsyncs use SSH with a certain user to open connections on the server. The result is that my server is, as normal, full of connections by the same user: about 5 connections per second, every day, at any time. When I try to open a regular SSH connection with my PuTTY client, the connection fails before login, saying "Server unexpectedly closed network connection", for about 6 attempts out of 10; for the other 4 attempts out of 10 it works normally and I am able to log in as any user. Is there an overload of connections here? The server statistics are very calm, showing less than 40% network usage and less than 2% CPU. How can I improve this? Thank you for any help. V.
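
    The symptom fits sshd's MaxStartups throttle (an assumption worth checking, not a confirmed diagnosis): by default only 10 connections may sit in the unauthenticated state, and anything beyond that is dropped before login, exactly as described. Raising it is one line:

        # /etc/ssh/sshd_config -- allow more concurrent unauthenticated
        # connections: start dropping 30% at 60 pending, drop all at 100
        MaxStartups 60:30:100

        # then: sudo service ssh reload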

    Read the article

  • ssh login with multi-keys (several users) from the same workstation

    - by user1463152
    Basically, as the title says, I'd like to log in from my workstation to an SSH server using different accounts in different shells. What I want to do is back up all my accounts from this server to my hard drive using rsync, and as you know, rsync can only connect without a password if keys have been set up. I already have one account set up with a key for password-less login. What I did was generate a key on the server and then download it into my ~/.ssh folder. I tried the same with another account, changing the name of the key and downloading it into my ~/.ssh... but there's no way to get it to work. I am not an expert at this stuff. If you could provide any tips, or a way to set it up, I would really appreciate your help. Cheers
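
    The usual arrangement (a sketch; names are illustrative) is the other way around: generate one key pair per account on the workstation, install the public halves on the server, and let ~/.ssh/config pick the right key per account:

        # on the workstation, one key per server account
        ssh-keygen -t rsa -f ~/.ssh/id_rsa_acct2
        ssh-copy-id -i ~/.ssh/id_rsa_acct2.pub acct2@server.example.com

        # ~/.ssh/config
        Host server-acct2
            HostName server.example.com
            User acct2
            IdentityFile ~/.ssh/id_rsa_acct2

    rsync then works per account through the alias: rsync -a server-acct2:/home/acct2/ ~/backups/acct2/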

    Read the article

  • Non-interactive command-line FTP client alternative to weex

    - by Halfgaar
    Hi, I'm looking for a good non-interactive, command line FTP client to be run from a Rakefile. Like Weex, but better. Weex has different problems (for me): It stores its config file in my home dir. I want the FTP config to be part of my project and weex doesn't have a --config-file option or something. The behavior of ignoring files seems to be completely buggy. It doesn't remove files which it should, it doesn't let me specify relative paths, even though I do it according to the man page's instructions, etc. I've been struggling with it for an hour now and it is just completely inexplicable. I tried running rsync over FTPFS/FUSE, but that is dead slow because FTP doesn't store mtimes, which makes rsync diff every file. Plus, there are some refresh problems and other bugs that cause access failure (http://bugs.gentoo.org/208168). I'm stuck with FTP, unfortunately. Any help is appreciated.
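
    One candidate that fits the constraints (a sketch; lftp takes everything on the command line, so nothing has to live in a home-dir config, and the host/credentials below are illustrative):

        # reverse mirror (local -> remote) of only new/changed files, deleting
        # remote strays; runnable straight from a Rakefile shell-out
        lftp -u deployuser,secret ftp.example.com \
             -e "mirror --reverse --only-newer --delete ./public /htdocs; quit"

    Note that --only-newer compares against the server's listing times (i.e. upload times), so a file is re-sent only when the local copy changes after the last upload.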

    Read the article

  • Error installing pkgconfig via macports

    - by Greg K
    I installed MacPorts 1.8.2 from a DMG. That seemed to install fine. I ran sudo port selfupdate to make sure my ports tree was current. I then tried to install bindfs, as I want to mount some directories in my OS X file system (like you can do with mount --bind in Linux). pkgconfig and macfuse are two dependencies of bindfs. I had trouble installing bindfs due to errors installing pkgconfig, so I tried to install just pkgconfig. Here's the debug output from sudo port install pkgconfig:

        $ sudo port -d install pkgconfig
        DEBUG: Found port in file:///opt/local/var/macports/sources/rsync.macports.org/release/ports/devel/pkgconfig
        DEBUG: Changing to port directory: /opt/local/var/macports/sources/rsync.macports.org/release/ports/devel/pkgconfig
        DEBUG: OS Platform: darwin
        DEBUG: OS Version: 10.3.0
        DEBUG: Mac OS X Version: 10.6
        DEBUG: System Arch: i386
        DEBUG: setting option os.universal_supported to yes
        DEBUG: org.macports.load registered provides 'load', a pre-existing procedure. Target override will not be provided
        DEBUG: org.macports.unload registered provides 'unload', a pre-existing procedure. Target override will not be provided
        DEBUG: org.macports.distfiles registered provides 'distfiles', a pre-existing procedure. Target override will not be provided
        DEBUG: adding the default universal variant
        DEBUG: Reading variant descriptions from /opt/local/var/macports/sources/rsync.macports.org/release/ports/_resources/port1.0/variant_descriptions.conf
        DEBUG: Requested variant darwin is not provided by port pkgconfig.
        DEBUG: Requested variant i386 is not provided by port pkgconfig.
        DEBUG: Requested variant macosx is not provided by port pkgconfig.
        --->  Computing dependencies for pkgconfig
        DEBUG: Executing org.macports.main (pkgconfig)
        DEBUG: Skipping completed org.macports.fetch (pkgconfig)
        DEBUG: Skipping completed org.macports.checksum (pkgconfig)
        DEBUG: Skipping completed org.macports.extract (pkgconfig)
        DEBUG: Skipping completed org.macports.patch (pkgconfig)
        --->  Configuring pkgconfig
        DEBUG: Using compiler 'Mac OS X gcc 4.2'
        DEBUG: Executing org.macports.configure (pkgconfig)
        DEBUG: Environment: CFLAGS='-O2 -arch x86_64' CPPFLAGS='-I/opt/local/include' CXXFLAGS='-O2 -arch x86_64' MACOSX_DEPLOYMENT_TARGET='10.6' CXX='/usr/bin/g++-4.2' F90FLAGS='-O2 -m64' LDFLAGS='-L/opt/local/lib' OBJC='/usr/bin/gcc-4.2' FCFLAGS='-O2 -m64' INSTALL='/usr/bin/install -c' OBJCFLAGS='-O2 -arch x86_64' FFLAGS='-O2 -m64' CC='/usr/bin/gcc-4.2'
        DEBUG: Assembled command: 'cd "/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_ports_devel_pkgconfig/work/pkg-config-0.23" && ./configure --prefix=/opt/local --enable-indirect-deps --with-pc-path=/opt/local/lib/pkgconfig:/opt/local/share/pkgconfig'
        checking for a BSD-compatible install... /usr/bin/install -c
        checking whether build environment is sane... yes
        checking for gawk... no
        checking for mawk... no
        checking for nawk... no
        checking for awk... awk
        checking whether make sets $(MAKE)... no
        checking whether to enable maintainer-specific portions of Makefiles... no
        checking build system type... i386-apple-darwin10.3.0
        checking host system type... i386-apple-darwin10.3.0
        checking for style of include used by make... none
        checking for gcc... /usr/bin/gcc-4.2
        checking for C compiler default output file name...
        configure: error: C compiler cannot create executables
        See `config.log' for more details.
        Error: Target org.macports.configure returned: configure failure: shell command " cd "/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_ports_devel_pkgconfig/work/pkg-config-0.23" && ./configure --prefix=/opt/local --enable-indirect-deps --with-pc-path=/opt/local/lib/pkgconfig:/opt/local/share/pkgconfig " returned error 77
        DEBUG: Backtrace: configure failure: shell command " cd "/opt/local/var/macports/build/_opt_local_var_macports_sources_rsync.macports.org_release_ports_devel_pkgconfig/work/pkg-config-0.23" && ./configure --prefix=/opt/local --enable-indirect-deps --with-pc-path=/opt/local/lib/pkgconfig:/opt/local/share/pkgconfig " returned error 77
            while executing
        "$procedure $targetname"
        Warning: the following items did not execute (for pkgconfig): org.macports.activate org.macports.configure org.macports.build org.macports.destroot org.macports.install
        Error: Status 1 encountered during processing.

    I have only recently installed Xcode 3.2.2 (prior to installing MacPorts). Am I right in thinking the issue here is: configure: error: C compiler cannot create executables?

    Read the article
