Search Results

Search found 698 results on 28 pages for 'rsync'.

Page 6 of 28

  • Rsync push files from Linux to Windows. ssh issue - connection refused

    - by piyush c
    For some reason I want to run a script to move files from a Linux machine to Windows. I have installed cwRsync on my Windows machine and am able to connect to the Linux machine. I execute the following command:

        rsync -e "ssh -l "piyush"" -Wgovz --timeout 120 --delay-updates --remove-sent-files /usr/local/src/piyush/sync/* "[email protected]:/cygdrive/d/temp"

    where 10.0.0.60 is my Windows machine and I am running the above command on Linux (CentOS 5.5). After running the command I get the following error messages:

        ssh: connect to host 10.0.0.60 port 22: Connection refused
        rsync: connection unexpectedly closed (0 bytes received so far) [sender]
        rsync error: error in rsync protocol data stream (code 12) at io.c(463) [sender=2.6.8]
        [root@localhost sync]# ssh [email protected]
        ssh: connect to host 10.0.0.60 port 22: Connection refused

    I have modified my firewall settings on Windows to allow all ports. I think this issue is due to there being no SSH daemon on my Windows machine, so I tried installing OpenSSH and running ssh-agent, but that didn't help. I tried a similar command on my Windows machine to pull files from Linux and it works fine, but I want the command to run on the Linux machine so that I can embed it in a shell script. Can you suggest what I am missing? I already have cwRsync installed on Windows and running in daemon mode using the --daemon option, and I am able to log in over ssh from the Windows machine to the Linux machine. When I issue the command below, it just blocks for 120 seconds (the timeout I specified) and exits saying there is a timeout:

        rsync -e "ssh -l piyush" -Wgovz --timeout 120 --delay-updates --remove-sent-files /usr/local/src/piyush/sync/* "[email protected]:/cygdrive/d/temp"

    After starting rsync on Windows I checked that rsync is running, the Windows firewall settings are set to minimal, and on the Linux machine I stopped the iptables service so that port 873 (the default rsync port) is not blocked. What can be the possible reason that the Linux machine is not able to connect to the rsync daemon on the Windows machine?
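
    A minimal sketch of the daemon-mode invocation, assuming the cwRsync service's rsyncd.conf exports a module (named "temp" here purely for illustration) mapped to d:\temp. Daemon transfers use the host::module syntax on port 873 instead of spawning ssh:

        rsync -Wgovz --timeout 120 --delay-updates --remove-sent-files \
            /usr/local/src/piyush/sync/* piyush@10.0.0.60::temp/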

    Read the article

  • Mac OS X 10.5.8: need to save rsync password with ssh-copy-id

    - by Brady
    Hello all, I'll start by saying I'm very new to Macs but comfortable using the command line thanks to using Linux a lot. I currently have rsync set up to run between a Mac OS X 10.5.8 server and a Linux CentOS 5.5 server. This is the command I'm running on the Mac server:

        rsync -avhe ssh "/Path/To/Data" [email protected]:data/

    As it runs it prompts for a password, but I need it to save the password. From looking around, I need to use:

        ssh-keygen -t dsa

    save the key, and then move it over to the Linux server using:

        ssh-copy-id -i .ssh/id_dsa.pub [email protected]

    But ssh-copy-id doesn't seem to exist on the Mac server. How do I copy this key over? I've tried searching for the answer myself but the help seems to be all over the place for this. Any help is greatly appreciated. Scott
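
    A common substitute when ssh-copy-id is not installed (a sketch; it assumes the default key path from the question): append the public key over ssh by hand and tighten the permissions sshd expects:

        cat ~/.ssh/id_dsa.pub | ssh [email protected] \
            'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 700 ~/.ssh && chmod 600 ~/.ssh/authorized_keys'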

    Read the article

  • How to speed up rsync?

    - by Jakobud
    I'm running rsync to sync a directory onto my external USB HDD. It's about 150 gigs of data, 50,000+ files I would guess. It's running its first sync at the moment, but it's copying files at a rate of only 1-5 MB/s. That seems incredibly slow for a USB 2.0 enclosure, and there are no other transfers happening on the drive either. Here are the options I used:

        rsync -avz --progress /mysourcefolder /mytargetfolder

    I'm running Ubuntu Server 9.10.
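
    One thing worth testing (a guess, not a measured result): -z compresses data in transit, which only pays off over a network link; on a local copy to a USB drive it just burns CPU on both ends of the pipe. A leaner invocation would be:

        rsync -av --progress /mysourcefolder /mytargetfolder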

    Read the article

  • rsync over SSH with cron in an OS X environment

    - by Martin
    I want to automatically download files and folders from a Linux server on which I have an SSH (and FTP) account. The files should be downloaded on a regular basis (I suppose cron is the right tool to do so) onto an OS X machine. I tried the following rsync command, which works fine:

        rsync -avzb -e ssh [email protected]:/www/htdocs/something/somefolder /Users/me/folder/foo/

    However, I have to enter the account's password every time (the SSH account on the server machine). The server is a managed one and I'm afraid I can't change the password. Here are my questions: How do I bypass entering the password by storing it somewhere? And how do I then automate this correctly?
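
    A sketch of the usual answer, assuming the managed server allows public-key logins: generate a passphrase-less key, install it on the server, and point cron at the job.

        # generate a key without a passphrase and install it on the server
        ssh-keygen -t rsa -f ~/.ssh/id_rsa_backup -N ""
        cat ~/.ssh/id_rsa_backup.pub | ssh [email protected] 'cat >> ~/.ssh/authorized_keys'

        # hypothetical crontab entry (crontab -e): fetch every night at 02:30
        30 2 * * * rsync -avzb -e "ssh -i $HOME/.ssh/id_rsa_backup" [email protected]:/www/htdocs/something/somefolder /Users/me/folder/foo/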

    Read the article

  • Trying to script rsync using pam_exec

    - by Ricky-Rose
    I'm trying to write a bash script that will execute rsync when called by pam_exec. I've tried a couple of different ways and I'm not sure what I'm doing wrong. When I try to run the script at login by adding

        session optional pam_exec.so /usr/bin/local/sync.sh

    to my sshd PAM file, it gives me an exit code of 12. If I log in and then manually run my script, it allows me to connect to the remote server and it lists my files, but it doesn't actually sync anything. I have tried the code below using both $USER and $PAM_USER; $PAM_USER doesn't work at all.

        #!/bin/sh
        rsync -azv -e ssh $USER@remote_server:/home/html/$USER/ /home/html/$USER
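
    A hedged rework of the script: pam_exec passes the login name in the PAM_USER environment variable, while USER is typically not set yet at that point in the session, so falling back between the two (and quoting the expansions) may behave better both under PAM and when run by hand:

        #!/bin/sh
        # PAM_USER is set by pam_exec; USER is the fallback for manual runs
        user="${PAM_USER:-$USER}"
        rsync -az -e ssh "$user"@remote_server:/home/html/"$user"/ /home/html/"$user"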

    Read the article

  • PHP rsync with exec() not working

    - by mojeime
    Why does this:

        rsync -avz -e ssh /home/userneme/folder [email protected]:/var/www/folder

    work from a cronjob, while this:

        exec("rsync -avz -e ssh /home/userneme/folder [email protected]:/var/www/folder");

    doesn't work? I know exec is working because I have a few places in my app that do conversion from PDF to JPG with ImageMagick (exec). SOLVED: exec was working OK; it was a permission issue on the remote server. The "local" server is a shared reseller account and the remote server is my first VPS, an Ubuntu 10.10 LAMP box. If only I had a system administrator, since I'm just a software developer forced to do this and I stink at it :) Thank you all!
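
    A debugging step that would have surfaced this faster (a sketch; the web-server account name is an assumption, and shared hosts may not allow sudo): reproduce the exec() environment by running the same command as the PHP user and checking the exit code:

        # run as the account PHP executes under (name varies per host)
        sudo -u www-data rsync -avz -e ssh /home/userneme/folder [email protected]:/var/www/folder
        echo "rsync exit code: $?"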

    Read the article

  • Using rsync with link-dest from HFS to NTFS

    - by Tom
    Hi, I'm having a problem with rsync. I'm on a Mac and I'd like to sync my everyday changes from my HFS+ partition to my NTFS-formatted networked drive. Pretty simple, and everything goes well except that it re-syncs every file each time. Here's my script:

        #! /bin/sh

        snapshot_dir=/Volumes/USB_Storage/Backups
        snapshot_id=`date +%Y%m%d%H%M`

        /usr/bin/rsync -a \
          --verbose \
          --delete --delete-excluded \
          --human-readable --progress \
          --one-file-system \
          --partial \
          --modify-window=1 \
          --exclude-from=.backup_excludes \
          --link-dest ../current \
          /Users/tommybergeron/Desktop/Brainpad \
          $snapshot_dir/in-progress

        cd $snapshot_dir
        rm -rf $snapshot_id
        mv in-progress $snapshot_id
        rm -f current
        ln -s $snapshot_id $snapshot_dir/current

    Could someone help me out please? I've been searching for like two hours and I still am clueless. Thanks so much.
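
    A hedged guess at the cause: NTFS cannot store POSIX owner/group/permission bits, so the -a bundle (equivalent to -rlptgoD) makes every file look changed on each pass, which defeats --link-dest matching. Spelling out -a without -p, -o, and -g is worth a try:

        /usr/bin/rsync -rltD \
          --verbose --delete --delete-excluded \
          --human-readable --progress \
          --one-file-system --partial \
          --modify-window=1 \
          --exclude-from=.backup_excludes \
          --link-dest ../current \
          /Users/tommybergeron/Desktop/Brainpad \
          "$snapshot_dir/in-progress"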

    Read the article

  • rsync - how to set/keep directory permissions?

    - by Dylan
    I'm using cwRsync to connect from my Windows development machine to a Linux webserver:

        rsync -avuz -e ./ssh --exclude=".svn" /cygdrive/c/xampp/htdocs/project123/ [email protected]:/home/user123/public_html

    This syncs my development project directory to the server nicely and fast, but afterwards all directory permissions are reset to the local user 'user123' only, so the website is not available anymore, and I have to reset those permissions manually. Why is this happening, and how do I prevent it? PS: coming from a Windows environment, I'm having a really hard time understanding rsync. I copied the above command from some examples... I just need to get this one small thing working too...
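
    A common workaround (an assumption, not verified against this setup): don't carry Cygwin's permission bits over at all; instead force sane web-readable modes on what rsync creates:

        rsync -avuz --no-perms --chmod=Du=rwx,Dgo=rx,Fu=rw,Fgo=r \
            -e ./ssh --exclude=".svn" \
            /cygdrive/c/xampp/htdocs/project123/ [email protected]:/home/user123/public_html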

    Read the article

  • backing up an NTFS disk using rsync on Ubuntu

    - by user70366
    For a long time I was using Windows. I have a separate drive I use to keep copies of my media files, photos, etc. on, which I periodically back up to an external drive; in Windows I used SyncToy to do this. After my Windows stopped booting, I decided to switch to Linux (Ubuntu 10.10). That seems to be going fine, but now I want to back up my drive to the external drive like before. Mostly the two drives will already be the same, with maybe about 10 GB of extra files added. So I try to use rsync to synchronise the two drives like this:

        rsync --dry-run -rvlt --modify-window=1 /media/Antonio1TB/Backup /media/FREECOM\ HDD/Backup

    The problem is the dry run indicates that every file on the drive will be copied, not just the files I have recently added. What is the correct command to sync two NTFS drives under Ubuntu so that files that already exist don't get copied again? Thanks.
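
    One detail worth checking (a guess based on the command as posted): without a trailing slash, rsync copies the source directory itself, so this invocation would build Backup/Backup on the external drive and report everything as new. Trailing slashes make rsync compare the contents directly:

        rsync --dry-run -rvlt --modify-window=1 /media/Antonio1TB/Backup/ "/media/FREECOM HDD/Backup/"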

    Read the article

  • Rsync over ssh with root access on both sides

    - by Tim Abell
    Hi, I have one older Ubuntu server and one newer Debian server, and I am migrating data from the old one to the new one. I want to use rsync to transfer data across to make the final migration easier and quicker than the equivalent tar/scp/untar process. As an example, I want to sync the home folders one at a time to the new server. This requires root access at both ends, as not all files on the source side are world-readable and the destination has to be written with correct permissions into /home. I can't figure out how to give rsync root access on both sides. I've seen a few related questions, but none quite match what I'm trying to do. I have sudo set up and working on both servers.
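
    A standard pattern for this (a sketch; host and user names are placeholders, and it assumes the remote account may run rsync under sudo without a password prompt, e.g. via a NOPASSWD rule): run the local end as root and tell the remote end to elevate itself with --rsync-path:

        sudo rsync -aH -e ssh --rsync-path="sudo rsync" \
            adminuser@oldserver:/home/someuser/ /home/someuser/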

    Read the article

  • rsync remote to local automatic backup

    - by Mark Molina
    Because all my work is stored on a remote server, I would like to auto-backup my server monthly and weekly. My server is running CentOS 5.5, and while searching the web I found a tool named rsync. I got my first update manually by using this command in a terminal:

        sudo rsync -chavzP --stats USERNAME@IPADDRESS:PATH_TO_BACKUP LOCAL_PATH_TO_BACKUP

    I then enter the password for that user and Bob's my uncle. This backs up the necessary files from my remote server to my local device, but does somebody know how I can automate this? Like automatically running this script every Sunday? EDIT: I forgot to mention that I let DirectAdmin back up the files I need and then copy those files from the remote server to a local server.
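
    A sketch of the automation, assuming key-based ssh authentication is set up first so no password prompt blocks cron (placeholders carried over from the question):

        # crontab -e: run the pull every Sunday at 03:00
        0 3 * * 0 rsync -chavzP --stats USERNAME@IPADDRESS:PATH_TO_BACKUP LOCAL_PATH_TO_BACKUP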

    Read the article

  • Is rsync corrupting my RAR?

    - by Mark Henderson
    We have two QNAP devices, one in our datacentre and one off-site. We have hundreds of password-protected RAR files stored on the QNAP that contain virtual machine image snapshots, with approximately 20 of them being created each day. We synchronise the two devices using rsync, and it looks like all the files are being rsynced OK: they come over with the same file size, and all the files are present and accounted for. However, when I try to open the RAR files on the remote site, I get:

        Cannot open \\qnap01\FromDatacentre\Snapshots\DB001SQL1-20110626.rar

    I can open the RAR files on the local site just fine, so I assume that something is getting mangled during the rsync procedure. However, the older files (pre 2011-06-20) work just fine; it's something that's only started happening in the last week. There haven't been (as far as I know) any changes to any of the devices, setup or configuration in that time. Obviously something has changed, though. Where should I start investigating?
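
    A first diagnostic step (paths and host are placeholders, not the actual share layout): compare content hashes of one suspect file on both ends, then re-run the job with -c so rsync decides matches by checksum rather than by size and mtime:

        md5sum /share/Snapshots/DB001SQL1-20110626.rar   # run on each QNAP and compare
        rsync -av -c /share/Snapshots/ admin@remote-qnap:/share/FromDatacentre/Snapshots/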

    Read the article

  • rsync (or robocopy) from multiple computers - what happens?

    - by TheCleaner
    If a Linux server has two different nightly rsync jobs for the same folder to two different destinations, do both destinations end up with the same final set of files? Or does the first job run and set something on the source folder/files that would cause the second rsync job to not notice the daily changes/updates to the source? The same question applies to a Windows environment using something like robocopy, or even a "differential" backup using BUE or similar. Does each "sync" compare the destination to the source and update the destination regardless of whether it is synced multiple times to different destinations?

    Read the article

  • rsync invocation to replace symlinks pointing to source?

    - by bdbaddog
    Currently I'm moving a big filesystem to a new server, as the original fileserver is no longer able to handle the filesystem writes. To make this quick, I made symlinks at the target filesystem pointing to the original filesystem.

    Initially:
        /company/release (mountpoint of the original filesystem)

    After migration:
        /company/release.old (points to the original filesystem after the automount map update)
        /company/release (points to the new fileserver/filesystem after the automount map update)

    In /company/release there are symlinks like the following:
        /company/release/product-1.0.tar.gz -> /company/release.old/product-1.0.tar.gz
        /company/release/product-1.0 -> /company/release.old/product-1.0 (this is a tree of files)

    Using symlinks allowed me to move the writes to the new filesystem quickly. Now I'd like to slowly migrate the existing files and directories to the new filesystem. The problem I'm running into is that since the symlinks point back at the original files, rsync doesn't see any difference, and so it doesn't actually copy the file(s) or directory(s) and remove/overwrite the symlinks. Is there a set of rsync flags which will do what I want?

    Read the article

  • rsync generates much more traffic than expected

    - by user109459
    I use rsync for backing up one of my servers, which holds 4 GB of files. When I transfer these files, the traffic isn't the expected 4 GB; it is a lot higher, about 60 GB. I checked the traffic on my server, my backup server, and my router, and all three say that there was 60 GB of traffic, but at the end rsync says that it has only transferred 4 GB. Another problem is that I can't debug it, because the problem occurs randomly.
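
    A diagnostic sketch (paths and host are placeholders): --stats plus a log file record how much data each run and each file actually moved, which helps spot files that get resent in full on the runs where the traffic balloons:

        rsync -avz --stats --log-file=/tmp/rsync-backup.log \
            /data/ backup@backuphost:/backups/data/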

    Read the article

  • How to rsync just the current folder?

    - by Michael
    I want to sync some files from a remote server to my local computer. How can I make rsync copy only the files with a certain file extension in the directory, but no subdirectories? I assumed this would be an easy task, but embarrassingly I haven't gotten it to work in nearly two hours. Could someone give me an example? I did various experiments with something like the following command:

        rsync -a --include=what? --exclude=what? -e ssh [email protected]:/test /test
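
    One pattern that does this (the .pdf extension is an assumption standing in for "a certain file extension"): include the wanted files, then exclude everything else. Because the catch-all exclude also matches directories, rsync never descends into subdirectories:

        rsync -av -e ssh --include='*.pdf' --exclude='*' [email protected]:/test/ /test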

    Read the article

  • rsync not copying hard links

    - by A.Ellett
    I have two computers (both MacBook Airs) for which I sync one directory tree on both, but not the entire hard drive or any other directories. Say on computer A the directory is /Users/aellett/projects, and on computer B the directory is /Users/bellett/projects. Generally, I'll log into computer B and then remotely connect to computer A as user 'aellett'. As superuser I sync the two project directories as follows:

        rsync -av /Volumes/aellett/projects/ /Users/bellett/projects/

    and this works as expected. On both computers I have another file, letter.txt, in a different directory which is not getting synced. Say on computer A the file is found in /Users/aellett/letters, and on computer B it is found in /Users/bellett/correspondence. Generally, I don't want to share what's not included in /Users/<username>/projects, but I do want to share this particular file. So on both computers I made a correspondence directory in projects and then made hard links as follows.

    On computer A:

        ln /Users/aellett/letters/letter.txt /Users/aellett/projects/correspondence/letter.txt

    On computer B:

        ln /Users/bellett/correspondence/letter.txt /Users/aellett/projects/correspondence/letter.txt

    The next time I synced the two computers, I did the following:

        rsync -av -H /Volumes/aellett/projects/ /Users/bellett/projects/

    When I checked on computer B, /Users/bellett/projects/correspondence/letter.txt was correctly synced, but the hard link to /Users/bellett/correspondence/letter.txt was no longer there. In other words, /Users/bellett/projects/correspondence/letter.txt was identical to /Users/aellett/projects/correspondence/letter.txt, but it differed from /Users/bellett/correspondence/letter.txt. Since these two files were hard-linked on both computers, I expected them to still have the hard link. Why are my hard links not being preserved?

    Read the article

  • Using rsync to migrate files between Windows servers: problems with Excel spreadsheets

    - by HorusKol
    We've been migrating about 220 GB of data from a Windows 2003 server to a Windows 2008 server, and because of the time it would take to copy that data and the necessity of keeping it available for users, I came up with the idea of using rsync on an Ubuntu server to broker the migration. (I might have gone for a proper Windows solution, but the applications I found were a bit pricey for a one-shot like this, and permissions are not a problem.) All well and good, and today I'm making the last sync and confirming that the new server is up-to-date using diff, but I've noticed an odd thing with Excel spreadsheets (.xls). Every instance of an Excel spreadsheet that has already been copied in a previous synchronisation is being marked as "already up-to-date" by rsync. However, when I then run a diff, I'm told that the files differ. I'm manually copying them, as there are but a handful, but I was wondering what might be causing this. No other filetype in the entire 220 GB tree has had any problem like this, just the Excel/.xls files. It'd be great if someone could come up with an explanation.
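
    A hedged workaround (mount paths are placeholders): rsync's quick check trusts size and modification time, which Excel can defeat when it rewrites a file's contents without changing either. Forcing a content comparison on just the .xls files resends any that really differ:

        rsync -av --checksum --include='*/' --include='*.xls' --exclude='*' \
            /mnt/old-server/ /mnt/new-server/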

    Read the article

  • Copying a large directory tree locally? cp or rsync?

    - by Rory
    I have to copy a large directory tree, about 1.8 TB. It's all local. Out of habit I'd use rsync, but I wonder if there's much point, and whether I should rather use cp. I'm worried about permissions and uid/gid, since they have to be preserved in the copy (I know rsync does this), as well as things like symlinks. The destination is empty, so I don't have to worry about conditionally updating some files. It's all local disk access, so I don't have to worry about ssh or the network. The reason I'd be tempted away from rsync is that rsync checksums files. I don't need that, and am concerned that it might take longer than cp. So what do you reckon, rsync or cp?
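
    For reference, both of these preserve permissions, ownership, timestamps, and symlinks on an empty destination (a sketch with placeholder paths; note that rsync disables its delta machinery by default for local copies anyway, and -W makes that explicit):

        cp -a /source/tree /destination/
        rsync -aH -W --progress /source/tree/ /destination/tree/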

    Read the article

  • rsync unicode filename error

    - by Mirage
    I am getting this error while using rsync:

        Could not convert filename to Unicode: 'H20 dinkus_.pdf': Invalid or incomplete multibyte or wide character
        Could not convert filename to Unicode: 'ANT0012 H20 Brochure_OFFSET_paths_.pdf': Invalid or incomplete multibyte or wide character
        ntfs_mst_post_read_fixup: magic: 0x00000000 size: 1024 usa_ofs: 0 usa_count: 65535: Invalid argument

    What should I do?
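
    One direction to try (assumptions: an rsync 3.x built with iconv support, and filenames stored in a charset other than the system's; the charset names and device below are guesses for illustration): translate filenames during the transfer. The last error line looks like ntfs-3g reporting a damaged NTFS record, so a filesystem check is worth running as well:

        rsync -a --iconv=iso-8859-1,utf-8 /source/ user@remote:/backup/
        ntfsfix /dev/sdb1    # device is a placeholder; unmount and back up first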

    Read the article

  • SVN Backup using rsync command

    - by user37143
    Hi, I set up a cron job to take a backup of my SVN repo (8 GB) to another server, but sometimes I get errors, and I feel that this is not the proper way to back up an SVN repo to a remote server. I used the command rsync -avz myrepo. Please suggest a good way to do SVN backups to a remote server. I cannot zip the files and transfer them daily since it's 7 GB. Thanks
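
    A common pattern (repo paths and hosts are placeholders): take a consistent snapshot with svnadmin hotcopy so rsync never ships a repository mid-commit, then let rsync transfer only the deltas:

        svnadmin hotcopy /var/svn/myrepo /var/svn/myrepo-hotcopy
        rsync -avz --delete /var/svn/myrepo-hotcopy/ backupuser@backuphost:/backups/myrepo/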

    Read the article

  • Linux: Dropbox alternative with sftp/rsync/... access

    - by Daniel
    Currently on my VPS I'm running a couple of sites and the Dropbox daemon to store backups. The problem is that the VPS has 512 MB of RAM, which was enough until the new version of WordPress (I don't know why, but now it consumes more memory), so I have a really bad choice: either stop the Dropbox daemon and my backups, or buy more memory (not that expensive, but still). So I'm looking for some way to rsync data into Dropbox or a similar service, or to figure out how to make Dropbox consume less memory. Any ideas?
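
    A low-memory substitute (a sketch, not a Dropbox integration; host and paths are placeholders): push the backups to any ssh-reachable box from cron and drop the resident daemon entirely:

        # crontab -e: push backups nightly at 04:00
        0 4 * * * rsync -az --delete /var/backups/ backupuser@offsite:/srv/backups/vps/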

    Read the article
