Search Results

Search found 498 results on 20 pages for 'sftp'.

Page 7/20

  • Is it possible to use rsync over sftp (without an ssh shell)?

    - by Tom Feiner
    Rsync over ssh works great every time. However, trying to rsync to a host which allows only sftp logins, but not ssh logins, produces the following error:

        rsync -av /source ssh user@remotehost:/target/

        protocol version mismatch -- is your shell clean?
        (see the rsync man page for an explanation)
        rsync error: protocol incompatibility (code 2) at compat.c(171) [sender=3.0.6]

    Here's the relevant section from the rsync man page: "This message is usually caused by your startup scripts or remote shell facility producing unwanted garbage on the stream that rsync is using for its transport. The way to diagnose this problem is to run your remote shell like this: ssh remotehost /bin/true > out.dat, then look at out.dat. If everything is working correctly then out.dat should be a zero length file. If you are getting the above error from rsync then you will probably find that out.dat contains some text or data. Look at the contents and try to work out what is producing it. The most common cause is incorrectly configured shell startup scripts (such as .cshrc or .profile) that contain output statements for non-interactive logins." Trying this on my system produced the following in out.dat:

        ssh-dummy-shell: Command not allowed.

    As I thought, the host is not allowing ssh logins. The following link shows that it is possible to accomplish this task using fuse with sshfs - however, it is extremely slow and not fit for production use. Is there any chance of getting rsync over sftp to work?
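
    One commonly suggested workaround - not rsync itself, so delta transfer is lost - is lftp, whose mirror command speaks SFTP natively. A minimal sketch, with host and paths as placeholders:

        # reverse-mirror /source up to /target over pure SFTP;
        # --only-newer approximates rsync's skip-unchanged behaviour
        lftp -u user -e "mirror -R --only-newer /source /target; quit" sftp://remotehost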

    Read the article

  • SFTP, SCP, Secure WebDAV: which is the most suitable?

    - by Xavier Maillard
    Hi, I currently host a webdav share set up to store files I need wherever I am. It is available via HTTPS. The thing is, I do not need all the HTTP machinery - i.e. my nginx http server is only there for this webdav folder - and I am not sure I made the best choice. My requirements on the client side are:

        - secured transfers
        - mountable as a network drive at work, with 'near realtime sync'
        - usable from any OS I use (including my Android mobile)

    At first, I chose webdav since it passes through my work proxy (which refuses everything that is not HTTP/S on port 80 or 443). Today, I am not satisfied with the setup: even if nginx's memory footprint is pretty small, its webdav support is not really clean or complete. What would you recommend between SFTP, SCP and the current webdav solution? I think SFTP is the closest solution, but I still have to find out how to pass it through my proxy ;) SCP seems quite limited from what I read about it (only file transfers, if I read right). Cheers
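
    On passing SFTP through an HTTP proxy: one common approach, sketched here under the assumption that the proxy allows CONNECT to port 443 and that sshd on the server also listens on 443, is an ssh ProxyCommand helper such as corkscrew (host names are placeholders):

        # ~/.ssh/config
        Host files.example.com
            Port 443
            ProxyCommand corkscrew proxy.work.example 8080 %h %p

        # then a plain sftp session tunnels through the proxy:
        sftp files.example.com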

    Read the article

  • WinSCP putting multiple files on sftp site

    - by NewToWinSCP
    WinSCP 5.2. I wanted to put multiple files with the extension .pgp on an sftp site. When I tested my original command line (see below), it only placed the first *.pgp file in alphabetical order (D:\a.csv.pgp) on the sftp site. I tried specifying *.PGP and *.pgp without any change - only one file (D:\a.csv.pgp) was copied each time. I got it to work for all files only by specifying a put command for each .pgp file. Any ideas on how to put all *.pgp files on the sftp site?

    Original command line - does not work:

        d:\winscp\winscp /command "option echo off" "option batch on" "option confirm off" "open sftp" "put D:\*.pgp" "close" "exit"

    Works:

        d:\winscp\winscp /command "option echo off" "option batch on" "option confirm off" "open sftp" "put D:\a.csv.pgp" "put D:\b.csv.pgp" "put D:\c.csv.pgp" "put D:\d.csv.pgp" "put D:\e.csv.pgp" "put D:\f.csv.pgp" "put D:\g.csv.pgp" "put D:\h.csv.pgp" "put D:\i.csv.pgp" "close" "exit"
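
    One thing worth trying, sketched on the assumption that the wildcard is being mangled by cmd.exe quoting on the /command line rather than by WinSCP itself: move the commands into a script file and run it with WinSCP's /script switch. The file name upload.txt is hypothetical, and "open sftp" is kept from the question (it would normally carry a full session URL):

        # upload.txt (WinSCP script file; "#" starts a comment here)
        option batch on
        option confirm off
        open sftp
        put D:\*.pgp
        close
        exit

        # invocation:
        d:\winscp\winscp /script=upload.txt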

    Read the article

  • Sftp via shell - how is it possible?

    - by Tomasz Zielinski
    (Moved from StackOverflow: http://stackoverflow.com/questions/4589725/sftp-via-shell-how-it-is-possible) How is it possible for tools like http://mysecureshell.sourceforge.net/ to provide SFTP access merely by being set as the user's shell:

        usermod -s /bin/MySecureShell myuser

    I'm on Debian Lenny, with the default sshd/OpenSSH. Is this a feature of the SSH protocol that allows the user's shell to handle sftp commands? I can't wrap my head around this, because usually OpenSSH needs the sftp-server module (or the internal one in newer versions) - which makes me think that sftp commands don't even hit the shell and are handled earlier, or by a different code path.
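
    The key detail is that sshd starts the sftp subsystem through the user's login shell, roughly as "$SHELL -c /usr/lib/openssh/sftp-server", so a custom shell sees that command line and can substitute its own SFTP implementation. A minimal sketch of such a wrapper, to illustrate the mechanism only (the sftp-server path varies by distribution):

        #!/bin/sh
        # invoked by sshd as:  wrapper -c '/usr/lib/openssh/sftp-server'
        # $1 is "-c"; $2 is the subsystem command from sshd_config
        case "$2" in
            *sftp-server*) exec /usr/lib/openssh/sftp-server ;;
            *) echo "This account is SFTP-only." >&2; exit 1 ;;
        esac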

    Read the article

  • Is MySecureShell a good way to manage SFTP users with chroot on a CentOS server?

    - by benjisail
    Hi, I need to set up my CentOS 5.4 server for SFTP with chrooted access only (or equivalent). The usual solution using rssh, found here: http://www.cyberciti.biz/tips/howto-linux-unix-rssh-chroot-jail-setup.html, seems like overkill to me if we want to manage multiple users... I found the MySecureShell project, which seems a lot simpler to install and maintain. Is it a good solution? Is there something better? Thanks!!
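
    The stock alternative is OpenSSH's own ChrootDirectory with the internal-sftp subsystem. A sketch for sshd_config, with the caveat that this needs OpenSSH 4.9 or later (CentOS 5.4 ships an older build, so a backported openssh package would be required):

        Subsystem sftp internal-sftp

        Match Group sftponly
            ChrootDirectory /home/%u
            ForceCommand internal-sftp
            AllowTcpForwarding no

        # each chroot target must be owned by root and not group-writable,
        # with a user-owned subdirectory for the actual uploads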

    Read the article

  • How to change user for more rights in an SFTP client?

    - by Zenklys
    One of the first hardening steps always suggested is to disable remote root login for SSH. I have a low-privilege user able to connect via SSH, and once connected I simply su to gain more rights. Now, when using an sFTP client, I log in with the low-privilege user and am thus able to do next to nothing. My question is: is it possible to change user after login using a 3rd-party client such as Transmit, Cyberduck or FileZilla? PS: Mac clients would be great ;)
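
    SFTP itself has no equivalent of su, but one known workaround with OpenSSH's command-line client is to run the server-side sftp-server under sudo. A sketch, assuming a NOPASSWD sudoers entry and the usual Debian/Ubuntu sftp-server path (user name is a placeholder):

        # client side: request a custom server-side SFTP program
        sftp -s "sudo /usr/lib/openssh/sftp-server" lowpriv@host

        # server side, via visudo:
        # lowpriv ALL=(root) NOPASSWD: /usr/lib/openssh/sftp-server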

    Read the article

  • How can I check if a file was transferred correctly when using sftp?

    - by achraf
    Hi guys, I'm working on an issue with my sftp server. The issue is that when a folder's quota is reached and someone tries to upload a file, they get no error code and end up depositing a 0-byte file. So I want to know if there is any solution (an integrity check, for example) to verify that the file was transferred completely. Thanks!!
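
    Without shell access on the server, a size comparison after the upload is a cheap sanity check. A minimal sketch, assuming key-based (non-interactive) auth; host, path and file names are placeholders:

        local_size=$(stat -c %s report.csv)
        remote_size=$(echo "ls -l upload/report.csv" \
            | sftp -b - user@host 2>/dev/null \
            | awk '/^-/ {print $5}')
        [ "$local_size" = "$remote_size" ] && echo OK || echo "size mismatch"

    A matching size does not catch corruption, but it does catch the 0-byte case in the question; a full checksum would need shell access or a round-trip download.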

    Read the article

  • Is it possible to rate-limit an scp/sftp/rsync/etc transfer from the command-line? i.e., manual QoS on

    - by warren
    Specifically, I am looking to rate-limit an scp or sftp session (or other arbitrary network call) in the call itself. For example, let's say I want to copy 100MB to one server and 1GB to another. I'd like to be able to run both of these at the same time but maintain a QoS for "normal" computer usage - somewhat similar to how you can rate-limit BitTorrent. Is there a way to do this without touching the networking hardware? I'm envisioning something akin to:

        magic-qos-tool 'scp file user@host:/path/to/file'

    or:

        scp -rate 40kbps file user@host:/path/to/file
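
    For what it's worth, something very close to the imagined "scp -rate" already exists in the standard tools (trickle is a separate userspace shaper package):

        scp -l 320 file user@host:/path/to/file        # scp: limit in Kbit/s (~40 KB/s)
        rsync --bwlimit=40 file user@host:/path/to/    # rsync: limit in KB/s
        trickle -u 40 scp file user@host:/path/to/file # wrap an arbitrary command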

    Read the article

  • How can I make sftp accounts that can only access their home directory?

    - by LonnieBest
    I'm using Ubuntu Server 9.10. I need to set up 6 accounts for users that will exclusively need sftp. When a user logs in, I need them to have access only to their home directory; I don't want them to be able to navigate anywhere else in the file system. Their ability to type commands should be limited to only the commands needed to view and transfer files back and forth to their home directory. Can anyone offer some guidance on setting things up this way?
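
    Ubuntu 9.10's OpenSSH is recent enough for the internal-sftp chroot shown in the MySecureShell entry above; the companion account setup could look like this sketch (user names are placeholders):

        groupadd sftponly
        for u in alice bob carol dave erin frank; do
            useradd -m -g sftponly -s /usr/sbin/nologin "$u"
            chown root:root /home/"$u"          # chroot target must be root-owned
            mkdir -p /home/"$u"/files           # user-writable subdirectory
            chown "$u":sftponly /home/"$u"/files
        done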

    Read the article

  • Linux shell to restrict sftp users to their home directories?

    - by hfidgen
    I need to give SFTP access to a directory within my webroot on my server. I've set up ben_files as a user with the home directory /var/www/vhosts/mydomain.com/files. That's all fine if the user connects with plain old FTP - they're restricted to just that directory - but to enable SFTP I had to give the account the /bin/bash shell, which suddenly opens up my entire server... Is there a way of giving the account SFTP access without opening up all my directories? I'd really like it restricted to only its home ;)
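
    A per-user variant of the internal-sftp approach from the entries above fits this case. A sketch for sshd_config (the -d start-directory flag needs a newer OpenSSH and can simply be dropped; every path component of the chroot must be root-owned):

        Match User ben_files
            ChrootDirectory /var/www/vhosts/mydomain.com
            ForceCommand internal-sftp -d /files
            AllowTcpForwarding no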

    Read the article

  • Are there any sFTP clients that support a proxy that uses NTLM authentication?

    - by Iraklis
    The title pretty much summarizes the question. We have deployed an sFTP server that needs to be accessed from our client's MS Windows PCs, which reside within a restricted local area network. The only way they can get out of their intranet is an HTTP proxy that requires NTLM authentication. From what I understand, all the open-source sFTP clients (FileZilla, WinSCP, etc.) do NOT support NTLM authentication (because of legal issues). I know there are workarounds (installing a local proxy on the machine that understands NTLM), but this would break all sorts of our client's security policies. So my question is: does anyone know of any sFTP client that supports NTLM?

    Read the article

  • Resuming interrupted sftp transfers in Linux

    Software Engineering with FOSS and Linux: "It is often the case that an sftp transfer gets interrupted for one reason or another, resulting in partially transferred files. This can be annoying, especially in the case of large file transfers."
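
    For the impatient: OpenSSH's sftp client can resume a partial download itself with reget (equivalent to get -a), and newer releases add reput for uploads. A one-liner sketch, assuming key-based auth and a placeholder file name:

        echo "reget bigfile.iso" | sftp -b - user@host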

    Read the article

  • Basic clarification about Limited FTP/sFTP users

    - by mattewre
    I would like some clarification about the correct way to create limited users with access to my VPS, which is used as a web server with nginx. I usually do NOT install FTP and allow access via SFTP only. Is that OK for every setup? This is what I usually do to create a limited user called "admin" that should have SFTP access to the folder with the website data:

        mkdir -p /var/www/mysite.com/
        adduser admin
        adduser admin www-data
        chown -R root:root /var/www
        chmod -R 755 /var/www
        chmod -R 755 /var/www/mysite.com
        chown -R admin:www-data /var/www/mysite.com/

    It seems not to be the correct way; I always have permission problems when I upload files (for example with Wordpress in general). I would like to create a user that works exactly like the one providers give to their clients when they buy a hosting service (that is FTP, though I would prefer SFTP access). It is for personal use, but I think a limited user is a lot safer to use than root via SFTP.
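
    One common fix for the upload-permission problem in a setup like this is to make the tree group-writable for www-data and mark directories setgid, so files created later inherit the group. A sketch using the paths from the question:

        chown -R admin:www-data /var/www/mysite.com
        find /var/www/mysite.com -type d -exec chmod 2775 {} \;
        find /var/www/mysite.com -type f -exec chmod 664 {} \;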

    Read the article

  • Interested in scp recipe for sftp [closed]

    - by GJZ
    You wrote in a reply: "The problem is that sftp runs as the user's id -- first, the sftp client ssh's into the target host as the given user, then runs sftp-server. Since sftp-server is running as a regular user, it has no way to 'give away' a file (change the owner of a file). However, if you are able to use scp, and assign a key pair to each user, you can get around this. This involves adding a user's key to root's ~/.ssh/authorized_keys file, with a 'command=' parameter to force it to run a script that sanitizes and alters the arguments of the server-side scp program. I've used this technique before to set up an anonymous scp dropbox that allowed anyone to submit a file, while ensuring that no one could retrieve submitted files, and also preventing overwrites. If you are open to this technique, let me know and I'll update this post with a quick recipe." We are interested in this scp quick recipe for our community file-sharing services. Best regards, Gert Jan Zeilstra
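
    In the absence of the promised recipe, here is a rough sketch of the technique as described: a forced command in authorized_keys pointing at a wrapper that only accepts uploads into a dropbox and refuses overwrites. All paths are hypothetical, and this is an illustration, not a hardened implementation:

        # in the target account's ~/.ssh/authorized_keys (one line):
        # command="/usr/local/bin/scp-dropbox" ssh-rsa AAAA... uploader

        #!/bin/sh
        # /usr/local/bin/scp-dropbox
        case "$SSH_ORIGINAL_COMMAND" in
            "scp -t "*) ;;                           # allow uploads only
            *) echo "uploads only" >&2; exit 1 ;;
        esac
        target="/srv/dropbox/$(basename "${SSH_ORIGINAL_COMMAND#scp -t }")"
        [ -e "$target" ] && { echo "file exists" >&2; exit 1; }
        exec scp -t "$target"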

    Read the article

  • Coda-like experience for Ubuntu

    - by Dillon Gilmore
    I'm a web developer who's going to transition from Mac OS X to Ubuntu. I've been using Coda for some time, only because it makes web development easy. I know a full-fledged equivalent isn't available for Linux, but I would like to know about apps that specialize in the same tasks Coda offers. I plan on switching to Vim for code editing; I'm extremely proficient, and will install the Janus plugin and be good to go.

    One thing that makes editing in Coda so amazing is that it is extremely good at SFTP: you can drag and drop files and/or folders from your local drive to the server, and you can edit code directly on the server. The problem here is that with Vim I don't know of a way to edit code on a remote server while using my own Vim settings and plugins. To solve this, I would like to know of a good SFTP client OR a good SFTP CLI. A CLI that could synchronize your files after a file has been modified would be perfect, but is not necessary.

    Now, one of the biggest and best features of Coda is its ability to view your databases: you get to create a database, create tables, add things, delete things and view the contents of a table (all without writing a single SQL statement). I will admit that databases are my weak point, but they are a very important part of my job. A tool that specializes in databases would be perfect. I wouldn't prefer to use the command line for database work, but a database CLI that I'm missing could potentially be useful.

    So I guess I'm asking for two things: a tool that makes databases easier to visualize, and a tool that assists in pushing my local code to a server.
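
    On the remote-editing half of the question, two stock answers keep your own Vim configuration intact (host and paths are placeholders):

        # netrw, which ships with Vim, opens remote files over scp directly:
        vim scp://user@host//var/www/site/index.php

        # or mount the remote tree with sshfs and edit as if local:
        sshfs user@host:/var/www/site ~/remote-site
        vim ~/remote-site/index.php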

    Read the article
