Search Results

Search found 9446 results on 378 pages for 'ssh keys'.

Page 36/378

  • Removing port forwardings programmatically on a ControlMaster SSH session

    - by aef
    Quite a while ago I got an answer telling me how to add a port forwarding to a running SSH ControlMaster process. Knowing that helps a lot, but I'm still missing a way to remove such a port forwarding once I no longer need it. As far as I know, you can do that through the internal escape-key sequence on normal connections, but this seems to be disabled for ControlMaster clients. Even if it were possible, I would need a solution I can automate with scripts, which would not be easy that way. Is there a way to do it? And is it easy to automate?
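
    A minimal sketch of the scriptable route, assuming a reasonably recent OpenSSH (the forward/cancel control commands appeared around versions 5.9 and 6.0, if I recall correctly) and a ControlPath already configured for "host" in ~/.ssh/config; the port numbers and host name are placeholders:

      # add a forwarding to the running master process
      ssh -O forward -L 8080:localhost:80 host
      # remove the same forwarding later, no interactive escape sequence needed
      ssh -O cancel -L 8080:localhost:80 host

    Both commands talk to the existing control socket, so they return immediately and are easy to call from a script.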

    Read the article

  • X forwarding over SSH from Mac to a Linux box

    - by Checkers
    I need to run Mac applications on a remote Mac machine and display them on a local Linux machine's X server (most articles on the Internet seem to describe doing it the opposite way).
      $ ssh -X mac-box
      $ cd /Developer/Applications/Xcode.app
      $ ./Contents/MacOS/Xcode
      Sat Oct 3 20:41:26 mac-box.local Xcode[15634] <Error>: kCGErrorFailure: Set a breakpoint @ CGErrorBreakpoint() to catch errors as they are logged.
      _RegisterApplication(), FAILED TO establish the default connection to the WindowServer, _CGSDefaultConnection() is NULL.
      ^C
    My $DISPLAY variable appears to be empty. What should it look like for forwarding to work correctly? Can I run OS X applications this way at all?
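
    A quick way to check whether forwarding is being negotiated at all, as a sketch; it assumes the remote sshd has X11Forwarding enabled and xauth installed:

      ssh -X mac-box 'echo $DISPLAY'    # should print something like localhost:10.0

    If that prints nothing, the forwarding itself is failing. Note that even with a working DISPLAY only X11 applications can be forwarded; native Cocoa applications such as Xcode talk to the Mac's WindowServer rather than to X, which is what the kCGErrorFailure message is complaining about.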

    Read the article

  • Running 'sudo' over SSH

    - by Wesho
    I'm writing a script which logs onto a bunch of remote machines and runs a command on them. I've set up keys so the user running the script does not have to type each machine's password, only the key passphrase at the beginning of the script. The problem is that the command on the remote machines requires sudo to run, while the whole point of the script is to spare the user from typing passwords multiple times. Is there a way to avoid typing in the password for sudo? Changing the permissions of the command on the remote machines is not an option.
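
    One common approach is to allow just that one command to run without a password in sudoers; a sketch, with the account name and command path as placeholders:

      # /etc/sudoers.d/remote-maintenance (edit with visudo -f)
      deploy ALL=(root) NOPASSWD: /usr/local/bin/maintenance.sh

    and from the controlling script:

      ssh -t remote-host sudo /usr/local/bin/maintenance.sh

    Restricting NOPASSWD to a single fully qualified command keeps the exposure smaller than a blanket NOPASSWD: ALL.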

    Read the article

  • How to debug ssh authentication failures with gssapi-with-mic

    - by Arthur Ulfeldt
    When I ssh to DOMAIN\user@localhosts-name, authentication works fine through gssapi-with-mic:
      debug3: remaining preferred: gssapi,publickey,keyboard-interactive,password
      debug3: authmethod_is_enabled gssapi-with-mic
      debug1: Next authentication method: gssapi-with-mic
      debug2: we sent a gssapi-with-mic packet, wait for reply
      debug3: Wrote 112 bytes for a total of 1255
      debug1: Delegating credentials
      debug3: Wrote 2816 bytes for a total of 4071
      debug1: Delegating credentials
      debug3: Wrote 80 bytes for a total of 4151
      debug1: Authentication succeeded (gssapi-with-mic).
    When I connect to a different machine, it just seems to stop halfway through the gssapi-with-mic authentication:
      debug1: Next authentication method: gssapi-with-mic
      debug2: we sent a gssapi-with-mic packet, wait for reply
      debug3: Wrote 112 bytes for a total of 1255
      debug1: Delegating credentials
      debug3: Wrote 2816 bytes for a total of 4071    <----- ????
      debug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic,password,keyboard-interactive
    How should I go about finding out what happened differently the second time? How can I find out if/why the auth was rejected by Kerberos?
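
    A sketch of the usual next debugging steps, nothing here being specific to this setup:

      # client side: full ssh verbosity plus Kerberos library tracing (MIT krb5 1.9+)
      KRB5_TRACE=/dev/stderr ssh -vvv user@otherhost
      klist                     # is there a valid TGT, and for the realm you expect?

      # server side: run a one-off sshd in debug mode on a spare port and watch it
      /usr/sbin/sshd -ddd -p 2222

    The server-side transcript usually says why GSSAPI was rejected; a missing host/fqdn principal in the keytab, clock skew, or a hostname that does not match the principal are the usual suspects.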

    Read the article

  • Vim and mouse with ssh from Mac to Linux

    - by Jonatan Littke
    Hey. I know it's possible to make the mouse work in Vim in a remote session to a Linux machine from my Mac, but I haven't figured out just how. Daily Vim gives a tip on making it work, but if I try to set 'mac-ansi', for example, I get an error saying I can only specify values beginning with builtin_ (riscos, beos-ansi, etc.). I've tried using ssh -X combined with set mouse=a and set term=builtin_ansi, for example, but with no success, with or without combining them. I'm using Snow Leopard and attempting to use the mouse on a Debian machine with Vim 7.1.314. I've had a look at the documentation but can't make it work. Any tips?
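
    A sketch of what usually makes this work with a Vim of that vintage, assuming the local terminal emulator actually sends xterm-style mouse reports (iTerm does; Terminal.app of that era generally needs a plugin such as MouseTerm):

      " in ~/.vimrc on the Debian machine
      set mouse=a
      set ttymouse=xterm2

    ssh -X is not needed for this; the mouse events travel in-band through the terminal, so the remote TERM just has to be an xterm-like value (e.g. xterm-color), with no need to override 'term' to a builtin_* type.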

    Read the article

  • sed works properly in SSH, not in PHP

    - by David
    So, I have the following line that I run in PHP with exec($addPHPtags);
      $addPHPtags = "/bin/sed -i '/<BODY BGCOLOR=\"#FFFFFF\">/ a\ <?php \n ?> '" . $instance['file'] . " 2>&1";
    I'd expect that command to find the matching line and append a PHP tag after it. However, when I run it in PHP, I get the following error if I capture the command output:
      [0] => /bin/sed: -e expression #1, char 39: unknown command: `?'
    If I run the same command over SSH, it works completely fine:
      /bin/sed -i '/<BODY BGCOLOR=\"#FFFFFF\">/ a\ <?php \n ?>' file.php
    I'm out of ideas; I've tried various alternatives but to no avail. Any help? Thanks.
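
    A guess at what is going on: inside a double-quoted PHP string, "\n" becomes a real newline before the shell ever sees it, so sed receives the expression split across two lines and tries to parse the second line as a new command. Roughly:

      # the expression sed gets from the interactive shell (the two characters \n stay literal):
      #   /<BODY BGCOLOR=\"#FFFFFF\">/ a\ <?php \n ?>
      # the expression sed likely gets from exec(), with PHP having already expanded "\n":
      #   /<BODY BGCOLOR=\"#FFFFFF\">/ a\ <?php
      #    ?>
      # the second line starts a new sed command, and `?' is not one; char 39 is exactly that `?'

    Writing the PHP string with "\\n" (or building the command in single quotes, or with escapeshellarg) should hand sed the two literal characters it expects.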

    Read the article

  • Using SSH to find access to a problematic script in logs of multiple domains

    - by Hanan Cohen
    I run several (~20) sites on a Dreamhost VPS. Lately I've been maxing out the memory allocation for the VPS and I want to find the problem. I would like an SSH script that scans the log files of all the domains and shows me which objects (images, PHP scripts, etc.) get the most calls. It should count the calls in each /logs/*/http/access.log, sort them in descending order and show me the top 10 across domains. But I don't know how to do that. Can it be done? Can anyone suggest a script that will do that? Thanks. (Cross-posted to Stack Overflow)
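
    A rough sketch of the kind of one-liner that does this, assuming Apache's combined log format (field 7 is the request path) and adjusting the glob to wherever the logs actually live:

      awk '{print $7}' ~/logs/*/http/access.log | sort | uniq -c | sort -rn | head -10

    Printing the log's filename alongside each path (awk '{print FILENAME, $7}') keeps the per-domain information if that matters.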

    Read the article

  • chroot'ing SSH home directories, shell problem.

    - by Hamza
    Hi folks, I am trying to chroot my SSH users to their home directories and it seems to work... in a strange way. Here is what I have in my sshd_config:
      Match group restricthome
          ChrootDirectory %h
    The permissions on the user directories look like this:
      drwxr-xr-x 2 root root 1024 May 11 13:45 [user]/
    And I can see that the user logs in successfully:
      May 11 13:49:23 box sshd[5695]: Accepted password for [user] from x.x.x.x port 2358 ssh2
    (with no error messages after this) But after entering the password, the PuTTY window closes. This is a wild guess, but could it be because the user's shell is set to /bin/bash and it can't execute inside the chroot? If so, could you give me pointers on how to fix it? Would simply copying the bash binary into the user's home directory and modifying the shell work? How would I deal with the dependencies? ldd shows quite a few of those :) Comments/suggestions will be appreciated. Thanks.
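
    A sketch of the "copy bash plus its libraries" route, run as root with the chroot directory as the working directory (paths will differ per system):

      cd /home/someuser            # the ChrootDirectory
      mkdir -p bin
      cp /bin/bash bin/
      # copy every shared library ldd reports, preserving the directory layout
      for lib in $(ldd /bin/bash | grep -o '/[^ )]*'); do
          cp --parents "$lib" .
      done

    If the users only need file access rather than a shell, adding ForceCommand internal-sftp to the Match block avoids the whole dependency-copying exercise.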

    Read the article

  • Transfer files using ssh

    - by zozo
    Good day to all. I am using SSH (WinSCP) to transfer some files from a server to my workstation. The problem is that on some files I get disconnected; always the same files. I am the owner of the directory, so I guess file permissions are not the problem (I also set the permissions to 777). Is there a size limit or something like that? Thank you for your time. The protocol is SFTP, the server is a 32-bit machine, and the files are 100 MB tops. Added: it worked with FileZilla using FTP. This temporarily fixes the problem, but it is not exactly a solution, since next time I may not have root access to create an FTP account.
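
    If the point is just to get the files across reliably, rsync over SSH can resume after a dropped connection; a sketch, assuming rsync is installed on the server:

      rsync -a --partial --progress -e ssh user@server:/remote/path/ /local/destination/

    --partial keeps the half-transferred file so a re-run picks up where it stopped, and --progress shows exactly how far each problem file gets before the disconnect.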

    Read the article

  • SSH access from outside to a pc inside network

    - by Raja
    I have a static IP and an ADSL router linked to a Linksys wireless router to which all my machines are connected. I want to set up SVN on one of the machines and provide SSH access to users outside my network. Would this be possible? Even just SVN access through the web would be fine. Please let me know what needs to be done to achieve this. I have an Ubuntu VM running on an iMac Leopard machine and another two Win 7 32/64-bit machines. I can set up standalone Ubuntu or Win XP on another machine. Thanks, Raja.
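
    The usual shape of this, sketched with placeholder addresses and ports: on the Linksys, forward some external port (say 2222) to port 22 of the machine that will host the repository, and then from outside:

      ssh -p 2222 user@your-static-ip
      # for svn+ssh checkouts, point Subversion's ssh tunnel at the forwarded port
      # (in ~/.subversion/config on the client: [tunnels]  ssh = ssh -p 2222)
      svn checkout svn+ssh://user@your-static-ip/home/svn/repo/trunk

    Serving the repository over HTTP instead (Apache with mod_dav_svn) would mean forwarding port 80 or 443 on the router in the same way.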

    Read the article

  • sftp chroot access via SSH

    - by Cudos
    Hello. I have this setup in sshd_config:
      AllowUsers test1 test2
      Match group sftpgroup
          ChrootDirectory /var/www
          X11Forwarding no
          AllowTcpForwarding no
          ForceCommand internal-sftp
      Match user test2
          ChrootDirectory /var/www/somedomain.dk
          X11Forwarding no
          AllowTcpForwarding no
          ForceCommand internal-sftp
    I am trying to restrict test2 to /var/www/somedomain.dk only. For some reason, when I try to log in as test2, e.g. with FileZilla, I get this error: "Server unexpectedly closed network connection". The users exist and work, and the SSH service has been stopped and started. test1 works with e.g. FileZilla, and the root of that connection is /var/www. What am I doing wrong?
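
    Two things are worth checking here, sketched below. First, sshd keeps the first value it obtains for each keyword, so when test2 is also in sftpgroup the group block's ChrootDirectory wins; putting the more specific Match first changes that. Second, every component of a ChrootDirectory path must be owned by root and not be writable by any other user or group, or sshd drops the connection right after authentication:

      Match user test2
          ChrootDirectory /var/www/somedomain.dk
          X11Forwarding no
          AllowTcpForwarding no
          ForceCommand internal-sftp

      Match group sftpgroup
          ChrootDirectory /var/www
          X11Forwarding no
          AllowTcpForwarding no
          ForceCommand internal-sftp

    Running sshd -t after editing, and watching auth.log during a failed login, usually points at which of the two it is.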

    Read the article

  • Security when SSH private keys are lost

    - by Shree Mandadi
    I can't quite explain my problem in words, so let me give an example (and please multiply the complexity by 100 for the solution). User-A has two SSH private keys and, over time, has put the corresponding public keys on a number of servers. He lost one of the private keys and has created a new pair. How does User-A inform me (the sysadmin) that he has lost his key, and how do I manage all the servers to which he had access (I do not have a list of all the servers User-A can reach)? In other words, how do I revoke the public key associated with that private key? For reference: with LDAP-based authentication, all servers talk to a single repository for authentication, so if I remove access or change the password on that server, every system that authenticates against LDAP is secured when User-A loses his password.
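
    There is no way to look up "the servers this private key worked on" after the fact; what can be revoked is the matching public key, so the user needs to hand over the public half (or its fingerprint, via ssh-keygen -lf lost_key.pub) and that key then has to be stripped from authorized_keys on every server he could reach. A sketch for one account on one server, with lost_key.pub as a placeholder for the public half of the lost pair:

      KEY=$(cut -d' ' -f2 lost_key.pub)       # the base64 blob uniquely identifies the key
      grep -vF "$KEY" /home/usera/.ssh/authorized_keys > /tmp/ak
      cat /tmp/ak > /home/usera/.ssh/authorized_keys && rm /tmp/ak

    The lack of a server inventory is the real problem; the longer-term fix is the same centralization the question describes for LDAP, e.g. serving keys from a directory via AuthorizedKeysCommand or switching to short-lived SSH certificates.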

    Read the article

  • Run python script on server over ssh session in the background persistently

    - by Stefan R. Falk
    I got an account from my professor on our university's CUDA server for running some tests. I am connecting via SSH from a terminal. The thing is, when I close the terminal the server also seems to kill the running script; when I reconnect, it has stopped. No, it is not possible that the script had already terminated, since those test runs should take a few hours even on that machine. Can anybody help me here? OS:
      Linux cuda01 3.13-1-amd64 #1 SMP Debian 3.13.7-1 (2014-03-25) x86_64 GNU/Linux
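
    Two sketches of the usual fixes; the script name is a placeholder:

      # detach the job from the session entirely (the output redirect matters,
      # otherwise the hangup can still block or kill it)
      nohup python run_tests.py > run.log 2>&1 &

      # or run it inside a detachable session that survives the disconnect
      screen -S cuda        # start, launch the script inside, detach with Ctrl-a d
      screen -r cuda        # reattach after logging back in

    tmux works the same way if it is installed instead of screen.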

    Read the article

  • Bash loop to move directories on a remote host via ssh

    - by I Forgot
    I'm trying to figure out a way to perform the following loop on a remote host via ssh. Basically it renames a series of directories to create a rotating backup. But it's local. I want it to work against directories on a remote host.
      while [ $n -gt 0 ]; do
        {
          src=$(($n-1))
          dst=$n
          if [ -d /backup/$src ]; then
            { mv /backup/$src /backup/$dst; }
          fi;
        }
        ((n--))
      done;
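
    One way is to hand the whole loop to a remote shell as a single quoted argument; the single quotes keep $n and friends from being expanded locally. A sketch, with the host name and the rotation depth (n=7) as placeholders:

      ssh backuphost '
          n=7
          while [ "$n" -gt 0 ]; do
              src=$((n-1)); dst=$n
              [ -d "/backup/$src" ] && mv "/backup/$src" "/backup/$dst"
              n=$((n-1))
          done
      '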

    Read the article

  • Reinstall Linux over SSH

    - by DoomStone
    Hello. I'm having a big problem with our development server: it has had a program called Webmin on it, plus a total idiot has been administering the Linux server. This has now resulted in the server being totally trashed; there are so many different versions of the same programs installed that nothing works. And don't get me started on the users and groups :D Well, at last I have been given responsibility for administering our development server. I would like to start from scratch instead of trying to find every single package and config the previous admin has **ed up. The problem is that it is a remotely hosted server with SSH access only. The server is running Debian, but I am thinking of reinstalling it with Ubuntu Server. Thanks

    Read the article

  • How to browse to a webserver which is reachable through the SSH port only

    - by GetFree
    I have a server at work which is behind the company's firewall, so it is reachable only through port 22 (SSH). I'm able to connect to the server with PuTTY without problems. That server also has Apache running and listening on port 80 as usual, but I can't reach the website with my browser since port 80 (and every other port) is blocked by the firewall. Is there a way to make my browser connect to Apache on that server so I can browse the site I'm working on? Thanks.
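
    The standard trick is a local port forward through the SSH connection; a sketch with an arbitrary local port:

      ssh -L 8080:localhost:80 user@workserver
      # then browse to http://localhost:8080/

    PuTTY can do the same thing from Connection > SSH > Tunnels (source port 8080, destination localhost:80), which fits the existing setup since port 22 is already reachable.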

    Read the article

  • keymapping when ssh-ing from mac to linux

    - by Yair
    I'm using Lion to ssh -X to a Linux machine and work on some code that lives on it. I open an editor on the remote machine (usually MATLAB) and program in it. My problem is that on the Linux side there is no concept of the Command key, so if I want to copy some text from a local window into the editor running on the remote machine, I have to press Command-C to copy and then Control-V to paste. This obviously drives me nuts. I was wondering if there is a way to change the key mapping so that the Command key is recognized as a Control key by the remote processes. Or is this something I need to change in my local (Mac) X configuration?

    Read the article

  • Restrict access to SSH for one specific user

    - by j0nes
    I am looking for a way to secure my servers with the following setup: I have a server where I can log in via SSH. The main account there (named "foo") is secured by a key-based login with a password. I have another account (named "bar") that I use for logins from cron jobs running on other servers; this one also has a key-based login, but without a password. Now I want to limit access for the "bar" account so that it is only reachable from known IPs. The "foo" account should not be affected by this and should remain accessible from any IP. How can I manage this? Or is there a simpler solution altogether?
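
    Two sketches, with documentation-range addresses standing in for the real ones. The restriction can live on bar's key itself:

      # in ~bar/.ssh/authorized_keys
      from="192.0.2.10,192.0.2.11" ssh-rsa AAAA...rest-of-key... bar@cronhost

    or in sshd_config, where foo stays unrestricted while bar is only accepted from known addresses:

      AllowUsers foo bar@192.0.2.10 bar@192.0.2.11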

    Read the article

  • Connect through SSH and type in password automatically, without using a public key

    - by binary255
    A server allows SSH connections, but not using public key authentication. It's not within my power to change this at the moment (due to technical difficulties, not organizational) but I will get on it as soon as possible! What I need now is to execute commands on the server using plain old account+password authentication from a script. That is, I need to do it in a non-interactive way. Is it possible? And how do I do it? The client which will be executing the script runs Ubuntu Server 8.04. The server runs Cygwin and OpenSSH.
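
    One sketch of the non-interactive route is sshpass, a small separate package that feeds the password to ssh; reading it from a file keeps it out of ps output and shell history (the file path and command are placeholders):

      sshpass -f ~/.ssh/remote_pass ssh user@server 'uname -a'

    An expect script can do the same job where sshpass cannot be installed.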

    Read the article

  • how to have files created by CMS have the same ownership as SSH user

    - by Cam
    I am having difficulty on our Ubuntu server. I have an SSH user, and when I create files as this user the ownership is web_user:www-data. The problem arises when a file is uploaded or created through a content management system like Joomla. When files are uploaded through Joomla, such as components or modules, the ownership is set to www-data:www-data. This means I then need to chown all new files to web_user:www-data so we can edit them. Is there a way to arrange, for a directory and its sub-directories, that all newly created files get the ownership web_user:www-data? Do I need to use something like setuid or setgid? Any help would be greatly appreciated.
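
    The owner of a new file is always the user of the process that creates it, so uploads made by Apache/PHP will keep being owned by www-data; the workable route is to make sure web_user can write to whatever www-data creates. A sketch using default ACLs (needs the acl package and a filesystem mounted with ACL support; the path is a placeholder):

      setfacl -R -m u:web_user:rwX /var/www/site                            # existing files
      find /var/www/site -type d -exec setfacl -d -m u:web_user:rwX {} \;   # future files

    A setgid bit on the directories (chmod g+s) only controls the group of new files, not their owner, so it helps with group-based sharing but will not by itself produce web_user:www-data ownership.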

    Read the article

  • How to load previous kernel via ssh?

    - by Aamir
    I work remotely on my work computer. I am also the root user of the workstation, which I share with my colleague. Yesterday I upgraded the kernel to 2.6.31-17 when asked by the update manager, but refrained from restarting because I knew my NVIDIA and several other kernel modules wouldn't work. Unfortunately my colleague, who is a Linux noob, restarted the machine, and here I am :( I am thinking of changing the /initrd.img and /vmlinuz symlinks to the previous kernel image and using kexec. Please suggest a better way to load the 2.6.31-16 kernel from SSH rather than from the GRUB boot menu. I am using Ubuntu Karmic.
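
    A sketch of the kexec route the question already hints at, run as root over SSH (the exact image names under /boot may differ):

      kexec -l /boot/vmlinuz-2.6.31-16-generic \
            --initrd=/boot/initrd.img-2.6.31-16-generic \
            --append="$(cat /proc/cmdline)"
      sync
      kexec -e      # boots straight into the loaded kernel, skipping GRUB entirely

    kexec -e does not shut services down cleanly, so syncing (or stopping anything that writes to disk) first is worth the few seconds. Changing GRUB's default entry and doing a normal reboot is the gentler alternative when a full reboot is acceptable.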

    Read the article

  • SSH reverse tunnel to monitor and manage remote devices

    - by acid_crucifix
    I have a distributed set of devices running Ubuntu 12.04 that I am shipping to clients, and I would like to manage them remotely. They may not have fixed IPs and might be behind firewalls. What I am planning to do is have the devices (permanently connected to the net) poll a request URL and, based on the response, open a reverse tunnel to my server so that I can access them through that tunnel. Most of what I've read about reverse tunnels over SSH covers one-off use cases and very little covers heavy production usage. Is there some reason for this: security issues, or stability? Any help would be much appreciated.
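
    A sketch of the device side of such a tunnel, using autossh to keep it alive; the port number, account and host are placeholders:

      autossh -M 0 -N \
          -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" \
          -o "ExitOnForwardFailure yes" \
          -R 20022:localhost:22 tunnel@central.example.com

      # on the central server, the device is then reachable with
      ssh -p 20022 deviceuser@localhost

    Giving each device its own remote port and a dedicated, forced-command-restricted account on the server keeps one compromised device from reaching the others, which is one of the main security concerns with running many long-lived reverse tunnels.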

    Read the article

  • SSH only works after intentionally failed password

    - by pyraz
    So, I'm having a rather weird problem. I have a server that, when I SSH into it, immediately closes the connection if I type the correct password on the first attempt. However, if I purposefully enter a wrong password on the first attempt and then the correct password at the second or third prompt, it successfully logs me in. Similarly, when I try to use public key authentication, I get an immediately closed connection. If, however, I enter a wrong password for my key file, followed by another wrong password once it falls back to password authentication, I can successfully log in, as long as I provide the correct password at the second or third prompt. The machine is running Red Hat Enterprise Linux Server release 6.2 (Santiago) and is using LDAP and PAM for authentication. Any ideas on where to start debugging this one? Let me know which config files I need to provide and I'll be happy to do so.
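
    A sketch of the usual way to get more information than the normal logs give:

      # on the server: a throwaway sshd in debug mode on a spare port
      /usr/sbin/sshd -ddd -p 2222
      # from a client: connect to it with full verbosity
      ssh -vvv -p 2222 user@server

    Comparing the debug transcript of a first-attempt (failing) login with a second-attempt (working) one, together with /var/log/secure and the PAM stack in /etc/pam.d/sshd, usually shows which module is closing the session.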

    Read the article

  • Problems when loop over a series of ssh-ed commands

    - by Jack Medley
    I have a series of server machines on which I want to run the same command. Each command takes hours, and (even though I am running the commands using nohup and setting them to run in the background) I have to wait for each to finish before the next starts. Here is roughly how I have set it up. On the host machine:
      for i in {1..9}; do ssh RemoteMachine${i} ./RunJobs.sh; done
    where RunJobs.sh on each remote machine is:
      source ~/.bash_profile
      cd AriadneMatching
      for file in FileDirectory/Input_*; do
        nohup ./Executable ${file} &
      done
      exit
    Does anyone know a way such that I don't have to wait for each job to finish before the next starts? Or, alternatively, a better way of doing this? I have a feeling what I am doing is fairly sub-optimal. Cheers, Jack
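
    The hang is most likely because the nohup'd jobs inherit the ssh session's stdout/stderr, so sshd keeps the connection open until they finish; giving them their own output (and backgrounding the ssh calls as well) lets the loop move straight on. A sketch:

      # in RunJobs.sh on each remote machine
      for file in FileDirectory/Input_*; do
          nohup ./Executable "$file" > "${file}.log" 2>&1 < /dev/null &
      done

      # on the host machine
      for i in {1..9}; do
          ssh "RemoteMachine${i}" ./RunJobs.sh &
      done
      wait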

    Read the article
