Search Results

Search found 3730 results on 150 pages for 'bash'.


  • Externally disabling signals for a Linux program.

    - by Harry
    Hello. On Linux, is it possible to somehow disable signal delivery to a program externally, that is, without modifying its source code? Context: I'm calling a C (and also a Java) program from within a bash script on Linux. I don't want any interruptions for my bash script, or for the other programs that the script launches as foreground processes. While I can use trap '' INT in my bash script to disable the Ctrl-C signal, this works only while control happens to be in the bash code itself. That is, if I press Ctrl-C while the C program is running, the C program gets interrupted and exits! This C program is performing a critical operation, which is why I don't want it to be interrupted. I don't have access to the source code of this C program, so signal handling inside the C program is out of the question.

        #!/bin/bash
        trap 'echo You pressed Ctrl C' INT

        # A C program to emulate a real-world, long-running program,
        # which I don't want to be interrupted, and for which I
        # don't have the source code!
        #
        # File: y.c
        # To build: gcc -o y y.c
        #
        # #include <stdio.h>
        # int main(int argc, char *argv[]) {
        #     printf("Performing a critical operation...\n");
        #     for(;;); // Do nothing forever.
        #     printf("Performing a critical operation... done.\n");
        # }

        ./y

    Regards, /HS
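
    A hedged sketch of one workaround (not from the original post): bash's trap '' INT sets SIGINT to "ignored", and an ignored disposition, unlike a trap with a handler, is inherited across fork/exec, so setting it before launching the program shields the child too. Running the child under setsid is an alternative that detaches it from the controlling terminal so keyboard-generated signals never reach it. The program name ./y follows the question; treat this as an untested illustration.

        #!/bin/bash

        # Variant 1: ignore SIGINT in the script; the "ignored" disposition is
        # inherited by child processes, so ./y will not see Ctrl-C either.
        trap '' INT
        ./y

        # Variant 2: run the child in its own session, detached from the
        # terminal, so terminal-generated signals (Ctrl-C, Ctrl-\) never reach it.
        setsid ./y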

    Read the article

  • How to run a bash script on Linux from Windows using plink

    - by user128877
    I'm trying to run a simple .bat file from Windows that runs a bash script on a Linux machine. The bash script is located on the Linux machine. For example, the .bat file runs:

        plink.exe -pw <password> root@<ip> bash -c "/root/script.sh"

    Result: when run from Windows, the command hangs forever. When I run the same script (/root/script.sh) directly on the Linux machine, it works just fine. The script contains Ruby code and I'm using RVM.
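
    A hedged variation worth trying (not from the original thread, and untested): RVM is normally set up by login-shell profile scripts, which bash -c alone does not read, and plink can also sit waiting for interactive input. Running the script through a login shell and passing -batch addresses both; the password, IP, and path placeholders are the same ones used in the question.

        plink.exe -batch -pw <password> root@<ip> "bash -lc '/root/script.sh'"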

    Read the article

  • Have a bash script remotely shut down another computer on the LAN

    - by gletscher
    Hi, I want to write a bash script that, when called, shuts down another computer on the LAN, maybe using ssh? The other computer is an Ubuntu machine. I'm not sure how to send e.g. a sudo shutdown -h now command from within a bash script over ssh after logging in. I'm also not sure how to obtain the rights for the sudo command, i.e. how to handle the communication between the server and the client from within a bash script. Any suggestions are greatly appreciated.
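
    A minimal sketch of one common approach, assuming key-based SSH login and that the remote user is granted passwordless sudo for the shutdown command only (the hostname and user below are placeholders, not from the question):

        #!/bin/bash
        # On the remote Ubuntu machine, allow shutdown without a password by
        # adding a line like this via "sudo visudo":
        #   someuser ALL=(root) NOPASSWD: /sbin/shutdown
        # The local script can then simply run:
        ssh someuser@remote-host 'sudo shutdown -h now'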

    Read the article

  • Windows CMD, show the current folder name at prompt dynamically like Bash

    - by guneysus
    I am trying to modify my CMD prompt so it shows only the current directory name, updated dynamically when I change folders, like:

        Desktop $

    It is not required to be purely batch code; it may depend on external commands, Cygwin bash, etc. I tried:

        @echo off
        set a=bash -c "pwd | sed 's,^\(.*/\)\?\([^/]*\),\2,'"
        %a%

    cmd outputs:

        _test-et
        Microsoft Windows [Version 6.3.9600]
        (c) 2013 Microsoft Corporation. Tüm haklari saklidir.

        >>

    But

        >> prompt %a%

    gives:

        bash -c "pwd | sed 's,^\(.*/\)\?\([^/]*\),\2,'"
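
    A minimal sketch of a sidestep rather than a CMD solution: since the question allows Cygwin bash, bash's own prompt expansion already does this natively, so working in a Cygwin bash session gives the desired behaviour with one line in ~/.bashrc (the trailing "$" matches the example prompt above):

        # \W expands to the basename of the current working directory,
        # producing a prompt like "Desktop $ " that updates on every cd.
        PS1='\W \$ '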

    Read the article

  • grep is inconsistently defaulting to grep -P?

    - by Sammitch
    I have a script that does some housekeeping. It works perfectly well when invoked from an interactive shell, but did nothing when invoked by cron. To troubleshoot this I started a shell with a 'blank' environment with the command:

        env -i /bin/bash --noprofile --norc

    Using this blank environment I've dug into my script and found that the following grep will not match any files:

        grep -il "^ws_status\s*=\s*[\"']remove[\"']$"

    However, when run from an interactive shell the command returns the filenames of the matching files. For reference, the expression is matching lines like:

        WS_STATUS = "remove"

    Through trial and error I discovered that after adding -P (Perl regex) to the options, the command started working normally in the 'blank' shell. However, I have no idea why my login shell appears to default to grep -P. There is only one grep binary, /bin/grep; there are no aliases defined for grep=pgrep or grep="grep -P"; and there is no GREP_OPTIONS environment variable defined. What's the deal here? Note: the OS is RHEL 5.10, Bash is v3.2.25, grep is v2.5.1.
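
    A hedged side note (not from the original post): \s is a Perl-style class, and how it is treated outside -P varies across grep builds of that era, so a script that has to run under cron and other odd environments is safer with the POSIX character class, which grep 2.5.1 understands without -P. A sketch, with a placeholder file glob:

        grep -il "^ws_status[[:space:]]*=[[:space:]]*[\"']remove[\"']$" *.conf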

    Read the article

  • MySQL equivalent to .pgpass, or automatic authentication in a cron job for MySQL

    - by Ibrahim
    I'm writing a bash script to back up my databases. Most are PostgreSQL, and in Postgres there's a way to avoid having to authenticate: create a ~/.pgpass file which contains the Postgres password. I put this in root's home directory and made it chmod 0600, so that root can dump the Postgres databases without having to authenticate. Now I want to do something similar for MySQL, although I only have one MySQL database. How can I do this? I don't want to specify the password on the command line for mysqldump, because this is part of a script that might be somewhat visible to other users. Is there a better way (i.e. built into MySQL) than making a file that only root can read, reading it to get the MySQL password, and then using that in the bash script as a variable?
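
    For the record, MySQL has a close analogue: an option file that the client tools read automatically. A minimal sketch, assuming the dump runs as root and using a placeholder database name:

        # /root/.my.cnf, chmod 0600 and owned by root, is read automatically by
        # mysqldump (a [client] section would cover the mysql client as well):
        #
        #   [mysqldump]
        #   user=root
        #   password=YOUR_PASSWORD_HERE
        #
        # The backup script then needs no password on the command line:
        mysqldump somedatabase > /backup/somedatabase.sql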

    Read the article

  • Different color prompts for different machines when using terminal/ssh?

    - by bcrawl
    I have 5 machines I constantly ssh into to do work. It's getting increasingly frustrating when I issue the wrong commands on the wrong boxes; luckily I haven't done anything bad yet. I wanted to know if there is any hack I can hard-code that will display my prompt in different colors based on the machine I have ssh'd into, such as blue for desktop1, purple for laptop, red for server, etc. Is this possible? Currently I am using this command, taken from http://www.cyberciti.biz/faq/bash-shell-change-the-color-of-my-shell-prompt-under-linux-or-unix/:

        export PS1="\e[0;31m[\u@\h \W]\$ \e[m "

    but it obviously doesn't work across ssh. Also, if you have any other cool bash tips for easing my eyes, that would be wonderful. I got this tip, which colors the man pages: http://linuxtidbits.wordpress.com/2009/03/23/less-colors-for-man-pages/
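
    A minimal sketch of one way to do it (hostnames and colors are placeholders): keep a small case statement in each machine's ~/.bashrc, or in a single .bashrc copied to every box, so the color is chosen from the hostname at login:

        # Pick a prompt color per host; fall back to green on unknown machines.
        case "$(hostname -s)" in
          desktop1) COLOR='\e[0;34m' ;;   # blue
          laptop)   COLOR='\e[0;35m' ;;   # purple
          server*)  COLOR='\e[0;31m' ;;   # red
          *)        COLOR='\e[0;32m' ;;   # green
        esac
        export PS1="${COLOR}[\u@\h \W]\\$ \e[m "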

    Read the article

  • What are the common Control-key combinations in a terminal setting?

    - by Hamish Downer
    I would like a good guide to the common Control-key combinations used in bash (and similar) shells, and the combinations used by common programs run from those shells. My particular motivation is to run GNU screen on one computer, ssh to a second computer, and use screen and irssi on that computer. So I need to use something other than Ctrl-A to control one of the screen sessions, and therefore I need to know which Control-key combinations are safe to use. But I imagine this list would be useful for others who want to bind custom actions to Control-key combinations. I reckon we'd be best to group the Control-key combinations by application (e.g. bash itself, screen, vim, emacs) to make it easy to spot the applications you use or can ignore. So please, one application per answer - hope that works.
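
    As an aside on the stated motivation, the inner/outer screen clash is usually solved by giving one of the sessions a different escape key rather than by hunting for free Ctrl combinations; a minimal sketch (Ctrl-B is an arbitrary choice):

        # In ~/.screenrc on the remote machine: use Ctrl-B as the escape key,
        # so Ctrl-A still controls the outer, local screen.
        escape ^Bb

        # Or set it for a single session from the command line:
        screen -e ^Bb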

    Read the article

  • How does Mac's command line compare to Linux?

    - by Nathan Long
    I love Ubuntu Linux - especially the command line. But I have to admit that, at least for now, Windows is more user-friendly - there's more software for it, more drivers, and more stuff just works. Knowing that the Mac is built on Unix makes me wonder if it's the sweet spot between them. But I wonder: how similar is the Mac command line to Linux's bash? Could I pick right up with vim, bash scripting, git, etc.? Would common commands like changing directories be different? Does anybody know an online "compare and contrast" resource?

    Read the article

  • Next generation of command shells?

    - by ignatius
    I am curious whether there are any projects working on a replacement for the current Unix shells (like bash, ash, rsh ...), or at least adding some new ideas or paradigms in this area. I searched but found very little information; this project, http://en.wikipedia.org/wiki/Friendly_interactive_shell, seems interesting but not so different from today's solutions. What do you think? Do you imagine a Linux distribution in 2020 that still ships bash? What could an evolution of these programs look like? Br. To be clearer, by new ideas I was thinking of something like:

        Ctrl-Z functionality
        Collaboration features (like remote desktop), so you can invite someone to join and participate in your shell session
        The possibility to see the result of a command before really applying it to your system (this is closely related to the 1st point, Ctrl-Z)
        etc...

    Read the article

  • Running ssh-agent from a shell script

    - by Dan
    I'm trying to create a shell script that, among other things, starts ssh-agent and adds a private key to the agent. Example:

        #!/bin/bash
        # ...
        ssh-agent $SHELL
        ssh-add /path/to/key
        # ...

    The problem is that ssh-agent apparently kicks off another instance of $SHELL (in my case, bash), so from the script's perspective everything has been executed, and ssh-add and anything below it are never run. How can I run ssh-agent from my shell script and keep it moving on down the list of commands?
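
    A common pattern, sketched here as a hedged suggestion rather than the thread's accepted answer: ssh-agent -s prints shell commands that export SSH_AUTH_SOCK and SSH_AGENT_PID, so eval-ing its output keeps the agent available to the current script instead of spawning a new shell:

        #!/bin/bash
        # Start the agent and import its environment into this shell, then add
        # the key and carry on with the rest of the script.
        eval "$(ssh-agent -s)"
        ssh-add /path/to/key
        # ... remaining commands ...
        # Optionally stop the agent when done:
        ssh-agent -k > /dev/null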

    Read the article

  • Use inotifywait and lftp to synchronize servers

    - by KBoek
    I have two servers:

        Server A (CentOS), where people can upload files (upload root is /files)
        Server B (Windows 2008), with FileZilla FTP Server (FTP root is C:\content)

    I want that whenever a file is uploaded to Server A, into any subfolder under /files, the file is automatically copied to the exact same subfolder on Server B. Thus, if a user uploads "flowers.jpg" to /files/photos/12345/, then the file must be copied over FTP to C:\content\photos\12345. So far I have this bash script; it does copy the files to Server B, but all files are placed in C:\content and not in the corresponding subfolders. Who can help me find the correct syntax?

        #!/bin/bash
        cd /files
        inotifywait -q -r -m -e close_write,moved_to . --format %w%f | while read FILE; do
            lftp -e "put $FILE; exit" -u user,password -p 2121 ftp.server-a.com
        done
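
    A hedged, untested sketch of one way to preserve the subfolder: lftp's mput has a -d option that recreates the local relative directory path on the remote side before uploading, and because of the cd the paths coming out of inotifywait are already relative to /files. The host, credentials, and port are kept exactly as in the question.

        #!/bin/bash
        cd /files
        inotifywait -q -r -m -e close_write,moved_to . --format %w%f | while read FILE; do
            # mput -d recreates the relative path (e.g. ./photos/12345/) under
            # the FTP root instead of dropping the file into C:\content directly.
            lftp -e "mput -d \"$FILE\"; exit" -u user,password -p 2121 ftp.server-a.com
        done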

    Read the article

  • Linux terminal - input not echoed, but commands still execute?

    - by Torxed
    How do I restart a shell session from within SSH when it looks something like this:

        anton@ubuntu:~$ c: command not found
        anton@ubuntu:~$ lib
        anton@ubuntu:~$ this is working, but its messed up
        anton@ubuntu:~$

    I can execute commands, but as I type them nothing shows on the console; as soon as I press Enter the command executes and the output appears (without line endings, as shown above).

        exec bash
        bash --login
        clear

    None of these really work; restarting the SSH session, however, does. A temporary solution is to start a screen session, and every time the interface freezes simply press Ctrl+a c to start a new window and close the old one.
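
    A hedged note, not taken from the original answers: these symptoms (no echo while typing, output missing line endings) are the classic result of the terminal settings being clobbered, for example after cat-ing a binary file, and they can usually be repaired in place instead of reconnecting:

        # Restore sane terminal settings (echo, newline handling) in the
        # current session:
        stty sane
        # Or perform a fuller terminal reset, which also reinitializes the
        # terminal emulator state:
        reset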

    Read the article

  • Redirect output of Python program to /dev/null

    - by STM
    I have a Python executable, written and compiled by somebody else, that I simply need to run once halfway down my own bash script. The program uses a text-based UI and therefore waits for input before proceeding, but the key operations it performs when starting are required in my bash script. A messy (and strange) procedure, I know, but unfortunately I haven't got any other options. I've worked around exiting it by forcefully closing the program with a kill signal, but the program's TUI insists on writing output to wherever it's run. I've tried redirecting both stdout and stderr to /dev/null and running the program in the background by suffixing an ampersand, but simply can't get it to play ball. I believe the cause is that the program spawns other processes, and the output redirection of the parent process doesn't affect them. Is there any trick I can use to redirect all output from the child processes too?
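
    A hedged sketch of two things worth trying (the program name is a placeholder): plain redirection is inherited by children that merely reuse stdout/stderr, but if the TUI opens /dev/tty directly no redirection will catch it, and the second form, which hands it a throwaway pseudo-terminal, is the one that tends to help.

        # 1. Redirect the whole pipeline via a subshell so that children which
        #    inherit stdout/stderr are silenced as well:
        ( ./tui-program & ) > /dev/null 2>&1

        # 2. If the program writes to /dev/tty directly, run it inside a dummy
        #    pseudo-terminal and discard the transcript:
        script -q -c "./tui-program" /dev/null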

    Read the article

  • Using ctrl-arrow keys with PuTTY and screen

    - by kbosak
    I searched and couldn't find a solution for this anywhere. I'm using PuTTY from Windows to connect to various servers where I run bash and screen. Bash works fine with ctrl-arrow keys to jump word-to-word on the command line, but within screen it's not working. Outside of screen, ctrl-left sends "^[OC" and ctrl-right sends "^[OD". Within screen I instead get "^[[C" and "^[[D", which appear to be the codes for the plain left/right arrow keys. Is there any way to get screen to recognize ctrl-arrow keys when using PuTTY? (FYI, I don't remember having this problem when using gnome-terminal on Linux instead of PuTTY.) UPDATE: It appears PuTTY is the problem, as it is not sending the escape codes that are necessary for this to work. I'm giving up for now and using Cygwin+mintty.
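
    A hedged aside (not from the original exchange): when the terminal does emit distinct sequences for ctrl-arrow, readline can be taught to treat them as word movement via ~/.inputrc. The bindings below cover the xterm-style modifier sequences; per the update above, PuTTY apparently does not send these, so this sketch only helps with terminals that do.

        # ~/.inputrc: map xterm-style ctrl-arrow sequences to word movement.
        "\e[1;5C": forward-word
        "\e[1;5D": backward-word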

    Read the article

  • Perform shell operation through secure shell

    - by Ben
    Is it possible to perform a shell operation from a bash script through a secure shell? Here is an example of why you might want to do this. Let's say you have a simple Unix operating system that you only need to build and run on, but you want to do all of the development on another machine. I want to write a bash script with the following functionality (sketched in the example after this list):

        scp the file to a location on the other machine
        ssh to the other machine
        cd into the correct directory
        make
        run the program
        scp the results to a file on the original computer
        exit ssh

    Is this remotely possible? (Pardon the pun :p)
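
    A minimal sketch of that workflow, assuming key-based login; the host, user, paths, and program name are placeholders:

        #!/bin/bash
        # Copy the sources over, build and run them remotely, then fetch the results.
        scp project.tar.gz builduser@buildhost:/home/builduser/work/

        # Run the remote build and execution in one ssh invocation.
        ssh builduser@buildhost 'cd /home/builduser/work && tar xzf project.tar.gz && make && ./program > results.txt'

        scp builduser@buildhost:/home/builduser/work/results.txt ./results.txt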

    Read the article

  • What happens to running processes when I lose a remote connection to a *nix box?

    - by David Marble
    I occasionally lose my remote SSH connection to my VPS. I use screen for long-running processes, but am wondering what happens to the processes I had running aside from those run within a screen session if I lose the connection to the box. When I re-establish a connection to the box, what happened to the bash and sshd processes that were running when I lost the connection? Today I lost connection repeatedly and noticed many more bash and sshd processes than usual. If there are processes hanging around, do I need to kill them? How could I determine which processes were abandoned from my previous session? Thanks for any replies!
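
    A hedged way to spot leftovers from dead sessions (a sketch, not from the original answers): shells whose sshd parent has gone away are re-parented to init, so they show a parent PID of 1 while still holding their old pseudo-terminal, which makes stale bash processes easy to pick out of a listing of your own processes:

        # List this user's processes with parent PID, terminal, and age;
        # abandoned shells typically show PPID 1 on a pts/N no login is using.
        ps -o pid,ppid,tty,stat,etime,cmd -u "$USER"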

    Read the article

  • Netcat file transfer problem

    - by thepurplepixel
    I have two custom scripts I just wrote to facilitate transferring files between my VPS and my home server. They are both written in bash (short & sweet).

    To send:

        #!/bin/bash
        SENDFILE=$1
        PORT=$2
        HOST='<my house>'
        HOSTIP=`host $HOST | grep "has address" | cut --delimiter=" " -f 4`
        echo Transferring file \"$SENDFILE\" to $HOST \($HOSTIP\).
        tar -c "$SENDFILE" | pv -c -N tar -i 0.5 | lzma -z -c -6 | pv -c -N lzma -i 0.5 | nc -q 1 $HOSTIP $PORT
        echo Done.

    To receive:

        #!/bin/bash
        SERVER='<myserver>'
        SERVERIP=`host $SERVER | grep "has address" | cut --delimiter=" " -f 4`
        PORT=$1
        echo Receiving file from $SERVER \($SERVERIP\) on port $PORT.
        nc -l $PORT | pv -c -N netcat -i 0.5 | lzma -d -c | pv -c -N lzma -i 0.5 | tar -xf -
        echo Done.

    The problem is that, for a very quick second, I see something flash along the lines of "Connection Refused" (before pv overwrites it), and no file is ever transferred. The port is forwarded through my router, and nmap confirms it:

        ~$ sudo nmap -sU -PN -p55515 -v <my house>
        Starting Nmap 5.00 ( http://nmap.org ) at 2010-04-21 18:10 EDT
        NSE: Loaded 0 scripts for scanning.
        Initiating Parallel DNS resolution of 1 host. at 18:10
        Completed Parallel DNS resolution of 1 host. at 18:10, 0.00s elapsed
        Initiating UDP Scan at 18:10
        Scanning 74.13.25.94 [1 port]
        Completed UDP Scan at 18:10, 2.02s elapsed (1 total ports)
        Host 74.13.25.94 is up.
        Interesting ports on 74.13.25.94:
        PORT      STATE         SERVICE
        55515/udp open|filtered unknown
        Read data files from: /usr/share/nmap
        Nmap done: 1 IP address (1 host up) scanned in 2.08 seconds
        Raw packets sent: 2 (56B) | Rcvd: 5 (260B)

    Also, running netcat normally doesn't work either:

        squircle@summit:~$ netcat <my house> 55515
        <my house> [<my IP>] 55515 (?) : Connection refused

    Both boxes are Ubuntu Karmic (9.10). The receiver has no firewall, and outbound traffic on that port is allowed on the sender. I have no idea what to troubleshoot next. Any ideas? P.S.: Feel free to move this to SO/SF if you feel it would fit better there.
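
    One hedged observation, not necessarily the original answer: the nmap run above is a UDP scan (-sU), while nc without -u speaks TCP, so "open|filtered" on 55515/udp says nothing about the TCP port the scripts actually use. A quick way to test the same path the scripts take, with the receiver listening first:

        # On the receiving box (start this first):
        nc -l 55515 > /dev/null

        # On the sending box; -z only checks that a TCP connection can be made:
        nc -vz <my house> 55515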

    Read the article

  • Run a script after killing lxsession (xorg)

    - by user284194
    I am trying to run a program automatically from a bash script after killing the LXDE session. My script consists of:

        #!/bin/sh
        pkill lxsession; sh /home/pi/RetroPie/EmulationStation/emulationstation

    My aim is to log out of the LXDE session and run EmulationStation on my Raspberry Pi from a bash script. I'm using pkill lxsession to bypass lxsession's logout confirmation dialog. As it stands, this script just gets me to the command line from a working LXDE desktop. Thanks for reading.
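
    A hedged guess at the cause, with an untested sketch: the script runs as a child of the LXDE session, so killing lxsession can take down the terminal running the script before the second command ever starts. Launching EmulationStation detached from the dying session first, then ending the session, avoids that; the path is the one from the question.

        #!/bin/sh
        # Start EmulationStation in its own session, detached from the terminal
        # that is about to disappear, then end the LXDE session.
        setsid /home/pi/RetroPie/EmulationStation/emulationstation > /dev/null 2>&1 &
        pkill lxsession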

    Read the article

  • Rsync a plugin to many local WordPress installs via script or CLI

    - by Nick Abbey
    I am maintaining a large number of WordPress installs on a production server, and we are looking to deploy InfiniteWP for managing these installs. I am looking for a way to script the distribution of the plugin folder to all of these installs. On server wp-prod, all sites are stored in /srv/<sitename>/site/. The plugin needs to be copied from ~/iws-plugin to /srv/<sitename>/site/wp-content/plugins/. Here's some pseudo code to explain what I need to do:

        array dirs = <all folders in /srv>
        for each d in dirs
            if exists "/srv/d/site/wp-content/plugins"
                rsync -avzh --log-file=~/d.log ~/plugin_base_folder /srv/d/site/wp-content/plugins/
            else
                touch d.log
                echo 'plugin folder for "d" not found' >> ~/d.log
            end
        end

    I just don't know how to make it happen from the CLI or via bash. I can (and will) tinker with a bash or ruby script on my test server, but I'm thinking the command-line-fu here on SF is strong enough to handle this issue much more quickly than I can hack together a solution. Thanks!
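
    A minimal bash rendering of the pseudocode above (an untested sketch; it assumes the plugin lives in ~/iws-plugin as stated and that per-site logs go in the home directory):

        #!/bin/bash
        for d in /srv/*/; do
            name=$(basename "$d")
            log=~/"$name.log"
            plugins="$d/site/wp-content/plugins"
            if [ -d "$plugins" ]; then
                rsync -avzh --log-file="$log" ~/iws-plugin "$plugins/"
            else
                echo "plugin folder for \"$name\" not found" >> "$log"
            fi
        done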

    Read the article

  • How to close all background processes in Unix?

    - by Gabi Purcaru
    I have something like:

        cd project && python manage.py runserver &
        cd utilities && ./coffee_auto_compiler.py

    and I want both of them to close on Ctrl-C (or some other command). How can I accomplish that? EDIT: I tried using jobs -x kill and kill `jobs -p`, but it doesn't seem to kill what I need. Here is what I mean:

        moon  8119  0.0  0.0   7556  3008 pts/0  S  13:17  0:00 /bin/bash
        moon  8120  6.8  0.4  24568 18928 pts/0  S  13:17  0:00 python manage.py runserver

    jobs -p gives me just process 8119, but I also need to close 8120, since that is what the first command started. If it helps, the commands are actually in a Makefile, and I want it to run two daemons at the same time (and somehow close them at the same time). And yes, I'm using Ubuntu, with bash.
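
    One hedged pattern for a small wrapper script (a sketch; the commands are the ones from the question): record the background job's PID and install a trap so an interrupt tears it down along with the foreground task. The exec makes $! the PID of the python process itself rather than of a wrapper subshell.

        #!/bin/bash
        # Start the Django dev server in the background.
        (cd project && exec python manage.py runserver) &
        SERVER_PID=$!

        # Kill the server whenever this script is interrupted or exits.
        trap 'kill "$SERVER_PID" 2>/dev/null' INT TERM EXIT

        cd utilities && ./coffee_auto_compiler.py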

    Read the article

  • exported variable not persisted after script execution

    - by Daniele
    I'm facing a weird issue. I have a VM with Solaris 11 and am trying to write some bash scripts. If, on the shell, I type:

        export TEST=aaa

    and subsequently run set, I correctly see a new environment variable named TEST whose value is aaa. If, however, I do basically the same thing in a script, then when the script terminates I do not see the variable set. To give a concrete example, if in a file test.sh I have:

        #!/usr/bin/bash
        echo 1: $TEST   # variable not defined yet, expect to print only 1:
        echo 2: $USER
        TEST=sss
        echo 3: $TEST
        export TEST
        echo 4: $TEST

    it prints:

        1:
        2: daniele
        3: sss
        4: sss

    and after its execution, TEST is not set in the shell. Am I missing something? I tried both export TEST=sss and the separate variable assignment/export, with no difference.
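
    For context, a short sketch of the standard shell behaviour at work here (not specific to Solaris): a script runs in a child process, and exported variables flow only from parent to child, never back up to the caller. To have the assignments land in the invoking shell, the script must be sourced rather than executed.

        # Running the script in a child process: TEST is still unset afterwards.
        ./test.sh
        echo "after ./test.sh: TEST='$TEST'"

        # Sourcing runs the same commands in the current shell, so TEST survives.
        . ./test.sh          # or: source ./test.sh
        echo "after sourcing: TEST='$TEST'"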

    Read the article
