Search Results

Search found 6758 results on 271 pages for 'shell exec'.

Page 26 of 271

  • Read a variable in bash with a default value

    - by rmarimon
    I need to read a value from the terminal in a bash script, and I would like to provide a default value that the user can edit. For example: "Please enter your name: Ricardo^", where the prompt is "Please enter your name: ", the default value is "Ricardo", and the cursor (^) sits after the default value. Is there a way to do this in a bash script?
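
    A minimal sketch of one way to do this, assuming bash 4 or newer (read -i needs it); the
    variable name "name" is only an example:

        # -p prints the prompt, -e enables line editing, -i pre-fills the reply
        read -e -p "Please enter your name: " -i "Ricardo" name
        echo "Hello, $name"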

    Read the article

  • AWK scripting: how to remove the field separator using awk

    - by anil-1985
    I need the following output:

        ONGC044 ONGC043 ONGC042 ONGC041 ONGC046 ONGC047

    from this input:

        Medium Label                Medium ID                      Free Blocks
        ===============================================================================
        [ONGC044] ECCPRDDB_FS_43    ac100076:4aed9b39:44f0:0001    195311616
        [ONGC043] ECCPRDDB_FS_42    ac100076:4aed9b1d:44e8:0001    195311616
        [ONGC042] ECCPRDDB_FS_41    ac100076:4aed9af4:4469:0001    195311616
        [ONGC041] ECCPRDDB_FS_40    ac100076:4aed9ad3:445e:0001    195311616
        [ONGC046] ECCPRDDB_FS_44    ac100076:4aedd04a:68c6:0001    195311616
        [ONGC047] ECCPRDDB_FS_45    ac100076:4aedd4a0:6bf5:0001    195311616
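
    A minimal sketch, assuming the listing above is in a file called media.txt (a hypothetical
    name): treat '[' and ']' as field separators and print what sits between them.

        awk -F'[][]' '/^\[ONGC/ { print $2 }' media.txt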

    Read the article

  • extract payload from tcpflow output

    - by Felipe Alvarez
    Tcpflow outputs a bunch of files, many of which are HTTP responses from a web server. Inside, they contain the HTTP headers, including Content-Type: and other important ones. I'm trying to write a script that can extract just the payload data (i.e. image/jpeg, text/html, et al.) and save it to a file (optional: with an appropriate name and file extension). The EOL characters are \r\n (CRLF), which makes this awkward to handle on GNU distros, in my experience. I've been trying something along the lines of:

        sed /HTTP/,/^$/d

    to delete all text from the beginning of the HTTP headers (inclusive) to the trailing \r\n\r\n (inclusive), but I have had no luck. I'm looking for help from anyone with good experience in sed and/or awk. I have zero experience with Perl, so I'd prefer to use common GNU command-line utilities for this. Find a sample tcpflow output file here. Thanks, Felipe
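
    A minimal sketch, assuming GNU sed (which understands \r in a regular expression) and a
    tcpflow output file named flow.dat (a hypothetical name): delete everything from the first
    line through the blank CRLF line that ends the HTTP headers, leaving only the body.

        sed '1,/^\r$/d' flow.dat > payload.bin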

    Read the article

  • Check for messages apache Qpid

    - by c0mrade
    Is it possible to check for messages in a Qpid queue from a unix/windows console? Here is how I check via the GUI (http://i47.tinypic.com/pbu5d.gif). I can see all the info in the Qpid JMX Management Console; is there something close to this that I can use in a console?
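
    A hedged sketch, assuming the qpid-tools package that ships with the C++ broker is
    installed on the box; the exact flags and broker-address syntax vary between Qpid
    releases, so treat this as a starting point rather than a recipe.

        # list queues on the local broker with their current message depth
        qpid-stat -q
        # the interactive management client is another option
        qpid-tool localhost:5672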

    Read the article

  • Trouble with piping through sed

    - by Joel
    I am having trouble piping through sed. Once I have piped output to sed, I cannot pipe the output of sed elsewhere.

        wget -r -nv http://127.0.0.1:3000/test.html

    outputs:

        2010-03-12 04:41:48 URL:http://127.0.0.1:3000/test.html [99/99] -> "127.0.0.1:3000/test.html" [1]
        2010-03-12 04:41:48 URL:http://127.0.0.1:3000/robots.txt [83/83] -> "127.0.0.1:3000/robots.txt" [1]
        2010-03-12 04:41:48 URL:http://127.0.0.1:3000/shop [22818/22818] -> "127.0.0.1:3000/shop.29" [1]

    I pipe the output through sed to get a clean list of URLs:

        wget -r -nv http://127.0.0.1:3000/test.html 2>&1 | grep --line-buffered -v ERROR | sed 's/^.*URL:\([^ ]*\).*/\1/g'

    which outputs:

        http://127.0.0.1:3000/test.html
        http://127.0.0.1:3000/robots.txt
        http://127.0.0.1:3000/shop

    I would like to then dump the output to a file, so I do this:

        wget -r -nv http://127.0.0.1:3000/test.html 2>&1 | grep --line-buffered -v ERROR | sed 's/^.*URL:\([^ ]*\).*/\1/g' > /tmp/DUMP_FILE

    I interrupt the process after a few seconds and check the file, yet it is empty. Interestingly, the following yields no output either (same as above, but piping sed's output through cat):

        wget -r -nv http://127.0.0.1:3000/test.html 2>&1 | grep --line-buffered -v ERROR | sed 's/^.*URL:\([^ ]*\).*/\1/g' | cat

    Why can I not pipe the output of sed to another program like cat?
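
    A sketch of the usual workaround, assuming GNU sed: like most stdio programs, sed buffers
    its output when it is not writing to a terminal, so nothing reaches the next stage until
    the buffer fills or wget exits. Asking sed to flush after every line with -u (--unbuffered)
    lets the URLs stream through.

        wget -r -nv http://127.0.0.1:3000/test.html 2>&1 \
          | grep --line-buffered -v ERROR \
          | sed -u 's/^.*URL:\([^ ]*\).*/\1/g' > /tmp/DUMP_FILE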

    Read the article

  • input of while loop to come from output of `command`

    - by Felipe Alvarez
    I used to have this, but I don't want to write to the disk:

        pcap="somefile.pcap"
        tcpdump -n -r $pcap > all.txt
        while read line; do
            ARRAY[$c]="$line"
            c=$((c+1))
        done < all.txt

    The following is what I would prefer, but it fails to work:

        pcap="somefile.pcap"
        while read line; do
            ARRAY[$c]="$line"
            c=$((c+1))
        done < $( tcpdump -n -r "$pcap" )

    Google turns up too few results (it doesn't understand what I want to find). I'd like to keep it Bourne-compatible (/bin/sh), but it doesn't have to be.
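
    A minimal sketch, assuming bash rather than plain Bourne sh (process substitution is a
    bashism): the loop runs in the current shell, so ARRAY and c keep their values after the
    loop ends.

        c=0
        while read -r line; do
            ARRAY[$c]=$line
            c=$((c+1))
        done < <(tcpdump -n -r "$pcap")

    In plain /bin/sh you can pipe tcpdump into the loop instead, but the loop then runs in a
    subshell and the array is lost when it exits.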

    Read the article

  • Anyone have a database of file extensions & icons to go with the extensions?

    - by neddy
    OK, I'm developing some software which requires file icons to display lists of files on a computer. I don't want to use the system ExtractAssociatedIcon APIs; I'd rather load the icons for the file extensions out of a database (as some systems may not have certain file types associated, etc). Does anyone have a database of file extensions and icons to go with the extensions that I can use? Cheers in advance.

    Read the article

  • How can I get the associated ref path for a git SHA?

    - by andreb
    Hi, I want to be able to pass anything to a git command (maybe it's a SHA, maybe it's just something like "origin/master" or "devel/experimental", etc.) and have git tell me the ref path of the branch that the passed-in thing lives in, e.g.

        <git_command> 0dc27819b8e9  => output: refs/heads/master
        <git_command> xyz/test      => output: refs/remotes/xyz/master
        ...

    I've been looking at git show, git log and git rev-parse, and apart from --pretty=format:%d I couldn't find anything (the --pretty=format:%d output is quite strange, with lots of whitespace and empty lines, and sometimes more than one ref path bunched together on one line). There has to be a better way? Thanks for reading. Andre
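
    A hedged sketch; which command applies depends on what gets passed in, and the ref names
    below are the examples from the question.

        # a ref name can be expanded to its full ref path directly
        git rev-parse --symbolic-full-name origin/master    # -> refs/remotes/origin/master
        # for a raw SHA, list the refs whose history contains it
        # (--contains on for-each-ref needs a reasonably recent git, an assumption here)
        git for-each-ref --contains 0dc27819b8e9 --format='%(refname)'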

    Read the article

  • How to add a function to this program and call that function from the command line

    - by user336291
        #include "smallsh.h"             /* include file for example */

        /* program buffers and work pointers */
        static char inpbuf[MAXBUF], tokbuf[2*MAXBUF],
            *ptr = inpbuf, *tok = tokbuf;

        userin(p)                        /* print prompt and read a line */
        char *p;
        {
            int c, count;

            /* initialization for later routines */
            ptr = inpbuf;
            tok = tokbuf;

            /* display prompt */
            printf("%s ", p);
            for (count = 0;;) {
                if ((c = getchar()) == EOF)
                    return(EOF);
                if (count < MAXBUF)
                    inpbuf[count++] = c;
                if (c == '\n' && count < MAXBUF) {
                    inpbuf[count] = '\0';
                    return(count);
                }
                /* if line too long restart */
                if (c == '\n') {
                    printf("smallsh: input line too long\n");
                    count = 0;
                    printf("%s", p);
                }
            }
        }

        gettok(outptr)                   /* get token and place into tokbuf */
        char **outptr;
        {
            int type;

            *outptr = tok;
            /* strip white space */
            for (; *ptr == ' ' || *ptr == '\t'; ptr++)
                ;
            *tok++ = *ptr;
            switch (*ptr++) {
            case '\n': type = EOL; break;
            case '&':  type = AMPERSAND; break;
            case ';':  type = SEMICOLON; break;
            case '#':  type = POUND; break;
            default:
                type = ARG;
                while (inarg(*ptr))
                    *tok++ = *ptr++;
            }
            *tok++ = '\0';
            return(type);
        }

        static char special[] = {' ', '\t', '&', ';', '\n', '\0'};

        inarg(c)                         /* are we in an ordinary argument */
        char c;
        {
            char *wrk;

            for (wrk = special; *wrk != '\0'; wrk++)
                if (c == *wrk)
                    return(0);
            return(1);
        }

        #include "smallsh.h"

        procline()                       /* process input line */
        {
            char *arg[MAXARG+1];         /* pointer array for runcommand */
            int toktype;                 /* type of token in command */
            int narg;                    /* number of arguments so far */
            int type;                    /* FOREGROUND or BACKGROUND */

            for (narg = 0;;) {           /* loop FOREVER */
                /* take action according to token type */
                switch (toktype = gettok(&arg[narg])) {
                case ARG:
                    if (narg < MAXARG)
                        narg++;
                    break;
                case EOL:
                case SEMICOLON:
                case AMPERSAND:
                case POUND:
                    type = (toktype == AMPERSAND) ? BACKGROUND : FOREGROUND;
                    if (narg != 0) {
                        arg[narg] = NULL;
                        runcommand(arg, type);
                    }
                    if ((toktype == EOL) || (toktype == POUND))
                        return;
                    narg = 0;
                    break;
                }
            }
        }

        #include "smallsh.h"

        /* execute a command with optional wait */
        runcommand(cline, where)
        char **cline;
        int where;
        {
            int pid, exitstat, ret;

            if ((pid = fork()) < 0) {
                perror("smallsh");
                return(-1);
            }
            if (pid == 0) {              /* child */
                execvp(*cline, cline);
                perror(*cline);
                exit(127);
            }
            /* code for parent */
            /* if background process print pid and exit */
            if (where == BACKGROUND) {
                printf("[Process id %d]\n", pid);
                return(0);
            }
            /* wait until process pid exits */
            while ((ret = wait(&exitstat)) != pid && ret != -1)
                ;
            return(ret == -1 ? -1 : exitstat);
        }

        #include "smallsh.h"

        char *prompt = "Command>";       /* prompt */

        main()
        {
            while (userin(prompt) != EOF)
                procline();
        }

    Read the article

  • ksh: Iterate through a range

    - by sgreeve
    How can I iterate through a simple range of ints using a for loop in ksh? For example, my script currently does this:

        for i in 1 2 3 4 5 6 7
        do
            #stuff
        done

    but I'd like to extend the range way above 7. Is there a better syntax? Thanks!
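
    A minimal sketch, assuming ksh93 (the C-style arithmetic for loop is a ksh93 feature,
    not ksh88):

        for (( i = 1; i <= 100; i++ )); do
            # stuff
            echo "$i"
        done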

    Read the article

  • Iterating over each line of ls -l output

    - by Ivan
    I want to iterate over each line in the output of ls -l /some/dir/*. Right now I'm trying:

        for x in `ls -l $1`; do
            echo $x
        done

    however, this iterates over each element in the line separately, so I get:

        -r--r----- 1 ivanevf eng 1074 Apr 22 13:07 File1
        -r--r----- 1 ivanevf eng 1074 Apr 22 13:17 File2

    I want to iterate over each line as a whole, though. How do I do that? Thanks.
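
    A minimal sketch: pipe ls -l into a while/read loop so each iteration sees a whole line
    (IFS= and -r keep leading spaces and backslashes intact).

        ls -l "$1" | while IFS= read -r line; do
            echo "$line"
        done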

    Read the article

  • echo value inside a variable?

    - by Kimi
    With x=102 and y=x, when I echo $y it gives x:

        echo $y
        x

    and not 102, and when I echo $x it gives 102. Let's say I don't know what is inside y, and I want the value of x to be echoed by using y, something like this:

        a=`echo $(echo $y)`
        echo $a

    with the answer being 102.
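
    A minimal sketch, assuming bash: ${!y} is indirect expansion, i.e. it expands to the value
    of the variable whose name is stored in y. A portable eval fallback is shown as well.

        x=102
        y=x
        echo "${!y}"        # prints 102 (bash)
        eval "echo \$$y"    # prints 102 (plain POSIX sh)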

    Read the article

  • GUNZIP / Extract file "portion by portion"

    - by Dave
    Hi. I'm on a shared server with restricted disk space, and I've got a gz file that expands into a HUGE file, bigger than the space I have. How can I extract it "portion" by "portion" (let's say 10 MB at a time), and process each portion, without extracting the whole thing even temporarily? No, this is just ONE super huge compressed file, not a set of files, please...
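
    A minimal sketch: gzip decompresses as a stream, so the expanded data never has to hit the
    disk; process_data below is a hypothetical stand-in for whatever does the per-portion work.

        # stream the whole thing through the processing step
        zcat huge.gz | process_data
        # or, with GNU coreutils 8.13 or newer, hand it over in 10 MB pieces
        zcat huge.gz | split -b 10M --filter='process_data' - piece_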

    Read the article

  • bash "map" equivalent: run command on each file

    - by Claudiu
    I often have a command that processes one file, and I want to run it on every file in a directory. Is there any built-in way to do this? For example, say I have a program data which outputs an important number about a file:

        ./data foo
        137
        ./data bar
        42

    I want to run it on every file in the directory, in some manner like this:

        map data `ls *`
        ls * | map data

    to yield output like this:

        foo: 137
        bar: 42
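
    A minimal sketch of the usual idioms; data here is the example program from the question.

        # a plain for loop over the glob is the closest thing to "map" in the shell
        for f in *; do
            printf '%s: ' "$f"
            ./data "$f"
        done
        # xargs runs the command once per file name read from stdin
        ls | xargs -n 1 ./data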

    Read the article

  • script to delete all lines starting from a word, except the last line

    - by akvikram
    How do I delete all lines below a word, except the last line, in a file? Suppose I have a file which contains:

        | 02/04/2010 07:24:20 | 20-24 | 26 | 13 | 2.60 |
        | 02/04/2010 07:24:25 | 25-29 | 6 | 3 | 0.60 |
        +---------------------+-------+------------+----------+-------------+
        02-04-2010-07:24 --- ER GW 03
        +---------------------+-------+------------+----------+-------------+
        | date | sec | BOTH_MO_MT | MO_or_MT | TPS_PER_SEC |
        +---------------------+-------+------------+----------+-------------+
        | 02/04/2010 07:00:00 | 00-04 | 28 | 14 | 2.80 |
        | 02/04/2010 07:00:05 | 05-09 | 27 | 14 | 2.70 |
        ...
        ...
        ...
        ...
        END OF TPS PER 5 REPORT

    I need to delete all contents from "02-04-2010-07:24 --- ER GW 03" onward, except "END OF TPS PER 5 REPORT", and save the file. This has to be done for around 700 files. All files have the same format, with a date-month-day filename.
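
    A hedged sketch, assuming GNU sed (for -i and the brace syntax) and that every file uses
    the same marker text shown above; the *.txt glob is a placeholder for however the 700
    files are actually named. The range runs from the marker line to the END line, and
    everything in it except the END line is deleted in place.

        for f in *.txt; do
            sed -i '/--- ER GW 03/,/END OF TPS PER 5 REPORT/{/END OF TPS PER 5 REPORT/!d}' "$f"
        done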

    Read the article

  • What is the difference between "su --command" and "su --session-command"?

    - by oliver
    Running # su - oliver --command bash gives a shell but also prints the warning bash: no job control in this shell, and indeed Ctrl+Z and fg/bg don't work in that shell. Running # su - oliver --session-command bash gives a shell without printing the warning, and job control indeed works. The suggestion to use --session-command comes from Starting a shell from scripts using su results in "no job control in this shell" which states "[a security fix for su] changed the behavior of the -c option and disables job control inside the called shell". But I still don't quite understand this. When should one use --command and when should one use --session-command? Is --command (aka -c) more secure? Or should one always use --session-command, and --command is just left in for backwards compatibility? FWIW, I'm using RHEL 6.4.

    Read the article

  • Mac Terminal: changed my shell, now can't start it

    - by kch
    I installed bash 4.0 via MacPorts, then used sudo chsh -s /opt/local/bin/bash my_user to change my shell. Before that I tried just running plain chsh without sudo, but it wouldn't allow me to change my shell to that path. Now when I try to start Terminal I get a message that my shell has an illegal value, so Terminal won't start. I click Quit and, unsurprisingly but annoyingly, it quits immediately. How do I reset my shell so I can start Terminal again? How do I set my shell to the bash installed via MacPorts in a way that will work? Why does Terminal think my shell is illegal anyway? Is it siding with the neo-prohibitionists? Mac OS X 10.5.8. Everything super mega up-to-date.
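
    A hedged sketch of the usual fix, assuming you can reach a prompt some other way (ssh from
    another machine, or a different terminal app): chsh only accepts shells listed in
    /etc/shells, which is likely why the non-sudo attempt was refused and why Terminal now
    objects to the value.

        # put the login shell back to the stock bash first
        sudo chsh -s /bin/bash my_user
        # whitelist the MacPorts bash, then switch to it without sudo
        echo /opt/local/bin/bash | sudo tee -a /etc/shells
        chsh -s /opt/local/bin/bash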

    Read the article

  • How do I get a Mac ".command" file to automatically quit after running a shell script?

    - by LOlliffe
    In my shell script, my last lines are:

        ...
        echo "$l"
        done
        done
        exit

    I have the Terminal preference set to "When the shell exits: Close the window". In all other cases, when I type "exit" or "logout" in Terminal, the window closes. But for this ".command" file (I can double-click on my shell script file, and the script runs), instead of closing the window, even though the file's code says "exit", what shows on the screen is:

        ...
        $l
        done
        logout
        [Process completed]

    and the window remains open. Does anyone know how to get a shell script to run and then automatically close the Terminal window on completion? Thanks!
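
    A hedged sketch of a commonly suggested workaround: have the script ask Terminal to close
    its own window via AppleScript just before it exits. Whether the window actually closes
    also depends on the "When the shell exits" preference, so treat this as something to
    experiment with rather than a guaranteed fix.

        # last lines of the .command script
        osascript -e 'tell application "Terminal" to close front window' &
        exit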

    Read the article

  • The Case for Gnome Shell

    WorksWithU: "A couple weeks ago, I wrote some posts on GNOME Shell which included a number of criticisms of the desktop environment that will likely become Ubuntu's default at some point in the future. Jon McCann, lead designer for GNOME Shell, recently got in touch to offer his responses to the problems I found with the new interface"

    Read the article

  • How to use gestures on GNOME Shell?

    - by Mauricio Andrés
    I have installed Ubuntu 12.10 on my Acer Aspire V5 touch, and I want to know how to activate the gestures on my clickpad in GNOME Shell, because in Unity I can use them with no problems. So, is there a way to use the gestures on GNOME Shell? Gestures:

        Pinch to zoom
        Move windows with 3 fingers
        "Natural scroll"
        etc...

    And is it possible to fully use the touch screen? It fails when I click on it.

    Read the article
