Search Results

Search found 4967 results on 199 pages for 'bourne shell'.

Page 65/199 | < Previous Page | 61 62 63 64 65 66 67 68 69 70 71 72  | Next Page >

  • Write STDOUT & STDERR to a logfile, also write STDERR to screen

    - by Stefan Lasiewski
I would like to run several commands, and capture all output to a logfile. I also want to print any errors to the screen (or optionally mail the output to someone). Here's an example. The following command will run three commands, and will write all output (STDOUT and STDERR) into a single logfile. { command1 && command2 && command3 ; } > logfile.log 2>&1 Here is what I want to do with the output of these commands: STDERR and STDOUT for all commands go to a logfile, in case I need it later --- I usually won't look in here unless there are problems. Print STDERR to the screen (or optionally, pipe to /bin/mail), so that any error stands out and doesn't get ignored. It would be nice if the return codes were still usable, so that I could do some error handling. Maybe I want to send email if there was an error, like this: { command1 && command2 && command3 ; } > logfile.log 2>&1 || mailx -s "There was an error" [email protected] The problem I run into is that STDERR loses context during I/O redirection. A '2>&1' will convert STDERR into STDOUT, and therefore I cannot view errors if I do 2> error.log Here are a couple of juicier examples. Let's pretend that I am running some familiar build commands, but I don't want the entire build to stop just because of one error, so I use the '--keep-going' flag. { ./configure && make --keep-going && make install ; } > build.log 2>&1 Or, here's a simple (and perhaps sloppy) build and deploy script, which will keep going in the event of an error. { ./configure && make --keep-going && make install && rsync -av --keep-going /foo devhost:/foo ; } > build-and-deploy.log 2>&1 I think what I want involves some sort of Bash I/O redirection, but I can't figure this out.
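    One possible approach (a minimal sketch, assuming bash rather than plain sh, with the command and file names taken from the question) is to send STDOUT straight to the logfile and route STDERR through tee via process substitution, so errors land in both the logfile and on the screen while the exit status of the command group stays usable:

        # STDOUT -> logfile only; STDERR -> logfile and the screen
        { command1 && command2 && command3 ; } > logfile.log 2> >(tee -a logfile.log >&2)
        status=$?
        # $status is still the exit code of the command group, so error handling keeps working
        if [ "$status" -ne 0 ]; then
            echo "one of the commands failed with status $status" >&2
        fi

    Process substitution is a bash feature, and the interleaving of STDOUT and STDERR inside the logfile is not guaranteed to be perfectly ordered.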

    Read the article

  • Replace delimited block of text in file with the contents of another file

    - by rmarimon
    I need to write a simple script to replace a block of text in a configuration file with the contents of another file. Let's assume we have the following simplified files:

    server.xml

        <?xml version='1.0' encoding='UTF-8'?>
        <Server port="8005" shutdown="SHUTDOWN">
          <Service name="Catalina">
            <Connector port="80" protocol="HTTP/1.1"/>
            <Engine name="Catalina" defaultHost="localhost">
              <!-- BEGIN realm -->
              <sometags/>
              <sometags/>
              <!-- END realm -->
              <Host name="localhost" appBase="webapps"/>
            </Engine>
          </Service>
        </Server>

    realm.xml

        <Realm className="org.apache.catalina.realm.UserDatabaseRealm" resourceName="UserDatabase"/>

    I want to run a script and have realm.xml replace the contents between the <!-- BEGIN realm --> and <!-- END realm --> lines. If realm.xml changes, then whenever the script is run again it will replace the lines again with the new contents of realm.xml. This is intended to be run in /etc/init.d/tomcat on startup of the service on multiple installations on which the realm is going to be different. I'm not so sure how I can do this simply with awk or sed.
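    One possible sketch with awk, assuming the BEGIN/END marker lines each appear exactly once and should stay in the file so the replacement can be repeated on the next run:

        # copy everything outside the markers; right after the BEGIN marker, splice in realm.xml
        awk '
            /<!-- BEGIN realm -->/ { print; system("cat realm.xml"); skip=1; next }
            /<!-- END realm -->/   { skip=0 }
            !skip
        ' server.xml > server.xml.new && mv server.xml.new server.xml

    Because the markers are preserved, running the script again simply swaps in whatever realm.xml contains at that point.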

    Read the article

  • Dynamic if statement evaluation problem with string comparison

    - by Mani
    I tried the example given in http://forums.sun.com/thread.jspa?threadID=780576&tstart=67605 to create an if statement dynamically, but it is not working. Instead of using the "age" variable as an integer, I have used a string in the example below. I am getting "fail" as the answer instead of "success". Can anyone help me? /* To change this template, choose Tools | Templates and open the template in the editor. */ import java.lang.reflect.*; import bsh.Interpreter; public class Main { public static String d; public static void main(String args[]) { try { String age = "30"; String cond = "age==30"; Interpreter i = new Interpreter(); i.set("age", age); System.out.println(" sss" + i.get("age")); if((Boolean)i.eval(cond)) { System.out.println("success"); } else { System.out.println("fail"); } } catch (Throwable e) { System.err.println(e); } } } Thanks, Mani

    Read the article

  • UI Controls layer on top of operating system.

    - by Mason Blier
    I'm kind of curious about where, in the grand scheme of an operating system, writing a UI platform at the level of Win32 or the X Window System would fall. What layers below do they primarily make use of? Is it heavily based on direct communication with the graphics card driver (I can't imagine going through a rendering pipeline like OpenGL for this), or is there a graphical platform as part of the operating system which abstracts this away a little more? I'm also interested in the creation of shells and the like, and I'm particularly curious as to how people go about creating alternative shells for Windows: what do people look for when figuring out what methods to call or what to hook into, etc.? I guess I'm fairly lost on these concepts and finding it difficult to find documentation on them. I was initially excited to have taken Operating Systems in college, but it was all low-level resource management stuff. Thanks all, Mason

    Read the article

  • ffmpeg screen capture

    - by Mirai
    I wrote this script for some basic screen capture; it gets the window dimensions then uses the ffmpeg binary to record. I suspect there is a better way (maybe with the ffmpeg library), but scripting is what I know and ffmpeg generally works. Any software (other than recordmydesktop), or improvements to the script are welcome.

        info=`xwininfo -frame`
        H=`echo "$info" | grep Height | sed -E "s/^.*: ([[:digit:]]+)$/\1/"`
        W=`echo "$info" | grep Width | sed -E "s/^.*: ([[:digit:]]+)$/\1/"`
        offset=:0.0+`echo "$info" | grep Corners | sed -E "s/^.*:[[:space:]]+\+([[:digit:]]+\+[[:digit:]]+)[[:space:]]+.+/\1/" | tr + ,`
        /usr/local/bin/ffmpeg -f x11grab -s ${W}x${H} -r 45 -i $offset -sameq -f avi ~/videos/`date +%Y-%m-%d-%H%M%s`_vid &
        echo $! > /tmp/$(basename $0)-$USER
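    For comparison, a rough sketch of the same capture using $( ) command substitution and the newer x11grab option names (-video_size/-framerate instead of -s/-r; -sameq has been dropped from recent ffmpeg builds). The awk patterns and the pid-file path are assumptions, not tested output:

        #!/bin/sh
        # pick a window with the mouse, then record it with ffmpeg's x11grab device
        info=$(xwininfo -frame)
        W=$(echo "$info" | awk '/Width:/  {print $2}')
        H=$(echo "$info" | awk '/Height:/ {print $2}')
        X=$(echo "$info" | awk '/Absolute upper-left X:/ {print $4}')
        Y=$(echo "$info" | awk '/Absolute upper-left Y:/ {print $4}')

        ffmpeg -f x11grab -video_size "${W}x${H}" -framerate 30 \
               -i ":0.0+${X},${Y}" \
               ~/videos/"$(date +%Y-%m-%d-%H%M%S)".mkv &
        echo $! > /tmp/screencap.pid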

    Read the article

  • Filtering Filenames with bash

    - by Stefan Liebenberg
    I have a directory full of log files in the form ${name}.log.${year}${month}${day}, such that they look like this:

        logs/
            production.log.20100314
            production.log.20100321
            production.log.20100328
            production.log.20100403
            production.log.20100410
            ...
            production.log.20100314
            production.log.old

    I'd like to use a bash script to filter out all the logs older than x months and dump them into *.log.old:

        X=6 #months
        LIST=*.log.*;
        for file in LIST; do
            is_older = file_is_older_than_months( ${file}, ${X} );
            if is_older; then
                cat ${c} >> production.log.old;
                rm ${c};
            fi
        done;

    How can I get all the files older than x months? And... how can I avoid having the *.log.old file included in LIST? Thank you Stefan
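    Since the date is embedded in each filename, one hedged sketch is to compare that suffix against a cutoff computed with GNU date (the "N months ago" syntax is GNU-specific); the glob *.log.2* only matches the dated files, so production.log.old is never picked up:

        #!/bin/bash
        X=6                                         # months
        cutoff=$(date -d "$X months ago" +%Y%m%d)   # GNU date

        for file in *.log.2*; do                    # matches *.log.YYYYMMDD but not *.log.old
            stamp=${file##*.}                       # the YYYYMMDD suffix
            if [ "$stamp" -lt "$cutoff" ]; then
                cat "$file" >> production.log.old && rm "$file"
            fi
        done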

    Read the article

  • nested if: too many arguments?

    - by FLX
    For some reason this code creates problems:

        source="/foo/bar/"
        destination="/home/oni/"
        if [ -d $source ]; then
            echo "Source directory exists"
            if [ -d $destination ]; then
                echo "Destination directory exists"
                rsync -raz --delete --ignore-existing --ignore-times --size-only --stats --progress $source $destination
                chmod -R 0755 $destination
            else
                echo "Destination directory does not exists"
            fi
        else
            echo "Source directory does not exists"
        fi

    It errors out with:

        Source directory exists
        /usr/bin/copyfoo: line 7: [: too many arguments
        Destination directory does not exists

    I have used nested if statements in bash before without a problem; what simple mistake am I overlooking? Thanks!
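    The usual cause of "[: too many arguments" is an unquoted variable that expands to several words (or to nothing); a sketch of the same script with the expansions quoted:

        if [ -d "$source" ]; then
            echo "Source directory exists"
            if [ -d "$destination" ]; then
                echo "Destination directory exists"
                rsync -raz --delete --ignore-existing --ignore-times --size-only \
                      --stats --progress "$source" "$destination"
                chmod -R 0755 "$destination"
            else
                echo "Destination directory does not exist"
            fi
        else
            echo "Source directory does not exist"
        fi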

    Read the article

  • Assigning keyboard shortcut to get path of selected item in Windows Explorer

    - by Juha
    I don't know if this is even possible, but how can I bind some key combination to a (C#) program, so that when that keyboard shortcut is pressed with some file selected in Windows Explorer, it calls a specific function with the path of that file as a parameter? Or can I assign some keyboard shortcut so that Windows Explorer opens the selected file in my program (that way I could pass the path to an already running instance)? Thanks

    Read the article

  • Set environment variables using SSH

    - by Kunal
    Hello, I am trying to execute Unix commands over SSH from Cygwin. My set of commands would navigate to a certain directory and source a particular file. Based on the variables sourced from that file, I would try to launch an application. But somehow the variables are not getting sourced, as echo does not return any values. Could someone let me know what I am missing here? The contents of the environment variables file (myenv) are

        export TEST_DATA="DATA1:DATA2"

    and I am executing the following command:

        $ ssh kunal@kspace "ls; cd /disk1/kunal/env; . ./myenv; echo $TEST_DATA; "
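    One likely culprit: because the remote command is in double quotes, the local shell expands $TEST_DATA (to an empty string) before ssh ever runs. A sketch with single quotes, so the expansion happens on the remote side:

        # single quotes keep $TEST_DATA from being expanded by the local shell
        ssh kunal@kspace 'cd /disk1/kunal/env && . ./myenv && echo "$TEST_DATA"'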

    Read the article

  • Safe way to set computed environment variables

    - by sfink
    I have a bash script that I am modifying to accept key=value pairs from stdin. (It is spawned by xinetd.) How can I safely convert those key=value pairs into environment variables for subprocesses? I plan to only allow keys that begin with a predefined prefix "CMK_", to avoid IFS or any other "dangerous" variable getting set. But the simplistic approach

        function import () {
            local IFS="="
            while read key val; do
                case "$key" in
                    CMK_*) eval "$key=$val";;
                esac
            done
        }

    is horribly insecure because $val could contain all sorts of nasty stuff. This seems like it would work:

        shopt -s extglob
        function import () {
            NORMAL_IFS="$IFS"
            local IFS="="
            while read key val; do
                case "$key" in
                    CMK_*([a-zA-Z_]) )
                        IFS="$NORMAL_IFS"
                        eval $key='$val'
                        IFS="="
                        ;;
                esac
            done
        }

    but (1) it uses the funky extglob thing that I've never used before, and (2) it's complicated enough that I can't be comfortable that it's secure. My goal, to be specific, is to allow key=value settings to pass through the bash script into the environment of called processes. It is up to the subprocesses to deal with potentially hostile values getting set. I am modifying someone else's script, so I don't want to just convert it to Perl and be done with it. I would also rather not change it around to invoke the subprocesses differently, something like

        #!/bin/sh
        ...start of script...
        perl -nle '($k,$v)=split(/=/,$_,2); $ENV{$k}=$v if $k =~ /^CMK_/; END { exec("subprocess") }'
        ...end of script...
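    One possible eval-free sketch: let printf -v perform the assignment (the key is never re-parsed as shell code) and silently drop any key containing characters outside [A-Za-z0-9_]. This is offered as an illustration of the technique, not as a security review:

        import () {
            local key val
            while IFS='=' read -r key val; do
                case "$key" in
                    CMK_*[!A-Za-z0-9_]*) ;;                 # key contains odd characters: ignore it
                    CMK_?*) printf -v "$key" '%s' "$val"    # assign without eval
                            export "$key" ;;                # make it visible to subprocesses
                esac
            done
        }

    printf -v needs bash 3.1 or later; IFS is only changed for the read itself, so the rest of the script is unaffected.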

    Read the article

  • creating a .sh file programmatically under Windows and running it on a Linux system from Cygwin

    - by user1296193
    I want to write a program, in Windows, which will write a .sh file, then connect to a Linux machine via Cygwin and ssh, and execute that .sh file. I've had to use dos2unix to convert .sh files that I created in Windows to run under Linux. Obviously if I am executing a script with ssh it will have to be Linux-formatted to work, so I need to know how to create a Linux-appropriate .sh file using C or OpenOffice Basic or VBA. Thanks!
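    If the only obstacle is CRLF line endings, one hedged workaround is to strip the carriage returns on the Cygwin side while sending the script over ssh, so the generating program does not need to care how it wrote the file (host and file names here are placeholders):

        # remove DOS carriage returns and run the result on the remote machine
        tr -d '\r' < myscript.sh | ssh user@linuxhost 'bash -s'

    Alternatively, a C program can open the output file in binary mode ("wb") and emit only \n, which sidesteps the conversion entirely.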

    Read the article

  • Cygwin bash syntax error - but script runs perfectly well in Ubuntu

    - by Michael Mao
        #!/bin/bash
        if test "$#" == "4"; then
            echo "$*";
        else
            echo "args-error" >&2;
        fi;

    This little code snippet troubles me a lot when I try to run it on both Ubuntu and Cygwin. Ubuntu runs bash version 4.0+ whereas Cygwin runs 3.2.49, but I reckon the version difference should not be the cause of this; the code runs well under Fedora 10, which is also using bash version 3.+. So basically I am wondering if there is a way to code my script once and for all so that I do not have this awful issue later on. Many thanks in advance.
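    One portable variant worth trying: == inside test/[ is not specified by POSIX (= for strings or -eq for integers are the standard spellings), and CRLF line endings from editing the file on Windows are another frequent cause of Cygwin-only syntax errors. A sketch that should behave the same everywhere:

        #!/bin/bash
        if [ "$#" -eq 4 ]; then
            echo "$*"
        else
            echo "args-error" >&2
        fi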

    Read the article

  • process killed -- delete output file?

    - by user151841
    I have a bash script that runs on our shared web host. It does a dump of our mysql database and zips up the output file. Sometimes the mysqldump process gets killed, which leaves an incomplete sql file that still gets zipped. How do I get my script to 'notice' the killing and then delete the output file if the killing occurred?
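    A sketch of one way to notice the failure: check mysqldump's exit status before zipping, and remove the file when it is non-zero (a process killed by a signal reports 128 plus the signal number, so it still shows up as a failure). The dump command and file names are placeholders:

        #!/bin/bash
        dumpfile=backup.sql

        mysqldump --all-databases > "$dumpfile"
        status=$?

        if [ "$status" -ne 0 ]; then
            echo "mysqldump failed or was killed (exit $status); removing partial dump" >&2
            rm -f "$dumpfile"
            exit "$status"
        fi

        gzip "$dumpfile"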

    Read the article

  • How do I use a variable argument number in a bash script?

    - by Corbin Tarrant
        #!/bin/bash
        # Script to output the total size of requested filetype recursively
        # Error out if no file types were provided
        if [ $# -lt 1 ]
        then
            echo "Syntax Error, Please provide at least one type, ex: sizeofTypes {filetype1} {filetype2}"
            exit 0
        fi
        #set first filetype
        types="-name *."$1
        #loop through additional filetypes and append
        num=1
        while [ $num -lt $# ]
        do
            (( num++ ))
            types=$types' -o -name *.'$$num
        done
        echo "TYPES="$types
        find . -name '*.'$1 | xargs du -ch *.$1 | grep total

    The problem I'm having is right here:

        #loop through additional filetypes and append
        num=1
        while [ $num -lt $# ]
        do
            (( num++ ))
            types=$types' -o -name *.'>>$$num<<
        done

    I simply want to iterate over all the arguments, not including the first one. It should be easy enough, but I'm having a difficult time figuring out how to make this work.
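    For reference, $$num expands to the shell's PID followed by the literal text "num", not to argument number num; bash's indirect expansion ${!num} would work, but shift plus "$@" is simpler. A sketch using an array so the -o chain stays grouped for find (the -print0/xargs -0 pairing is an extra assumption to cope with spaces in filenames):

        #!/bin/bash
        [ $# -lt 1 ] && { echo "usage: sizeofTypes ext1 [ext2 ...]" >&2; exit 1; }

        # build: -name '*.ext1' -o -name '*.ext2' ...
        tests=( -name "*.$1" )
        shift
        for ext in "$@"; do
            tests+=( -o -name "*.$ext" )
        done

        find . \( "${tests[@]}" \) -type f -print0 | xargs -0 du -ch | grep 'total$'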

    Read the article

  • iteratively creating graphs

    - by Andrei
    I have a bunch of files containing x and y coordinates, representing time and value (space-separated, but this can be amended). For example: 15:06:59 0.0140 ....... I want to create a Word file (or some equivalent) to show all these graphs. Right now I am using Excel. It is a pretty daunting task, as I have to paste the numbers into two rows for each graph, and I have many of them. Thanks
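    If gnuplot is available, a hedged sketch that loops over the data files and writes one PNG per file (the *.dat pattern and the %H:%M:%S time format are assumptions based on the sample line); the resulting images can then be dropped into a Word or OpenOffice document in one pass:

        #!/bin/bash
        # column 1 is an HH:MM:SS time stamp, column 2 the value
        for f in *.dat; do
            gnuplot -e "set xdata time; set timefmt '%H:%M:%S'; set terminal png size 800,400; set output '${f%.dat}.png'; plot '$f' using 1:2 with lines title '$f'"
        done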

    Read the article

  • Messy bash variable

    - by Kyle
    I'm writing a script to ssh into a list of machines and compare a variable to another value. I've run into a problem (I have a couple of workarounds, but at this point I'm just wondering why this method isn't working).

        VAR=ssh $i "awk -F: '/^bar/ {print \$2}' /local/foo.txt"

    ($i would be a hostname. The hosts are trusted; no password prompt is given.) Example of foo.txt:

        foo:123456:abcdef
        bar:789012:ghijkl
        baz:345678:mnopqr

    I'm assuming it's a problem with quotes, or \'s needed somewhere. I've tried several methods (different quoting, using $() instead of ``, etc.) but can't seem to get it right. My script is working correctly using the following:

        VAR=ssh $i "grep bar /local/foo.txt" | awk -F: '{print \$2}'

    Like I said, just a curiosity; any response is appreciated.
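    For what it's worth, VAR=ssh $i "..." is parsed as a temporary assignment VAR=ssh in front of a command named by $i, so nothing is captured at all; command substitution does what the line seems to intend. A sketch:

        VAR=$(ssh "$i" "awk -F: '/^bar/ {print \$2}' /local/foo.txt")
        echo "$VAR"    # should print 789012 for the sample foo.txt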

    Read the article

  • Identifying and removing null characters in UNIX

    - by fahdshariff
    I have a text file containing unwanted null characters. When I try to view it, I see ^@ symbols interleaved in the normal text. How can I: a) Identify which lines in the file contain null characters? I have tried grepping for \0 and \x0, but this did not work. b) Remove the null characters? Running strings on the file cleaned it up, but I'm just wondering if this is the best way? Thanks
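    A couple of hedged one-liners, assuming GNU grep and a standard tr (file names are placeholders):

        # a) line numbers of lines that contain a NUL byte (-P for the \x00 escape, -a to treat the file as text)
        grep -Pan '\x00' file.txt

        # b) strip every NUL byte
        tr -d '\000' < file.txt > file.clean.txt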

    Read the article

  • How do I use the sed command to remove all lines between 2 phrases (including the phrases themselves)?

    - by fzkl
    I am generating a log from which I want to remove X startup output which looks like this: X.Org X Server 1.7.6 Release Date: 2010-03-17 X Protocol Version 11, Revision 0 Build Operating System: Linux 2.6.31-607-imx51 armv7l Ubuntu Current Operating System: Linux nvidia 2.6.33.2 #1 SMP PREEMPT Mon May 31 21:38:29 PDT 2010 armv7l Kernel command line: mem=448M@0M nvmem=64M@448M mem=512M@512M chipuid=097c81c6425f70d7 vmalloc=320M video=tegrafb console=ttyS0,57600n8 usbcore.old_scheme_first=1 tegraboot=nand root=/dev/nfs ip=:::::usb0:on rw tegra_ehci_probe_delay=5000 smp dvfs tegrapart=recovery:1b80:a00:800,boot:2680:1000:800,environment:3780:40:800,system:38c0:2bc00:800,cache:2f5c0:4000:800,userdata:336c0:c840:800 envsector=3080 Build Date: 23 April 2010 05:19:26PM xorg-server 2:1.7.6-2ubuntu7 (Bryce Harrington <[email protected]>) Current version of pixman: 0.16.4 Before reporting problems, check http://wiki.x.org to make sure that you have the latest version. Markers: (--) probed, (**) from config file, (==) default setting, (++) from command line, (!!) notice, (II) informational, (WW) warning, (EE) error, (NI) not implemented, (??) unknown. (==) Log file: "/var/log/Xorg.0.log", Time: Wed Jun 16 19:52:00 2010 (==) Using config file: "/etc/X11/xorg.conf" (==) Using config directory: "/usr/lib/X11/xorg.conf.d" Is there any way to do this without manually checking pattern for each line?
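    sed can delete an inclusive range given two addresses, so one sketch, assuming the block always starts with the "X.Org X Server" banner and ends with the "(==) Using config directory" line (input and output names are placeholders):

        # delete everything from the first matching line through the last, inclusive
        sed '/^X\.Org X Server/,/^(==) Using config directory/d' mixed.log > cleaned.log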

    Read the article

  • bash: listing files in date order, with spaces in filenames

    - by Jason Judge
    I am starting with a file containing a list of hundreds of files (full paths) in a random order. I would like to list the details of the ten latest files in that list. This is my naive attempt: ls -las -t `cat list-of-files.txt` | head -10 That works, so long as none of the files have spaces in their names, but fails if they do, as those files are split up at the spaces and treated as separate files. I have tried quoting the files in the original list-of-files file, but the command substitution still splits the files up at the spaces in the filenames. The only way I can think of doing this is to ls each file individually (using xargs perhaps) and create an intermediate file with the file listings and the date in a sortable order as the first field in each line, then sort that intermediate file. However, that feels a bit cumbersome and inefficient (hundreds of ls commands rather than one or two). But that may be the only way to do it? Is there any way to pass "ls" a list of files to process, where those filenames could contain spaces? It seems like it should be simple, but I'm stumped.
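    One hedged possibility with GNU xargs: -d '\n' makes each input line a single argument, so embedded spaces survive. Note that if the list ever exceeds one command line, xargs will invoke ls more than once and the -t ordering only holds within each batch:

        xargs -d '\n' ls -las -t < list-of-files.txt | head -10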

    Read the article

  • Iterate through XML with xmlstarlet

    - by hendry
    I have the following XML:

        <?xml version="1.0" encoding="UTF-8"?>
        <test-report>
          <testsuite>
            <test name="RegisterConnection1Tests">
              <testcase name="testRregisterConnection001"></testcase>
              <testcase name="testRegisterConnection002"></testcase>
            </test>
            <test name="RegisterConnection2Tests">
              <testcase name="testRregisterConnection001"></testcase>
              <testcase name="testRegisterConnection002"></testcase>
            </test>
          </testsuite>
        </test-report>

    And I want the output:

        RegisterConnection1Tests,testRregisterConnection001
        RegisterConnection1Tests,testRregisterConnection002
        RegisterConnection2Tests,testRregisterConnection001
        RegisterConnection2Tests,testRregisterConnection002

    I'm confused as to how to show the children, as I expected

        xmlstarlet sel -t -m 'test-report/testsuite/test' -v '@name' -v '//testcase/@name' -n $1

    to work, though it only outputs:

        RegisterConnection1TeststestRregisterConnection001
        RegisterConnection2TeststestRregisterConnection001
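    One way that appears to work is to match each testcase and walk up to the parent for the suite name, with -o supplying the literal comma (a sketch against the sample document):

        xmlstarlet sel -t \
            -m 'test-report/testsuite/test/testcase' \
            -v '../@name' -o ',' -v '@name' -n "$1"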

    Read the article

  • How to pass parameters to a Linux Bash script?

    - by chun
    I have a Linux bash script 'myshell'. I want it to read two dates as parameters, for example: myshell date1 date2. I am a Java programmer, but don't know how to write a script to get this done. The rest of the script is like this: sed "s/$date1/$date2/g" wlacd_stat.xml >tmp.xml mv tmp.xml wlacd_stat.xml
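    Positional parameters arrive as $1, $2, and so on, so a minimal sketch of myshell would be:

        #!/bin/bash
        # usage: myshell date1 date2
        date1=$1
        date2=$2

        sed "s/$date1/$date2/g" wlacd_stat.xml > tmp.xml
        mv tmp.xml wlacd_stat.xml

    If the dates can contain slashes, using a different sed delimiter (for example s|$date1|$date2|g) avoids having to escape them.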

    Read the article

  • Python IDE on Linux Console

    - by Henrik P. Hessel
    This may sound strange, but I need a better way to build Python scripts than opening a file with nano/vi, changing something, quitting the editor, and typing python script.py, over and over again. I need to build the script on a webserver without any GUI. Any ideas how I can improve my workflow?

    Read the article
