Search Results

Search found 15674 results on 627 pages for 'bash date'.

Page 29 of 627

  • List the names of existing directories from .tgz file in a bash variable

    - by Tom
    I would like to find all the directories that are in a .tgz file and that already exist on the system and put the result in a bash variable. I have tried this: EXISTING=`for f in \`tar tzf $ARCHIVE\`; do if [ -d "/tmp/unpacked-data/\$f" ]; then echo \$f; fi; done` with no luck. If I echo the value of $f before the if in the loop, I get all the files, ie this works: EXISTING=`for f in \`tar tzf $ARCHIVE\`; do echo \$f; done` Can someone tell me why the \$f doesn't work in the if statement? Thanks, Tom
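
    One way to sidestep the escaping headaches of the nested backquotes is to use $(...) with a while loop instead; a minimal sketch, assuming the archive lists relative paths and the tree is unpacked under /tmp/unpacked-data as in the question:

        EXISTING=$(
          tar tzf "$ARCHIVE" | while read -r f; do
            # keep only entries that already exist as directories on disk
            [ -d "/tmp/unpacked-data/$f" ] && echo "$f"
          done
        )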

    Read the article

  • Remove first element from $@ in bash

    - by Herms
    I'm writing a bash script that needs to loop over the arguments passed into the script. However, the first argument shouldn't be looped over, and instead needs to be checked before the loop. If I didn't have to remove that first element I could just do: for item in "$@" ; do #process item done I could modify the loop to check if it's in its first iteration and change the behavior, but that seems way too hackish. There's got to be a simple way to extract the first argument out and then loop over the rest, but I wasn't able to find it.
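
    shift is the usual tool here: save $1, drop it, then loop over what is left. A sketch, with process_item standing in as a hypothetical placeholder for the real loop body:

        first="$1"    # inspect the first argument before the loop
        shift         # remove it from $@
        for item in "$@"; do
          process_item "$item"
        done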

    Read the article

  • Sed not working inside bash script

    - by Isabelle
    Hello. I believe this may be a simple question, but I've looked everywhere and tried some workarounds, but I still haven't solved the problem. Problem description: I have to replace a character inside a file and I can do it easily using the command line: sed -e 's/pattern1/pattern2/g' full_path_to_file/file But when I use the same line inside a bash script I can't seem to be able to replace it, and I don't get an error message, just the file contents without the substitution. #!/bin/sh VAR1="patter1" VAR2="patter2" VAR3="full_path_to_file" sed -e 's/${VAR1}/${VAR2}/g' ${VAR3} Any help would be appreciated. Thank you very much for your time.
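
    The single quotes are the culprit: they stop the shell from expanding ${VAR1} and ${VAR2} before sed sees them. A sketch using double quotes instead, assuming the patterns contain no characters that are special to sed:

        #!/bin/sh
        VAR1="pattern1"
        VAR2="pattern2"
        VAR3="full_path_to_file/file"
        # double quotes let the shell substitute the variables into the sed expression
        sed -e "s/${VAR1}/${VAR2}/g" "${VAR3}"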

    Read the article

  • Use a grepped file as an included source in bash

    - by Andrew
    I'm on a shared webhost where I don't have permission to edit the global bash configuration file at /etc/bashrc. Unfortunately there is one line in the global file, mesg y, which puts the terminal in tty mode and makes scp and similar commands unavailable. My local ~/.bashrc includes the global file as a source, like so: # Source global definitions if [ -f /etc/bashrc ]; then . /etc/bashrc fi My current workaround uses grep to output the global file, sans offending line, into a local file and uses that as a source. # Source global definitions if [ -f /etc/bashrc ]; then grep -v mesg /etc/bashrc > ~/.bash_global . ~/.bash_global fi Is there a way to include a grepped file like this without the intermediate step of creating an actual file? Something like this? . grep -v mesg /etc/bashrc > ~/.bash_global
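
    Process substitution does exactly this: it gives the filtered output a file-like name that the . (source) builtin can read, with no intermediate file. A sketch, assuming the host's bash supports <( ):

        # Source global definitions, minus the mesg line
        if [ -f /etc/bashrc ]; then
          . <(grep -v mesg /etc/bashrc)
        fi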

    Read the article

  • cURL: from PHP to BASH

    - by flienteen
    Hi. I've never done any curl before, so I am in need of some help. php: <?php $ch = curl_init(); $data = array( 'uptype'=>'file', 'file'=>'@'.$argv[1], ); curl_setopt($ch, CURLOPT_URL, 'http://my_site_ex/up.php'); curl_setopt($ch, CURLOPT_POST, 1); curl_setopt($ch, CURLOPT_POSTFIELDS, $data); curl_exec($ch); curl_close($ch); ?> How can I make the same script in bash?
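
    curl's -F option builds the same multipart/form-data request that PHP sends for an array of POST fields, and @ attaches a file. A rough bash sketch, keeping the asker's placeholder URL and taking the file path as the script's first argument:

        #!/bin/bash
        # $1 is the path of the file to upload
        curl -F "uptype=file" -F "file=@$1" "http://my_site_ex/up.php"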

    Read the article

  • mkdir error in bash script

    - by Don
    Hi, The following is a fragment of a bash script that I'm running under cygwin on Windows: deployDir=/cygdrive/c/Temp/deploy timestamp=`date +%Y-%m-%d_%H:%M:%S` deployDir=${deployDir}/$timestamp if [ ! -d "$deployDir" ]; then echo "making dir $deployDir" mkdir -p $deploydir fi This produces output such as: making dir /cygdrive/c/Temp/deploy/2010-04-30_11:47:58 mkdir: missing operand Try `mkdir --help' for more information. However, if I type /cygdrive/c/Temp/deploy/2010-04-30_11:47:58 on the command-line it succeeds, why does the same command not work in the script? Thanks, Don
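
    The likely cause is case: the script sets deployDir but hands $deploydir (lower-case d, never set) to mkdir, which therefore sees no operand at all. A sketch with the name fixed and the path quoted:

        deployDir=/cygdrive/c/Temp/deploy
        timestamp=$(date +%Y-%m-%d_%H:%M:%S)
        deployDir="${deployDir}/${timestamp}"
        if [ ! -d "$deployDir" ]; then
          echo "making dir $deployDir"
          mkdir -p "$deployDir"   # same variable name, upper-case D
        fi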

    Read the article

  • Automatic exit from bash shell script on error

    - by radman
    Hi, I've been writing some shell script and I would find it useful if there were a way to halt the execution of said shell script if any of the commands failed. See below for an example: #!/bin/bash cd some_dir ./configure --some-flags make make install So in this case, if the script can't change to the indicated directory, then it certainly shouldn't run ./configure afterwards. Now I'm well aware that I could have an if check for each command (which I think is a hopeless solution), but is there a global setting to make the script exit if one of the commands fails?
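
    set -e is that global setting: bash aborts as soon as any simple command exits non-zero. A sketch; adding set -o pipefail also catches failures buried inside pipelines:

        #!/bin/bash
        set -e    # exit immediately if any command fails
        cd some_dir
        ./configure --some-flags
        make
        make install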

    Read the article

  • bash command history update before execution of command

    - by Jon
    Hi, Bash's command history is great; it is especially useful when adding history -a to PROMPT_COMMAND. However, I'm wondering if there is a way to log the commands to a file as soon as the Return key is pressed, e.g. before starting the command and not on completion of the command (the PROMPT_COMMAND approach only saves the command once the prompt appears again). I read about auditing programs like snoopy and session recorders like script, but I thought they're already too complex for the simple question I have. I guess that stopping script from logging all the output of the commands would already lead in the right direction, but isn't there a quicker way to solve that problem? Thanks, Jon
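
    A DEBUG trap fires just before each command is executed, which gives the pre-execution hook that the prompt-based history -a trick cannot. A minimal sketch that appends a timestamped line to a hypothetical ~/.bash_prelog:

        log_before() {
          # $BASH_COMMAND holds the command that is about to run
          printf '%s %s\n' "$(date '+%F %T')" "$BASH_COMMAND" >> ~/.bash_prelog
        }
        trap log_before DEBUG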

    Read the article

  • Parsing getopts in bash

    - by ABach
    I've got a bash function that I'm trying to use getopts with and am having some trouble. The function is designed to be called by itself (getch), with an optional -s flag (getch -s), or with an optional string argument afterward (so getch master and getch -s master are both valid). The snippet below is where my problem lies - it isn't the entire function, but it's what I'm focusing on: getch() { if [ "$#" -gt 2 ] || [ "$1" = "-h" ] || [ "$1" = "--help" ]; then echo "Usage: $0 [-s] [branch-name]" >&2 return 1 fi while getopts "s" opt; do echo $opt # This line is here to test how many times we go through the loop case $opt in s) squash=true shift ;; *) ;; esac done } The getch -s master case is where the strangeness happens. The above should spit out s once, but instead, I get this: [user@host:git-repositories/temp]$ getch -s master s s [user@host:git-repositories/temp]$ Why is it parsing the -s opt twice?
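
    getopts keeps its position in OPTIND, and that variable is not reset between calls to a function in the same shell; the shift inside the loop also fights with getopts' own bookkeeping. A sketch that makes OPTIND local and shifts once, after parsing:

        getch() {
          local OPTIND opt squash=false
          while getopts "s" opt; do
            case $opt in
              s) squash=true ;;
              *) ;;
            esac
          done
          shift $((OPTIND - 1))   # drop the parsed options; $1 is now e.g. the branch name
          # ... rest of the function ...
        }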

    Read the article

  • AWS free tier "sign up date" vs "credit card details submission date"

    - by Mayur Rokade
    I am worried about my account expiry date. I created an account on AWS in July 2013 and submitted my credit card details on 31st Oct 2013. I went to the Billing Management Console's Bills section, where, when I click on Date, I can see months ranging from July 2013 to Nov 2013. From the AWS FAQs I gathered: When does the AWS free usage tier expire? The AWS free usage tier will expire 12 months from the date you sign up. So WHEN will my account expire: July 2014 (sign-up date) or Oct 2014 (credit card details submission date)?

    Read the article

  • Bash or python for changing spacing in files

    - by Werner
    Hi, I have a set of 10000 files. In all of them, the second line looks like: AAA 3.429 3.84 so there is just one space (requirement) between AAA and the two other columns. The rest of the lines in each file are completely different and correspond to 10 columns of numbers. Randomly, in around 20% of the files, and due to some errors, one gets BBB 3.429 3.84 so now there are two spaces between the first and second column. This is a big error so I need to fix it, changing from 2 to 1 space in the files where the error takes place. The first approach I thought of was to write a bash script that for each file reads the 3 values of the second line and then prints them with just one space, doing it for all the files. I wonder what you think about this approach and if you could suggest something better, in bash, python or some other approach. Thanks
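
    Since only the second line of each file ever needs fixing, sed can repair it in place without rewriting whole files by hand. A sketch, assuming GNU sed's -i and that every file in the current directory should be treated:

        for f in *; do
          # on line 2 only, collapse any run of spaces into a single space
          sed -i '2s/  */ /g' "$f"
        done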

    Read the article

  • Compare output of program to correct program using bash script, without using text files

    - by Doug
    I've been trying to compare the output of a program to known correct output by using a bash script without piping the output of the program to a file and then using diff on the output file and a correct output file. I've tried setting the variables to the output and correct output and I believe it's been successful but I can't get the string comparison to work correctly. I may be wrong about the variable setting so it could be that. What I've been writing: TEST=`./convert testdata.txt < somesampledata.txt` CORRECT="some correct output" if [ "$TEST"!="$CORRECT" ]; then echo "failed" fi
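
    Inside [ ], the != operator needs whitespace around it; written as "$TEST"!="$CORRECT" it is a single non-empty word, which always tests true. A sketch of the corrected comparison:

        TEST=$(./convert testdata.txt < somesampledata.txt)
        CORRECT="some correct output"
        if [ "$TEST" != "$CORRECT" ]; then
          echo "failed"
        fi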

    Read the article

  • Bash Script using Grep to search for a pattern in a file

    - by atif089
    I am writing a bash script to search for a pattern in a file using grep. I am clueless as to why it isn't working. This is the program: echo "Enter file name..."; read fname; echo "Enter the search pattern"; read pattern if [ -f $fname ]; then result=`grep -i '$pattern' $fname` echo $result; fi Or is there a different approach to do this? Thanks
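
    The single quotes around '$pattern' keep the shell from expanding it, so grep searches for the literal string $pattern. A sketch with double quotes instead (and the filename quoted as well):

        if [ -f "$fname" ]; then
          result=$(grep -i "$pattern" "$fname")
          echo "$result"
        fi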

    Read the article

  • Bash script function return value problem

    - by Eedoh
    Hi to all. Can anyone help me return the correct value from a bash script function? Here's my function, which should return the first (and only) line of the file passed as an argument: LOG_FILE_CREATION_TIME() { return_value=`awk 'NR==1' $1` return return_value } And here's my call of that function in the other script: LOG_FILE_CREATION_TIME "logfile" timestamp=$? echo "Timestamp = $timestamp" I always get some random values with this code. If, for example, there's a value of 62772031 in the "logfile", I get Timestamp = 255 as the output. For some other values in the file, I get other random values as a return value, never the correct one. Any ideas?
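
    return only carries an exit status in the range 0-255 (and here it is handed the literal word return_value, since the $ is missing), which is why the values look random. Data is normally handed back by printing it and capturing it with command substitution; a sketch:

        LOG_FILE_CREATION_TIME() {
          awk 'NR==1' "$1"    # print the first line on stdout
        }
        timestamp=$(LOG_FILE_CREATION_TIME "logfile")
        echo "Timestamp = $timestamp"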

    Read the article

  • Convert PHP date into javascript date format

    - by LeeTee
    I have a PHP script that outputs an array of data. This is then transformed into JSON using the json_encode() function. My issue is that I have a date within my array and it is not in the correct javascript format. How can I convert this within PHP so it is? $newticket['ThreadID'] = $addticket; $newticket['Subject'] = $subject; //$newticket['DateCreated'] = date('d-m-Y G:H'); Instead of the above for the date, I need the equivalent of the javascript function new Date(). When I output the above I get the following: "Fri Jun 01 2012 11:08:48 GMT+0100 (GMT Daylight Time)" However, if I format my PHP date to be the same, javascript rejects it. Confused... Can anyone help?

    Read the article

  • Bash variable kills script execution

    - by Kyle Terry
    Sorry if this is better suited to serverfault, but I think it leans more towards the programming side of things. I have some code that's going into /etc/rc.local to detect what type of touch screen monitor is plugged in and change out the xorg.conf before launching X. Here is a small snippet: CURRENT_MONITOR=`ls /dev/usb | grep 'egalax_touch\|quanta_touch'` case $CURRENT_MONITOR in '') CURRENT_MONITOR='none' ;; esac If one of those two touch screens is plugged in, it works just fine. If any other monitor is plugged in, it stops at the CURRENT_MONITOR=`ls /dev/usb | grep 'egalax_touch\|quanta_touch'` line. For testing I touched two files, one before creating CURRENT_MONITOR and one after, and only the file touched before is created. I'm not a bash programmer so this might be something very obvious.
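
    If /etc/rc.local runs under sh -e (as it commonly does), a grep that matches nothing exits non-zero and kills the whole script at that line. A sketch that tolerates the no-match case and folds in the empty-string default:

        # "|| true" keeps a non-matching grep from aborting a script run with -e
        CURRENT_MONITOR=$(ls /dev/usb | grep 'egalax_touch\|quanta_touch' || true)
        CURRENT_MONITOR=${CURRENT_MONITOR:-none}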

    Read the article

  • Intersection of two lists in Bash

    - by User1
    I'm trying to write a simple script that will list the contents found in two lists. To simplify, let's use ls as an example. Imagine "one" and "two" are directories. one=`ls one` two=`ls two` intersection $one $two I'm still quite green in bash, so feel free to correct how I am doing this. I just need some command that will print out all files in "one" and "two". They must exist in both. You might call this the "intersection" between "one" and "two".
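
    comm is built for this: given two sorted streams, -12 suppresses the lines unique to each side and prints only the common ones. A sketch using process substitution so no temporary files are needed:

        # files present in both directories
        comm -12 <(ls one | sort) <(ls two | sort)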

    Read the article

  • Parallelize Bash Script

    - by thelsdj
    Let's say I have a loop in bash: for foo in `some-command` do do-something $foo done do-something is cpu bound and I have a nice shiny 4 core processor. I'd like to be able to run up to 4 do-something's at once. The naive approach seems to be: for foo in `some-command` do do-something $foo & done This will run all do-somethings at once, but there are a couple of downsides, mainly that do-something may also have some significant I/O, and performing it all at once might slow things down a bit. The other problem is that this code block returns immediately, so there is no way to do other work when all the do-somethings are finished. How would you write this loop so there are always X do-somethings running at once?
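
    xargs -P caps the number of jobs running at once and, unlike backgrounding everything, does not return until they have all finished. A sketch, assuming an xargs with -P (GNU findutils has it) and that do-something takes a single argument per invocation:

        # run at most 4 copies of do-something in parallel, one argument each
        some-command | xargs -n1 -P4 do-something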

    Read the article

  • bash testing a group of directories for existence

    - by Jim Jones
    I have documents stored in a file system which includes "daily" directories, e.g. 20050610. In a bash script I want to list the files in a month's worth of these directories. So I'm running a find command: find <path>/200506* -type f >> jun2005.lst. I would like to check that this set of directories is not a null set before executing the find command. However, if I use if [ -d 200506* ] I get a "too many arguments" error. How can I get around this?
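
    One bash-specific option is to let the glob expand into an array under nullglob, so "no matching directories" becomes an empty array rather than a literal pattern. A sketch, with $path standing in for the asker's <path>:

        shopt -s nullglob
        dirs=( "$path"/200506*/ )    # trailing / restricts the matches to directories
        if [ "${#dirs[@]}" -gt 0 ]; then
          find "${dirs[@]}" -type f >> jun2005.lst
        fi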

    Read the article

  • Use of date function in PHP to output a user-friendly date

    - by Jamie
    I have a MySQL database column named DateAdded. I'd like to echo this as a readable date/time. Here is a simplified version of the code I currently have: $result = mysql_query(" SELECT ListItem, DateAdded FROM lists WHERE UserID = '" . $currentid . "' "); while($row = mysql_fetch_array($result)) { // Make the date look nicer $dateadded = date('d-m-Y',$row['DateAdded']); echo $row['ListItem'] . ","; echo $dateadded; echo "<br />"; } Is the use of the date function the best way to output a user-friendly date? Thanks for taking a look,

    Read the article

  • Removing final bash script argument

    - by ctuffli
    I'm trying to write a script that searches a directory for files and greps for a pattern. Something similar to the below except the find expression is much more complicated (excludes particular directories and files). #!/bin/bash if [ -d "${!#}" ] then path=${!#} else path="." fi find $path -print0 | xargs -0 grep "$@" Obviously, the above doesn't work because "$@" still contains the path. I've tried variants of building up an argument list by iterating over all the arguments to exclude path such as args=${@%$path} find $path -print0 | xargs -0 grep "$path" or whitespace="[[:space:]]" args="" for i in "${@%$path}" do # handle the NULL case if [ ! "$i" ] then continue # quote any arguments containing white-space elif [[ $i =~ $whitespace ]] then args="$args \"$i\"" else args="$args $i" fi done find $path -print0 | xargs -0 grep --color "$args" but these fail with quoted input. For example, # ./find.sh -i "some quoted string" grep: quoted: No such file or directory grep: string: No such file or directory Note that if $@ doesn't contain the path, the first script does do what I want.
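
    Bash can slice the positional parameters directly, which keeps the original quoting intact instead of rebuilding an argument string. A sketch that drops the trailing path argument only when it really is a directory:

        #!/bin/bash
        if [ -d "${!#}" ]; then
          path=${!#}
          set -- "${@:1:$#-1}"   # discard the final argument
        else
          path="."
        fi
        find "$path" -print0 | xargs -0 grep "$@"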

    Read the article

  • bash find xargs grep only single occurence

    - by keftebub
    Hi. Maybe it's a bit strange - and maybe there are other tools to do this - but, well... I am using the following classic bash command to find all files which contain some string: find . -type f | xargs grep "something" I have a great number of files, at multiple depths. The first occurrence of "something" is enough for me, but find continues searching, and takes a long time to complete the rest of the files. What I would like to do is something like a "feedback" from grep back to find so that find could stop searching for more files. Is such a thing possible? Thank you
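
    GNU find can stop itself as soon as one match turns up: run grep -q per file and let -print -quit fire on the first hit. A sketch, assuming GNU find:

        # prints the first file containing "something", then stops searching
        find . -type f -exec grep -q "something" {} \; -print -quit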

    Read the article

  • Move/copy files/folder in linux/solaris using only bash built-ins

    - by KullDox
    There was a situation where somebody moved the whole root dir into a subdir on a remote system, so all the system tools like cp, mv, etc. didn't work anymore. We still had an active session, but couldn't find a way to copy/move the files back using only bash built-ins. Does somebody know of a way to achieve this? I even thought about copying the cp or mv binary into the current dir with while read -r; do echo $LINE; done and then redirecting this to a file, but it didn't work. I guess that's because of all the special non-printable chars in a binary file that can't be copied/displayed using echo. Thanks.
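
    Pure bash cannot copy binaries reliably, because shell variables cannot hold NUL bytes, but a text file can be copied with read/printf and redirection alone. A minimal sketch with hypothetical source and destination paths:

        # copy a text file using only bash builtins (binary-unsafe: NUL bytes are lost)
        while IFS= read -r line || [ -n "$line" ]; do
          printf '%s\n' "$line"
        done < /old/path/file > /new/path/file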

    Read the article

  • Bash script to (more or less) reliably check if the internet is up

    - by João Portela
    I need a bash script to put in a cron job that every minute checks if the internet is up. This is how I did it: #! /bin/sh host1=google.com host2=wikipedia.org curr_date=`date +"%Y%m%d%H%M"` echo -n "${curr_date};" ((ping -w5 -c3 $host1 || ping -w5 -c3 $host2) > /dev/null 2>&1) && echo "up" || (echo "down" && exit 1) How would you do it? Which hosts would you ping? Thanks in advance.
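
    ICMP is blocked on some networks where HTTP is not, so one variation falls back to a lightweight HTTP check; a sketch assuming Linux ping (-w deadline) and that curl is installed:

        #!/bin/sh
        curr_date=$(date +"%Y%m%d%H%M")
        if ping -w5 -c3 8.8.8.8 >/dev/null 2>&1 \
           || curl -fs --max-time 5 http://www.google.com >/dev/null 2>&1; then
          echo "${curr_date};up"
        else
          echo "${curr_date};down"
          exit 1
        fi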

    Read the article
