Search Results

Search found 3730 results on 150 pages for 'bash'.

Page 23/150

  • Parallelize Bash Script

    - by thelsdj
    Let's say I have a loop in bash: for foo in `some-command` do do-something $foo done do-something is CPU bound and I have a nice shiny 4-core processor. I'd like to be able to run up to 4 do-something's at once. The naive approach seems to be: for foo in `some-command` do do-something $foo & done This will run all the do-somethings at once, but there are a couple of downsides, mainly that do-something may also have some significant I/O, which running all at once might slow things down a bit. The other problem is that this code block returns immediately, so there's no way to do other work once all the do-somethings have finished. How would you write this loop so there are always X do-somethings running at once?
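
    A minimal sketch of one way to throttle the loop, assuming bash 4.3+ for wait -n; some-command and do-something stand in for the question's placeholders:

        max_jobs=4
        for foo in $(some-command); do
            do-something "$foo" &                          # one background worker per item
            while (( $(jobs -rp | wc -l) >= max_jobs )); do
                wait -n                                    # bash 4.3+: block until any one job exits
            done
        done
        wait                                               # let the last workers finish

    GNU xargs can also do the bookkeeping for you: some-command | xargs -n1 -P4 do-something.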

    Read the article

  • bash testing a group of directories for existence

    - by Jim Jones
    Have documents stored in a file system which includes "daily" directories, e.g. 20050610. In a bash script I want to list the files in a month's worth of these directories, so I'm running a find command: find <path>/200506* -type f >> jun2005.lst. I'd like to check that this set of directories is not a null set before executing the find command. However, if I use if [ -d 200506* ] I get a "too many arguments" error. How can I get around this?
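
    A sketch using a glob expanded into an array: with nullglob set, a pattern that matches nothing yields an empty array, so the element count tells you whether the set is non-empty. <path> is the question's placeholder for the real prefix:

        shopt -s nullglob
        dirs=( <path>/200506*/ )          # trailing slash: match directories only
        if (( ${#dirs[@]} > 0 )); then
            find "${dirs[@]}" -type f >> jun2005.lst
        fi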

    Read the article

  • bash find xargs grep only single occurrence

    - by keftebub
    Hi. Maybe it's a bit strange - and maybe there are other tools to do this - but, well... I am using the following classic bash command to find all files which contain some string: find . -type f | xargs grep "something" I have a great number of files, at multiple depths. The first occurrence of "something" is enough for me, but find continues searching, and takes a long time to complete the rest of the files. What I would like to do is something like a "feedback" from grep back to find, so that find could stop searching for more files. Is such a thing possible? Thank you.
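
    One sketch: rather than feeding anything back to find, let head close the pipe after the first matching filename; the resulting SIGPIPE terminates grep, xargs, and find early:

        find . -type f -print0 | xargs -0 grep -l "something" | head -n 1

    Where recursive grep is available, grep -rl "something" . | head -n 1 does the same job without find.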

    Read the article

  • Removing final bash script argument

    - by ctuffli
    I'm trying to write a script that searches a directory for files and greps for a pattern. Something similar to the below except the find expression is much more complicated (excludes particular directories and files). #!/bin/bash if [ -d "${!#}" ] then path=${!#} else path="." fi find $path -print0 | xargs -0 grep "$@" Obviously, the above doesn't work because "$@" still contains the path. I've tried variants of building up an argument list by iterating over all the arguments to exclude path such as args=${@%$path} find $path -print0 | xargs -0 grep "$path" or whitespace="[[:space:]]" args="" for i in "${@%$path}" do # handle the NULL case if [ ! "$i" ] then continue # quote any arguments containing white-space elif [[ $i =~ $whitespace ]] then args="$args \"$i\"" else args="$args $i" fi done find $path -print0 | xargs -0 grep --color "$args" but these fail with quoted input. For example, # ./find.sh -i "some quoted string" grep: quoted: No such file or directory grep: string: No such file or directory Note that if $@ doesn't contain the path, the first script does do what I want.
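
    Arrays sidestep the quoting trouble: slice the final argument off with "${@:1:$#-1}" and pass the rest through intact. A sketch along the lines of the original script:

        #!/bin/bash
        if [ -d "${!#}" ]; then
            path=${!#}
            set -- "${@:1:$#-1}"          # drop the trailing path from "$@"
        else
            path="."
        fi
        find "$path" -type f -print0 | xargs -0 grep "$@"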

    Read the article

  • Move/copy files/folder in linux/solaris using only bash built-ins

    - by KullDox
    There was a situation when somebody moved the whole rootdir into a subdir on a remote system, so all the system tools like cp, mv, etc. didn't work anymore. We had an active session though, but couldn't find a way to copy/move the files back using only bash built-ins. Does somebody know of a way to achieve this? I even thought about copying the cp or mv binary into the current dir with while read -r; do echo $LINE; done and then redirecting this to a file, but it didn't work. I guess that's because of all the special non-printable chars in a binary file, which can't be copied/displayed using echo. Thanks.
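
    For text files a copy really can be done with built-ins alone; for binaries the approach is a dead end, because bash variables cannot hold the NUL bytes a binary contains. A sketch of the text-only version (note that a file missing its final newline gains one):

        copy_text() {                     # usage: copy_text source dest -- NOT binary-safe
            local line
            while IFS= read -r line || [ -n "$line" ]; do
                printf '%s\n' "$line"
            done < "$1" > "$2"
        }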

    Read the article

  • Bash script to (more or less) reliably check if the internet is up

    - by João Portela
    I need a bash script to put in a cron job that every minute checks if the internet is up. This is how I did it: #! /bin/sh host1=google.com host2=wikipedia.org curr_date=`date +"%Y%m%d%H%M"` echo -n "${curr_date};" ((ping -w5 -c3 $host1 || ping -w5 -c3 $host2) > /dev/null 2>&1) && echo "up" || (echo "down" && exit 1) How would you do it? Which hosts would you ping? Thanks in advance.
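
    ICMP is sometimes filtered even when the link is fine, so a variant that probes over HTTP can be more robust. A sketch assuming curl is installed; the URL is an arbitrary choice:

        if curl -fsm 5 http://www.google.com >/dev/null 2>&1; then
            echo "up"
        else
            echo "down"
            exit 1
        fi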

    Read the article

  • bash: How to evaluate PS1, PS2, ...?

    - by Harry
    Is there any way to 'evaluate' PS1, PS2, etc. from within a bash script? Although I can use alternate means to get all elements of my current PS1 (\u --> $USER, \h --> $HOSTNAME, \w --> $PWD, and so on), I would really like to be able to reuse its definition instead of using these alternate means. I could very well use these alternate means in my script, but I don't want to. In my PS1 I use, for example, a bold blue color via terminal escape sequences, which I'd like to be able to simply reuse by evaluating PS1.
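
    Since bash 4.4, the @P parameter transformation applies prompt expansion to a string, which is exactly this kind of evaluation. A sketch assuming bash 4.4+:

        expanded=${PS1@P}         # prompt-expands \u, \h, \w, color escapes, ...
        printf '%s\n' "$expanded"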

    Read the article

  • Use a cat + grep as an included source in bash

    - by Andrew
    I'm on a shared webhost where I don't have permission to edit the global bash configuration file at /etc/bashrc. Unfortunately there is one line in the global file, mesg y, which puts the terminal in tty mode and makes scp and similar commands unavailable. My local ~/.bashrc includes the global file as a source, like so: # Source global definitions if [ -f /etc/bashrc ]; then . /etc/bashrc fi My current workaround uses cat and grep to output the global file, sans offending line, into a local file, and uses that as a source: # Source global definitions if [ -f /etc/bashrc ]; then cat /etc/bashrc | grep -v mesg > ~/.bash_global . ~/.bash_global fi Is there a way to include a grokked file like this without the intermediate step of creating an actual file? Something like this? . cat /etc/bashrc | grep -v mesg > ~/.bash_global
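
    Process substitution gives the filtered stream a filename the shell can source, with no intermediate file; it requires a real bash rather than plain sh. A sketch:

        # Source global definitions, minus the mesg line
        if [ -f /etc/bashrc ]; then
            source <(grep -v mesg /etc/bashrc)
        fi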

    Read the article

  • Variable loss in redirected bash while loop

    - by James Hadley
    I have the following code: for ip in $(ifconfig | awk -F ":" '/inet addr/{split($2,a," ");print a[1]}') do bytesin=0; bytesout=0; while read line do if [[ $(echo ${line} | awk '{print $1}') == ${ip} ]] then increment=$(echo ${line} | awk '{print $4}') bytesout=$((${bytesout} + ${increment})) else increment=$(echo ${line} | awk '{print $4}') bytesin=$((${bytesin} + ${increment})) fi done < <(pmacct -s | grep ${ip}) echo "${ip} ${bytesin} ${bytesout}" >> /tmp/bwacct.txt done I would like it to print the incremented values to bwacct.txt, but instead the file is full of zeroes: 91.227.223.66 0 0 91.227.221.126 0 0 127.0.0.1 0 0 My understanding of Bash is that a redirected while loop should preserve variables. What am I doing wrong?
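
    For contrast, a minimal sketch of the subshell trap this question circles around: a while loop fed by a pipe runs in a subshell and its variables vanish, while one fed by process substitution (as the code above already does) runs in the current shell:

        count=0
        printf 'a\nb\n' | while read -r line; do ((count++)); done
        echo "$count"    # 0: the loop ran in a subshell

        count=0
        while read -r line; do ((count++)); done < <(printf 'a\nb\n')
        echo "$count"    # 2: process substitution keeps the loop in this shell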

    Read the article

  • How to know from a bash script if the user abruptly closes ssh session

    - by Figo
    I have a bash script that acts as the default shell for a user logging in through ssh. It provides a menu with several options, one of which is sending a file using netcat. The netcat of the embedded linux I'm using lacks the -w option, so if the user closes the ssh connection without ever sending the file, the netcat command waits forever. I need to know if the user abruptly closes the connection so the script can kill the netcat command and exit gracefully. Things I've tried so far: Trapping the SIGHUP: it is not issued. The only signal issued I could find is SIGCONT, but I don't think it's reliable and portable. Playing with the -t option of the read command to detect a closed stdin: this would work if not for a silly bug in the embedded read command (it only times out on the first invocation).
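
    A hedged sketch, since signal delivery here depends on the sshd and shell configuration: run netcat in the background and trap both EXIT and SIGHUP, so whichever notification does arrive tears the transfer down. The port and netcat flags are illustrative and vary by build:

        nc -l -p 1234 < "$file" &        # hypothetical transfer command
        nc_pid=$!
        trap 'kill "$nc_pid" 2>/dev/null' EXIT HUP
        wait "$nc_pid"
        trap - EXIT HUP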

    Read the article

  • Bash - Adding 0's in the middle of a file name

    - by user596691
    I have a bunch of files which are named: mem0.csv mem1.csv . . . . mem153.csv . . . They are all in the same folder. When I do ls in the folder they appear in the order: mem0.csv mem1.csv mem10.csv mem100.csv . . . mem2.csv mem20.csv . . . I want to create a bash script to push 0's between mem and the number. I figure that I need to add 0's until all the filenames are of the same length; the only problem is that I don't know how to do this.
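
    A sketch that strips the prefix and suffix with parameter expansion and re-prints the number zero-padded to three digits (widen the format if the numbers grow):

        for f in mem*.csv; do
            n=${f#mem}                    # "mem153.csv" -> "153.csv"
            n=${n%.csv}                   # "153.csv"    -> "153"
            mv -- "$f" "$(printf 'mem%03d.csv' "$((10#$n))")"   # 10# guards against octal parsing
        done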

    Read the article

  • How to create a bash function with variable parameters/arguments to grep several keywords/tags

    - by CornSmith
    I'm using the :!grep "tag1" filename | grep "tag2" filename | grep -n "tag3 or more" filename command in vim to search for my code snippets based on their tags (a simple comment at the top of a snippet) in one big file. I use snippets to remember tricky things. This is painful to write out each time. I'd like to make an alias, or function to do something like this: :!greptag tag1 tag2 ... tag39 And it should search the current doc and return the lines with all the tags on them. Vim is set to interactive shell mode so that it can parse my bashrc for aliases/functions. set shellcmdflag=-ic How can I construct a function that allows for variable arguments like this in bash?
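
    A sketch of one way to write the function; the file-first calling convention is an assumption (from vim you would invoke it as :!greptag % tag1 tag2 ...):

        greptag() {
            local file=$1; shift
            local out tag
            out=$(grep -n -- "$1" "$file"); shift      # seed with the first tag, keeping line numbers
            for tag in "$@"; do                        # narrow by each remaining tag
                out=$(printf '%s\n' "$out" | grep -- "$tag")
            done
            printf '%s\n' "$out"
        }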

    Read the article

  • Linux - Bash Redirect a String to a file

    - by user3502786
    I wrote a simple script that reads the file content and increments a number inside the file; I hold the change using awk. When I try to redirect the new string, the whole string is redirected into one line, not like the original, which is 4 lines. #!/bin/bash -x # This script is for incrementing build numbers path=/home/RND/abrodov file=tst.txt tst=`cat $path/$file` printf "this is the content of the file before incrementing: \n $tst" newexpr=`awk '/^Build Number/{$4=$4+1;}1' /home/RND/abrodov/tst.txt` printf "\n the new content \n $newexpr" echo $newexpr > $path/$file This is the original file before running the script: Major Release Number = 4 Minor Release Number = 1 Service Pack Release Number = 2 Build Number = 22 This is the content after I used the script: Major Release Number = 4 Minor Release Number = 1 Service Pack Release Number = 2 Build Number = 23 I'm trying to figure out how I can redirect the text in the original format, which is 4 lines.
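
    The culprit is the unquoted expansion: echo $newexpr word-splits the variable, turning its newlines into spaces. Quoting preserves them; a sketch of the corrected last line (printf is a touch safer than echo for arbitrary content):

        printf '%s\n' "$newexpr" > "$path/$file"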

    Read the article

  • Writing bash script for X-11 forwarding

    - by Bruce
    I was having a problem with SSH X-11 forwarding while I used sudo, and I found a solution for it. $hostname server4.a.b.edu First I do: $ echo $DISPLAY localhost:10.0 then $ xauth list server1.a.b.edu/unix:12 MIT-MAGIC-COOKIE-1 6026864294a0e081ac452e8740bcd0fe server4.a.b.edu/unix:10 MIT-MAGIC-COOKIE-1 f01fbfe0c0d68e30b45afe3829b27e58 Then, for sudo to work, I need to do $ sudo xauth add server4.a.b.edu/unix:10 MIT-MAGIC-COOKIE-1 f01fbfe0c0d68e30b45afe3829b27e58 with the cookie for my server name and display. How do I write a bash script to automate this?
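
    A sketch that automates those steps: derive the display number from $DISPLAY, fish the matching cookie out of xauth list, and hand it to sudo xauth add. It assumes hostname prints the same fully qualified name that appears in the cookie list:

        #!/bin/bash
        disp=${DISPLAY##*:}      # "localhost:10.0" -> "10.0"
        disp=${disp%%.*}         # "10.0"           -> "10"
        cookie=$(xauth list | awk -v d="$disp" '$1 ~ ("unix:" d "$") { print $3; exit }')
        sudo xauth add "$(hostname)/unix:$disp" MIT-MAGIC-COOKIE-1 "$cookie"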

    Read the article

  • Bash init.d script detect that mysqld has started and is running

    - by Ricket
    I'm working on my dedicated server running CentOS. I found out that one of my applications which starts up via a script in /etc/init.d/ requires MySQL to be running, or else it throws an error, so essentially I currently have to start it by hand. How can I detect, in a bash script (#!/bin/sh), whether the MySQL service has started yet? Is there some way to poll port 3306 until it is open to accept connections, and only then continue with the script? Or maybe set an order so that the script doesn't run until the mysqld script runs?
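
    A polling sketch using bash's /dev/tcp pseudo-device to test port 3306, with a cap so a dead mysqld cannot hang the boot; it assumes the script runs under bash, since plain sh lacks /dev/tcp:

        tries=0
        until (exec 3<>/dev/tcp/127.0.0.1/3306) 2>/dev/null; do
            (( ++tries >= 30 )) && { echo "mysqld never came up" >&2; exit 1; }
            sleep 1
        done

    For the ordering route, SysV init starts scripts by the numeric prefix of their rc.d symlinks, so giving this script a higher start number than mysqld's makes init enforce the dependency itself.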

    Read the article

  • BASH Script to cd to directory with spaces in pathname

    - by Rails Newbie
    Argggg. I've been struggling with this stupid problem for days and I can't find an answer. I'm using BASH on Mac OS X and I'd like to create a simple executable script file that would change to another directory when it's run. However, the path to that directory has spaces in it. How the heck do you do this? This is what I have... Name of file: cdcode File contents: cd ~/My Code Now granted, this isn't a long pathname, but my actual pathname is five directories deep and four of those directories have spaces in the path. BTW, I've tried cd "~/My Code" and cd "~/My\ Code" and neither of these worked. If you can help, THANKS! This is driving me crazy!!
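
    The catch is that ~ is not expanded inside quotes. Quote only the part with spaces, or use $HOME, which does expand inside double quotes:

        cd ~/"My Code"                 # tilde outside the quotes, spaces inside
        cd "$HOME/My Code"             # equivalent
        cd "$HOME/a b/c d/e f/g h"     # works at any depth

    Note also that a script runs in a child process, so its cd cannot move the invoking shell; to get that effect, run it as source cdcode or make it a shell function or alias.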

    Read the article

  • Modify bash variables with sed

    - by Alexander Cska
    I am trying to modify a number of environment variables containing predefined compiler flags. To do so, I tried using a bash loop that goes over all environment variables listed with "env": for i in $(env | grep ipo | awk 'BEGIN {FS="="} ; { print $1 } ' ) do echo $(sed -e "s/-ipo/ / ; s/-axAVX/ /" <<< $i) done This is not working, since the loop variable $i contains just the name of the environment variable, stored as a character string. I tried searching for a method to convert a string into a variable, but things started becoming unnecessarily complicated. The basic problem is how to properly supply the environment variable itself to sed. Any ideas how to properly modify my script are welcome. Thanks, Alex
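
    Indirect expansion is the missing piece: ${!name} reads the variable whose name is stored in name, and export writes it back. A sketch:

        for name in $(env | awk -F= '/ipo/ { print $1 }'); do
            value=$(sed -e 's/-ipo//g; s/-axAVX//g' <<< "${!name}")   # ${!name}: indirect read
            export "$name=$value"
        done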

    Read the article

  • Print a file skipping X lines in Bash

    - by Eduardo
    Hi, I have a very long file which I want to print, but skipping the first 1e6 lines, for example. I looked into the cat man page but I did not see any option to do this. I am looking for a command to do this or a simple bash program. I know how to do it using a program in C, but I want to do it using the common commands. Any way to do it? Thanks a lot in advance.
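
    Both tail and sed handle this directly:

        tail -n +1000001 file    # print starting at line 1,000,001
        sed '1,1000000d' file    # same effect: delete the first million lines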

    Read the article

  • read in bash on tab-delimited file without empty fields collapsing

    - by Charles Duffy
    I'm trying to read a multi-line tab-separated file in bash. The format is such that empty fields are expected. Unfortunately, the shell collapses field separators which are next to each other, like so: # IFS=$'\t' # read one two three <<<$'one\t\tthree' # printf '<%s> ' "$one" "$two" "$three"; printf '\n' <one> <three> <> ...as opposed to the desired output of <one> <> <three>. Can this be resolved without resorting to a separate language (such as awk)?
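
    Tab is an IFS whitespace character, so runs of tabs collapse; non-whitespace IFS characters delimit exactly one field each. A pure-shell sketch that first translates tabs to a non-whitespace byte (\x01 here, assumed absent from the data):

        line=$'one\t\tthree'
        IFS=$'\x01' read -r one two three <<<"${line//$'\t'/$'\x01'}"
        printf '<%s> ' "$one" "$two" "$three"; printf '\n'   # <one> <> <three>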

    Read the article

  • bash script to find/grep particular string in xml files in particular folders

    - by user3702188
    I have a problem at work where I need to simplify the process of how I search for logs. I would like to ask for help from the experts here. We have different services for every channel. The structure is the following: - root/channel_1/service_1/2014-05-21/file_54544654541.xml - root/channel_1/server2_2/2014-05-20/file_74272172.xml - root/channel_1/service_3/2014-05-22/file_45456546.xml - root/channel_2/service_4/2014-05-23/file_78754456.xml - root/channel_2/service_5/2014-05-24/file_546546546.xml My main problem is to find a particular string in these xml files. Let's say I know the channel name but I don't know the service name under which my particular string should be present. I also know the date. So in the search I want to enter the channel name, the date, and the string. The search would go through all service folders, looking for the string only in the xml files under the particular date folder and particular channel. Any ideas for the quickest and easiest solution to achieve this? Either in bash or perl? Any help would be appreciated, thanks
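
    A sketch: glob across the unknown service directories and let grep -l report which files under the given channel and date contain the string. The root path is the question's placeholder:

        search_logs() {
            local channel=$1 date=$2 needle=$3
            grep -l -- "$needle" root/"$channel"/*/"$date"/*.xml 2>/dev/null
        }

        search_logs channel_1 2014-05-21 "some string"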

    Read the article

  • generate php classes in bash

    - by Derek
    I have this script: #!/bin/bash if [[ -z "$1" ]] ; then echo "Class is required" exit 1; fi if [[ -z "$2" ]] ; then package="Default" else package=$2; fi echo "<?php /** * $1.class.php * * Vcard class file. * @name Project * @author Author * @link http://www.domain.com * @copyright Copyright © 2011 * @package $package * @version 1.0 */ /** * The main $1 class * @package $package */ class $1 { /** * Constructor setup. */ public function __construct() { } /** * Destructor setup. */ public function __destruct() { } } " > $1.class.php php -l $1.class.php echo "Done"; If I do ./generate.sh my_class, it creates everything with my_class. How can I modify this to MyClass? I need to use MyClass for the filename, the class name, etc.; later in the code I use the argument (in this case my_class) for some other purposes. Thanks
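
    A pure-bash sketch that turns my_class into MyClass with bash 4's case-conversion expansion (${p^} uppercases the first character), leaving $1 untouched for the later uses:

        IFS=_ read -ra parts <<< "$1"     # my_class -> (my class)
        class=""
        for p in "${parts[@]}"; do
            class+=${p^}                  # my -> My, class -> Class (bash 4+)
        done
        # then use $class in place of $1 for the filename and class name:
        # ... > "$class.class.php"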

    Read the article

  • bash check if value in list

    - by tkoomzaaskz
    I've got a script with a variable taken from command line parameters. I want to check if its value is one of dev, beta, or prod. I've got the following code snippet: #!/usr/bin/env bash ENV_NAME=$1 echo "env name = $ENV_NAME" ENVIRONMENTS=('dev','beta','prod') if [[ $ENVIRONMENTS =~ $ENV_NAME ]]; then echo 'correct' exit else echo 'incorrect' exit fi When I run my script, it doesn't matter which parameters I pass: ./script.sh beta or ./script.sh or ./script.sh whatever, I always get correct echoed. What is wrong in my script?
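
    Two bugs combine here: ('dev','beta','prod') builds a one-element array holding the literal string dev,beta,prod, and [[ $ENVIRONMENTS =~ $ENV_NAME ]] asks whether the argument, taken as a regex, occurs anywhere inside that string, so an empty or substring argument passes. A sketch with a real array and exact matching:

        ENVIRONMENTS=(dev beta prod)
        correct=no
        for e in "${ENVIRONMENTS[@]}"; do
            [[ $ENV_NAME == "$e" ]] && correct=yes && break
        done
        echo "$correct"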

    Read the article

  • bash copy with variable

    - by zaf
    I'm trying to copy files to the current directory using a bash script. To handle paths that need escaping, a variable is escaped and then supplied to the cp command. The cp command is complaining with: usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file target_file cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file ... target_directory I know what that means, but I cannot understand why it happens. Here is the code: z="/a/b/c d e f.txt" y=`printf %q "$z"` cp $y x.txt # not working as expected echo cp $y x.txt # output is "cp /a/b/c\ d\ e\ f.txt x.txt"
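
    The %q escaping is the wrong tool here: the backslashes it adds are literal data by the time $y is expanded, and the unquoted expansion then splits on the spaces, handing cp four arguments. Quoting the variable is all that is needed:

        z="/a/b/c d e f.txt"
        cp -- "$z" x.txt      # quotes keep the path as a single argument

    printf %q output is only useful where the string will be re-parsed by the shell, e.g. via eval or ssh.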

    Read the article

  • Extract substructure from a text file using bash or python

    - by Werner
    Hi, I have a huge text file which follows this structure: SET TAG1 ... ... SET ... SET TAG2 ... ... SET ... ... I would like to extract, for a specific TAG (e.g. TAG54), its individual "substructure", which would be SET TAG54 ... ... SET Each substructure, for a given TAG_i, always contains: first line: SET; second line: TAG_i (in this case TAG54); an arbitrary number of lines; last line: SET. I wonder what would be the best way to do this, whether in bash or python, so that for a given TAG one can "extract" this substructure. Thanks
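
    An awk-from-bash sketch: remember the previous line, start printing when a SET line is followed by the wanted tag, and stop at the next SET:

        extract_tag() {
            local tag=$1 file=$2
            awk -v tag="$tag" '
                prev == "SET" && $0 == tag { print "SET"; print; found = 1; next }
                found { print; if ($0 == "SET") exit }
                { prev = $0 }
            ' "$file"
        }

        extract_tag TAG54 huge.txt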

    Read the article
