Search Results

Search found 5228 results on 210 pages for 'bash alias'.

  • Linux: programmatically setting a permanent environment variable

    - by Richard
    Hello All, I am writing a little install script for some software. All it does is unpack a target tar, and then I want to permanently set some environment variables - principally the location of the unpacked libs - and update $PATH. Do I need to programmatically edit the .bashrc file, adding the appropriate entries to the end for example, or is there another way? What's standard practice? Thanks
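
    One possible approach, sketched below: rather than editing each user's .bashrc, drop a small snippet into /etc/profile.d so every login shell picks it up. The install prefix, file name and variable names here are hypothetical placeholders.

        # minimal sketch, assuming a root install and a hypothetical install dir
        INSTALL_DIR=/opt/mysoftware
        cat > /etc/profile.d/mysoftware.sh <<EOF
        export MYSOFTWARE_HOME=$INSTALL_DIR
        export LD_LIBRARY_PATH=\$LD_LIBRARY_PATH:$INSTALL_DIR/lib
        export PATH=\$PATH:$INSTALL_DIR/bin
        EOF

    For a per-user install, appending the same export lines to the end of ~/.bashrc (guarded so they are not appended twice) is the other common practice.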

    Read the article

  • How to write a program that mimics Fiddler by using tcpdump or from scratch?

    - by ????
    When Fiddler is not available on Mac OS X or Ubuntu, and if we don't install/use Wireshark or any other more heavy-duty tools, what is a way to use tcpdump so that
    1) It can print out
        GET /foo/bar HTTP/1.1
        [request content in RAW text]
        [response content in RAW text]
        POST /foo/... HTTP/1.1
    This should be able to be done by tcpdump alone, or by using tcpdump in a short shell script or Ruby / Python / Perl script.
    2) Actually, it would be neat if a script could output HTML, with
        GET /foo/bar HTTP/1.1
        POST /foo/... HTTP/1.1
    on the page, for any browser to display, and then when any of those lines is clicked, it expands to show the RAW content like (1) above does. Click again and it hides the details. The expansion UI can be done using jQuery or any JS library. The script may be short... possibly less than 20 lines? Does anybody know how to do it, either for (1) or (2)?
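
    A minimal sketch for part (1), assuming plain HTTP on port 80 and an interface named eth0 (adjust both as needed); tcpdump's -A flag prints packet payloads as ASCII:

        # raw requests and responses, roughly what Fiddler would show for plain HTTP
        sudo tcpdump -i eth0 -A -s 0 -l 'tcp port 80'

        # only the request and status lines, one per line
        sudo tcpdump -i eth0 -A -s 0 -l 'tcp port 80' | grep --line-buffered -E '^(GET|POST|HTTP/1\.[01])'

    Reassembling requests and responses into matched pairs for the HTML view in (2) is harder with raw tcpdump output and would need a small Ruby/Python script on top of it.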

    Read the article

  • Using placeholders/variables in a sed command

    - by jesse_galley
    I want to store a specific part of a matched result as a variable to be used for replacement later. I would like to keep this in a one-liner instead of finding the variable I need beforehand. When configuring apache and using mod_rewrite, you can specify parts of patterns to be used as variables, like this:
        RewriteRule ^www.example.com/page/(.*)$ http://www.example.com/page.php?page=$1 [R=301,L]
    The part of the pattern match that's contained inside the parentheses is stored as $1 for use later. So if the url was www.example.com/page/home, it would be replaced with www.example.com/page.php?page=home. The "home" part of the match was saved in $1 because it was the part of the pattern inside the parentheses. I want something like this functionality with a sed command. I need to automatically replace many strings in a SQL dump file, to add DROP TABLE IF EXISTS commands before each CREATE TABLE, but I need to know the table name to do this. So if the dump file contains something like:
        ... CREATE TABLE `orders` ...
    I need to run something like:
        cat dump.sql | sed "s/CREATE TABLE `(.*)`/DROP TABLE IF EXISTS $1\N CREATE TABLE `$1`/g"
    to get the result of:
        ... DROP TABLE IF EXISTS `orders` CREATE TABLE `orders` ...
    I'm using the mod_rewrite syntax in the sed command as a logical example of what I'm trying to do. Any suggestions?
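
    In sed the counterpart of mod_rewrite's $1 is a backreference: group with \( \) and refer to it as \1 in the replacement. A sketch assuming GNU sed (which allows \n in the replacement to insert a newline):

        # capture the table name between backticks and reuse it as \1
        sed 's/CREATE TABLE `\([^`]*\)`/DROP TABLE IF EXISTS `\1`;\nCREATE TABLE `\1`/' dump.sql

    The single quotes also keep the shell from treating the backticks as command substitution, which is a second problem with the double-quoted attempt above.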

    Read the article

  • Change -- to . for all files in a directory

    - by Larry
    Hello, I need to rename all the files in a directory. Some examples of the source filenames are:
        alpha--sometext.381928
        comp--moretext.7294058
    The resultant files would be renamed as:
        alpha.sometext.381928
        comp.moretext.7294058
    The number of characters before and after the -- is not consistent. The script needs to work on current installations of Ubuntu and FreeBSD. These are lean LAMP servers, so only the necessary packages have been installed. Thanks
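
    A portable sketch using only POSIX sh plus sed, so it should behave the same on Ubuntu and FreeBSD; it assumes the files sit in the current directory:

        # replace the first "--" in each matching filename with "."
        for f in *--*; do
            [ -e "$f" ] || continue        # skip if nothing matches the glob
            mv -- "$f" "$(printf '%s\n' "$f" | sed 's/--/./')"
        done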

    Read the article

  • Create variables for an unknown number of arguments?

    - by user347600
    Working on an rsync script, and the portion below is in a for loop. What I want to achieve is to assign a variable to every argument after the third. Just confused whether I need to create another loop for that or not:
        #1: name
        name=$1
        #2: ip
        ip=$2
        #3: user
        user=$3
        #4+: folder exclusion
        # any arguments higher than 3 will be created as exclude folders
        ex[ARG_NUMBER]=
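
    No extra loop is strictly needed to capture them: after shift 3 the remaining arguments are exactly the exclusions. A sketch (variable names are illustrative, and the final rsync call is only hinted at):

        name=$1; ip=$2; user=$3
        shift 3
        excludes=""
        for dir in "$@"; do
            excludes="$excludes --exclude=$dir"
        done
        # later, something like: rsync -az $excludes "$source" "$user@$ip:$dest"

    In bash, an array (excludes+=("--exclude=$dir")) is safer if the folder names may contain spaces.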

    Read the article

  • What is common between environments within a shell terminal session?

    - by Matt1776
    I have a custom shell script that runs each time a user logs in or an identity is assumed; it's been placed in /etc/profile.d and performs some basic env variable operations. Recently I added some code so that if screen is running it will reattach it without needing me to type anything. There are some problems, however. If I log in as root and su - to another user, the code runs a second time. Is there a variable I can set when the code runs the first time that will prevent a second run of the code? I thought of writing something to disk, but then I don't want to prevent the code from running if I begin a new terminal session. Here is the code in question. It first attempts to reattach - if unsuccessful because it's already attached (as it might be on an interrupted session) it will 'take' the session back.
        screen -r
        if [ -z "$STY" ]; then
            exec screen -dR
        fi
    Ultimately this bug prevents me from substituting user to another user, because as soon as I do so it grabs the screen session and puts me right back where I started. Pretty frustrating.
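
    One possible workaround, sketched below on the assumption that the auto-attach is only wanted for fresh SSH logins: gate the block on SSH_TTY, which su - does not carry into the new environment, so the code is skipped when switching users but still fires for every new terminal session over SSH:

        # auto-attach only for interactive SSH logins that are not already inside screen
        if [ -n "$SSH_TTY" ] && [ -z "$STY" ]; then
            exec screen -dR
        fi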

    Read the article

  • List content of tar file or a directory only down to some level

    - by Tim
    I wonder how to list the content of a tar file only down to some level? I understand "tar tvf mytar.tar" will list all files, but sometimes I wish I could just see directories down to some level. Similarly, for the "ls" command, how do I control the level of subdirectories that will be displayed? By default, it only shows the direct subdirectories, but not further. Thanks and regards
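
    A sketch of one way to do this by counting path components with awk (two levels here; adjust the NF comparison as needed). For directories on disk, find's -maxdepth option does the equivalent job:

        # tar listing limited to two path components
        tar tf mytar.tar | awk -F/ 'NF <= 2'

        # directories only, at most two levels below the current directory
        find . -maxdepth 2 -type d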

    Read the article

  • Splitting string into array upon token

    - by Gnutt
    I'm writing a script to perform an offsite rsync backup, and whenever the rsync line receives some output it goes into a single variable. I then want to split that variable into an array upon the ^M token, so that I can send the pieces to two different logger sessions (so I get them on separate lines in the log). My current line to perform the rsync:
        result=rsync --del -az -e "ssh -i $cert" $source $destination 2>&1
    Result in the log, when the server is unavailable:
        ssh: connect to host offsite port 22: Connection timed out^M
        rsync: connection unexpectedly closed (0 bytes received so far) [sender]
        rsync error: unexplained error (code 255) at io.c(601) [sender=3.0.7]
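
    A sketch of one approach, assuming the output really is captured into a single variable (e.g. result=$(rsync ... 2>&1)): rather than building an array, translate the carriage returns into newlines and hand each resulting line to logger separately. The -t tag is illustrative:

        # turn ^M into newlines, then log each line on its own
        printf '%s\n' "$result" | tr '\r' '\n' | while IFS= read -r line; do
            logger -t offsite-backup "$line"
        done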

    Read the article

  • What is the most robust way of determining the current codepage from a shell script?

    - by rewbs
    Hi all, I'd like to determine the environment's current codepage at runtime from a Unix shell script. What's the most reliable way of doing this? I'm looking into parsing the environment variable $LC_ALL, but it isn't always set to a useful value, and its format seems to vary (it can be <locale>, or <locale>.<code page>, or <locale>.<code page>@<modifier>, etc...). Is there a better way? I'm essentially after a shell equivalent of what I'd get if I called nl_langinfo(CODESET) from C.
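
    For what it's worth, the locale utility exposes exactly this: locale charmap prints the codeset of the current locale, which is the shell-level counterpart of nl_langinfo(CODESET):

        # prints e.g. UTF-8 or ISO-8859-1, depending on the active locale
        locale charmap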

    Read the article

  • Shell Script Variable Quoting Problem

    - by apinstein
    I have an sh script that contains the line
        $PHP_COMMAND -r 'echo get_include_path();'
    I cannot edit this script, but I need the eventual command line to be (equivalent to)
        php -d include_path='/path/with spaces/dir' -r 'echo get_include_path();'
    How can I achieve this? Below is a script that demonstrates the problem.
        #!/bin/sh
        # shell script quoting problem demonstration
        # I need to be able to set a shell variable with a command with
        # some options, like so
        PHP_COMMAND="php -d 'include_path=/path/with spaces/dir'"
        # then use PHP_COMMAND to run something in another script, like this:
        $PHP_COMMAND -r 'echo get_include_path();'
        # the above fails when executed. However, if you copy/paste the output
        # from this line and run it in the CLI, it works!
        echo "$PHP_COMMAND -r 'echo get_include_path();'"
        php -d include_path='/path/with spaces/dir' -r 'echo get_include_path();'
        # what's going on?
        # this is also interesting
        echo "\n--------------------"
        # this works great, but only works if include_path doesn't need quoting
        PHP_COMMAND="php -d include_path=/path/to/dir"
        echo "$PHP_COMMAND -r 'echo get_include_path();'"
        $PHP_COMMAND -r 'echo get_include_path();'
        echo "\n--------------------"
        # this one doesn't when run in the sh script, but again if you copy/paste
        # the output it does work as expected.
        PHP_COMMAND="php -d 'include_path=/path/to/dir'"
        echo "$PHP_COMMAND -r 'echo get_include_path();'"
        $PHP_COMMAND -r 'echo get_include_path();'
    Script also available online: http://gist.github.com/276500
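
    One workaround sketch, assuming the value of PHP_COMMAND is set somewhere under your control: the quoting is lost because unquoted variable expansion only word-splits, it never re-parses the embedded quotes, so point PHP_COMMAND at a tiny wrapper that carries the awkward option itself. The wrapper path is a hypothetical example:

        #!/bin/sh
        # hypothetical wrapper, saved e.g. as /usr/local/bin/php-with-includes and made executable
        exec php -d include_path='/path/with spaces/dir' "$@"

    With PHP_COMMAND=/usr/local/bin/php-with-includes, the unmodifiable script's line $PHP_COMMAND -r 'echo get_include_path();' then expands to something safe.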

    Read the article

  • Text substitution (reading from file and saving to the same file) on linux with sed...

    - by Roger
    I want to read the file "teste", make some "find & replace", and overwrite "teste" with the results. The closest I've gotten so far is:
        $ cat teste
        I have to find something
        This is hard to find...
        Find it wright now!
        $ sed -n 's/find/replace/w teste1' teste
        $ cat teste1
        I have to replace something
        This is hard to replace...
    If I try to save to the same file like this:
        $ sed -n 's/find/replace/w teste' teste
    or:
        $ sed -n 's/find/replace/' teste > teste
    the result will be a blank file... I know I am missing something very stupid, but any help will be welcome.
    UPDATE: Based on the tips given by the folks and this link: http://idolinux.blogspot.com/2008/08/sed-in-place-edit.html here's my updated code:
        sed -i -e 's/find/replace/g' teste
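
    For the record: the > teste redirection truncates the file before sed ever reads it, which is why the output was blank. GNU sed's -i edits in place; BSD sed (FreeBSD, macOS) needs an explicit, possibly empty, backup suffix. A quick sketch of both:

        # GNU sed (Linux)
        sed -i 's/find/replace/g' teste

        # BSD sed - keep a backup as teste.bak, or pass '' for no backup
        sed -i .bak 's/find/replace/g' teste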

    Read the article

  • Copy/publish images linked from the html files to another server and update the HTML files referencing them

    - by Phil
    I am publishing content from a Drupal CMS to static HTML pages on another domain, hosted on a second server. Building the HTML files was simple (using PHP/MySQL to write the files). I have a list of images referenced in my HTML, all of which exist below the /userfiles/ directory:
        cat *.html | grep -oE [^\'\"]+userfiles[\/.*]*/[^\'\"] | sort | uniq
    which produces a list of files
        http://my.server.com/userfiles/Another%20User1.jpg
        http://my.server.com/userfiles/image/image%201.jpg
        ...
    My next step is to copy these images across to the second server and translate the tags in the html files. I understand that sed is probably the tool I would need. E.g.:
        sed 's/[^"]\+userfiles[\/image]\?\/\([^"]\+\)/\/images\/\1/g'
    should change http://my.server.com/userfiles/Another%20User1.jpg to /images/Another%20User1.jpg, but I cannot work out exactly how I would use the script. I.e. can I use it to update the files in place, or do I need to juggle temporary files, etc.? Then how can I ensure that the files are moved to the correct location on the second server?
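
    A rough sketch of the two remaining steps, with the host name, docroot and list file purely illustrative: sed -i (GNU sed) rewrites the HTML files in place, so no temporary-file juggling is needed, and the URL list can be turned back into local paths and pushed with scp or rsync:

        # 1. rewrite the image URLs inside the HTML files, editing in place
        sed -i 's|http://my.server.com/userfiles\(/image\)\?/|/images/|g' *.html

        # 2. copy the images themselves (urls.txt holds the list produced by the grep above;
        #    %-encoded names such as %20 may need decoding before the local lookup)
        while read -r url; do
            file=${url#http://my.server.com/}
            scp "/var/www/html/$file" user@second-server:/var/www/images/
        done < urls.txt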

    Read the article

  • Word frequency tally script is too slow

    - by Dave Jarvis
    Background: Created a script to count the frequency of words in a plain text file. The script performs the following steps:
        1. Count the frequency of words from a corpus.
        2. Retain each word in the corpus found in a dictionary.
        3. Create a comma-separated file of the frequencies.
    The script is at: http://pastebin.com/VAZdeKXs
    Problem: The following lines continually cycle through the dictionary to match words:
        for i in $(awk '{if( $2 ) print $2}' frequency.txt); do
            grep -m 1 ^$i\$ dictionary.txt >> corpus-lexicon.txt;
        done
    It works, but it is slow because it is scanning the words it found to remove any that are not in the dictionary. The code performs this task by scanning the dictionary for every single word. (The -m 1 parameter stops the scan when the match is found.)
    Question: How would you optimize the script so that the dictionary is not scanned from start to finish for every single word? The majority of the words will not be in the dictionary. Thank you!
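
    One way to avoid the per-word scans, sketched on the assumption that the dictionary holds one word per line: hand grep the whole word list at once with -F (fixed strings), -x (whole-line matches) and -f (patterns from a file), so the dictionary is only read once:

        # keep only the frequency-file words that also appear in the dictionary
        awk '{ if ($2) print $2 }' frequency.txt > words.txt
        grep -Fxf dictionary.txt words.txt > corpus-lexicon.txt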

    Read the article

  • find: missing argument to -exec

    - by Abs
    Hello all, I was helped out today with a command, but it doesn't seem to be working. This is the command:
        find /home/me/download/ -type f -name "*.rm" -exec ffmpeg -i {} -sameq {}.mp3 && rm {}\;
    The shell returns
        find: missing argument to `-exec'
    What I am basically trying to do is go through a directory recursively (if it has other directories) and run the ffmpeg command on the .rm file types and convert them to .mp3 file types. Once this is done, remove the .rm file that has just been converted. I appreciate any help on this.
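
    A sketch of the likely fix: the && is parsed by the shell before find runs, which cuts the -exec clause short. Wrapping both commands in sh -c keeps them inside one -exec (whether ffmpeg still accepts -sameq depends on the ffmpeg version):

        find /home/me/download/ -type f -name "*.rm" \
            -exec sh -c 'ffmpeg -i "$1" -sameq "$1.mp3" && rm "$1"' sh {} \;

    Two consecutive -exec clauses (-exec ffmpeg ... \; -exec rm {} \;) also work, because the second only runs when the first exits successfully.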

    Read the article

  • Linux: Find all symlinks of a given 'original' file? (reverse 'readlink')

    - by sdaau
    Hi all, consider the following command line snippet:
        $ cd /tmp/
        $ mkdir dirA
        $ mkdir dirB
        $ echo "the contents of the 'original' file" > orig.file
        $ ls -la orig.file
        -rw-r--r-- 1 $USER $USER 36 2010-12-26 00:57 orig.file
        # create symlinks in dirA and dirB that point to /tmp/orig.file:
        $ ln -s $(pwd)/orig.file $(pwd)/dirA/
        $ ln -s $(pwd)/orig.file $(pwd)/dirB/lorig.file
        $ ls -la dirA/ dirB/
        dirA/:
        total 44
        drwxr-xr-x  2 $USER $USER  4096 2010-12-26 00:57 .
        drwxrwxrwt 20 root  root  36864 2010-12-26 00:57 ..
        lrwxrwxrwx  1 $USER $USER    14 2010-12-26 00:57 orig.file -> /tmp/orig.file
        dirB/:
        total 44
        drwxr-xr-x  2 $USER $USER  4096 2010-12-26 00:58 .
        drwxrwxrwt 20 root  root  36864 2010-12-26 00:57 ..
        lrwxrwxrwx  1 $USER $USER    14 2010-12-26 00:58 lorig.file -> /tmp/orig.file
    At this point, I can use readlink to see what the 'original' (well, I guess the usual term here is either 'target' or 'source', but those in my mind can be opposite concepts as well, so I'll just call it 'original') file of the symlinks is, i.e.
        $ readlink -f dirA/orig.file
        /tmp/orig.file
        $ readlink -f dirB/lorig.file
        /tmp/orig.file
    ... However, what I'd like to know is - is there a command I could run on the 'original' file, and find all the symlinks that point to it? In other words, something like (pseudo):
        $ getsymlinks /tmp/orig.file
        /tmp/dirA/orig.file
        /tmp/dirB/lorig.file
    Thanks in advance for any comments, Cheers!
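
    There is no reverse index for symlinks, so every candidate directory has to be searched. A sketch of two ways with find, searching /tmp here (widen the starting path as needed):

        # symlinks whose literal target text is exactly /tmp/orig.file
        find /tmp -type l -lname /tmp/orig.file

        # follow links and compare the resolved file instead (also catches relative links);
        # note this prints orig.file itself as well
        find -L /tmp -samefile /tmp/orig.file 2>/dev/null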

    Read the article

  • Make Tar + gzip ignore directory paths

    - by norm
    Anybody know if it is possible, when making a tar + gzip through the 'tar c ...' command, to have the relative paths ignored upon expanding? E.g.
        tar cvf test.tgz foo ../../files/bar
    and then expanding the test.tgz with
        tar xvf test.tgz
    gives a dir containing:
        foo
        files/bar
    I want the dir to contain the files
        foo
        bar
    Is this possible?
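
    A sketch of one way, assuming GNU tar: -C changes tar's working directory for the members that follow it, so each file is archived under its bare name:

        # store both files without their directory prefixes
        tar czvf test.tgz foo -C ../../files bar

        # extracting now yields just "foo" and "bar"
        tar xzvf test.tgz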

    Read the article

  • Random password variable disappears

    - by snaken
    Hi, I'm using the following to generate a random password in a shell script:
        DBPASS=</dev/urandom tr -dc A-Za-z0-9| (head -c $1 > /dev/null 2>&1 || head -c 8)
    When I run this in a file on its own, like this:
        #!/bin/sh
        DBPASS=</dev/urandom tr -dc A-Za-z0-9| (head -c $1 > /dev/null 2>&1 || head -c 8)
        echo $DBPASS
    a password is echoed. When I incorporate it into a larger script, though, the variable never seems to get created for some reason, so for example this doesn't work:
        DBPASS=</dev/urandom tr -dc A-Za-z0-9| (head -c $1 > /dev/null 2>&1 || head -c 8)
        sed -i s/oldpass/$DBPASS/ mysql_connect.php
    If I manually set the variable though, everything is fine.. can anyone see why?
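
    A sketch of the usual fix: as written, DBPASS= is only a per-command environment assignment in front of tr, and the random characters go to stdout instead of into the variable. Command substitution captures them:

        # take $1 random alphanumeric characters (default 8) into DBPASS
        LEN=${1:-8}
        DBPASS=$(tr -dc 'A-Za-z0-9' < /dev/urandom | head -c "$LEN")
        echo "$DBPASS"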

    Read the article

  • Sed-replacing a pattern

    - by grails_enthu
    I have the below code:
        <td nowrap="nowrap" width="74">
        <p align="center">server1</p>
        </td>
        <td nowrap="nowrap" width="74">
        <p align="center">server2</p>
        </td>
    and so on. I want to get output as:
        <td nowrap="nowrap" width="74">server1</td>
        <td nowrap="nowrap" width="74">server2</td>
    What should be my approach? Say, for example, the file is server.html. I have done something like this:
        sed "s/<p align="center">*</p>/*/" -i server.html
    But it's not working.
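
    A sketch of one way with GNU sed, assuming each <td> block spans exactly the three lines shown: pull the next two lines into the pattern space with N, then collapse them with backreferences (for anything less regular, an HTML-aware tool is safer). Add -i once the output looks right:

        sed '/<td /{ N; N; s|\(<td [^>]*>\)\n *<p align="center">\([^<]*\)</p>\n *</td>|\1\2</td>|; }' server.html

    The original attempt also fails because the outer double quotes swallow the quotes around center, so sed is looking for align=center, which never matches.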

    Read the article

  • How to run a few vim commands in a row

    - by temujin.ya.ru
    Hello. This is really a noob question. There is a set of vim commands:
        :command1
        :command2
        etc.
    which I need to type in in a row quite often. How do I make it automatic? It is a simple regexp replace command set; however, I cannot script those in sed, since it involves non-Latin locales and for some reason vim handles non-Latin regexps correctly, while sed does not.
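
    Two possible sketches, with the command and file names as placeholders: pass the commands on the command line with -c, or collect them in a script file and source it with -S; -es runs vim in silent Ex mode so nothing is drawn on screen:

        # inline: run the commands, write the file and quit
        vim -es -c 'command1' -c 'command2' -c 'wq' somefile.txt

        # or keep them in fixes.vim (one Ex command per line, ending with wq) and source it
        vim -es -S fixes.vim somefile.txt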

    Read the article

  • Is there a way to set the value of $? in a mock in Ruby?

    - by rleber
    I am testing some scripts that interface with system commands. Their logic depends on the return code of the system commands, i.e. the value of $?. So, as a simplified example, the script might say:
        def foo(command)
          output=`#{command}`
          if $?==0
            'succeeded'
          else
            'failed'
          end
        end
    In order to be able to test these methods properly, I would like to be able to stub out the Kernel backquote call, and set $? to an arbitrary value, to see if I get appropriate behavior from the logic in the method after the backquote call. I can't figure out a way to do this. (In case it matters, I'm testing using Test::Unit and Mocha.)

    Read the article

  • Notify via email if something goes wrong in the shell script

    - by Nevzz03
    fileexist=0
        for i in $( ls /data/read-only/clv/daily/Finished-HADOOP_EXPORT_&processDate#.done); do
            mv /data/read-only/clv/daily/Finished-HADOOP_EXPORT_&processDate#.done /data/read-only/clv/daily/archieve-wip/
            fileexist=1
        done
        --some other script below
    Above is the shell script I have, in which, in the for loop, I am moving some files. I want to notify myself via email if something goes wrong in the moving process, as I am running this script on the Hadoop cluster, so it might be possible that the cluster went down while this was running, etc. So how can I have a better error handling mechanism in this shell script? Any thoughts?
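
    A sketch of one pattern, assuming &processDate# stands for a shell variable such as ${processDate} and that a working mail command (mailx or similar) is configured on the node: test the exit status of mv and send a message when it is non-zero. Recipient and paths are illustrative:

        SRC=/data/read-only/clv/daily
        for f in "$SRC"/Finished-HADOOP_EXPORT_${processDate}.done; do
            if ! mv "$f" "$SRC/archieve-wip/"; then
                # mv failed: report it and stop
                echo "Failed to move $f on $(hostname) at $(date)" |
                    mail -s "clv daily export: move failed" admin@example.com
                exit 1
            fi
        done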

    Read the article

  • Is it possible to get a graphical representation of gprof results?

    - by Werner
    Hi, I am interested in profiling a number-crunching program. I compiled and linked it with the -g and -pg options and obtained a gmon.out file. After reading the gprof output (plain text), it looks a bit ugly. I wonder if there are some open source tools for getting a graphical representation of the 10 functions where the program spends most of its time, as well as a flow diagram. Thanks
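
    One open source option is the gprof2dot script combined with Graphviz, which turns gprof output into a call-graph image annotated with time percentages. A sketch, assuming both tools are installed and ./myprog is the profiled binary:

        # run the program to produce gmon.out, then render the profile as a PNG
        ./myprog
        gprof ./myprog gmon.out | gprof2dot | dot -Tpng -o profile.png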

    Read the article
