Search Results

Search found 6392 results on 256 pages for 'bash history'.

Page 14/256 | < Previous Page | 10 11 12 13 14 15 16 17 18 19 20 21  | Next Page >

  • How to use HTML markup tags inside Bash script

    - by CONtext
    I have a crontab entry and a simple bash script that emails me every so often with the PHP, NGINX and MySQL errors from their log files. This is a simplified example:

        #/home/user/status.sh
        [email protected]
        PHP_ERROR=`tail -5 /var/log/php-fpm/error.log`
        NGINX_ERROR=`tail -5 /var/log/nginx/error.log`
        MYSQL_ERROR=`tail /var/log/mysqld.log`
        DISK_SPACE=`df -h`

        echo "
        Today's, server report::
        ==================================
        DISK_SPACE: $DISK_SPACE
        ---------------------------------
        MEMORY_USAGE: $MEMORY_USAGE
        -----------------------------------
        NGINX ERROR: $NGINX_ERROR
        -----------------------------------
        PHP ERRORS: $PHP_ERROR
        ------------------------------------
        MYSQL_ERRORS: $MYSQL_ERROR
        -------------------------------------
        " | mail -s "Server reports" $EMAIL

    I know this is very basic usage, but as you can see I am trying to separate the errors, and none of the HTML tags (or even \n) are working. So my question is: is it possible to use HTML tags to format the text, and if not, what are the alternatives?
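
    A plain-text body piped to mail(1) is normally sent as text/plain, so HTML tags will not render. One hedged alternative is to write the MIME headers yourself and hand the message to the local sendmail binary; in the sketch below the recipient address is a placeholder and only one report section is shown:

        #!/bin/bash
        EMAIL="admin@example.com"                        # placeholder address, not from the question
        NGINX_ERROR=$(tail -5 /var/log/nginx/error.log)

        {
            printf 'To: %s\n' "$EMAIL"
            printf 'Subject: Server reports\n'
            printf 'MIME-Version: 1.0\n'
            printf 'Content-Type: text/html; charset=UTF-8\n\n'
            printf '<h2>Server report</h2>\n'
            printf '<h3>NGINX errors</h3>\n<pre>%s</pre>\n' "$NGINX_ERROR"
        } | /usr/sbin/sendmail -t                        # -t takes the recipient from the To: header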

    Read the article

  • selective backup script in bash

    - by Sake
    Hi, I've been using this simple command (that's all I can do :) to back up the whole tree of my user data on a NAS server for a year:

        cp -r /STORAGE /BACKUP-STORAGE/YYYY-MM-DD

    Unfortunately, after a year of service, my users have started filling the space with lots of photos and clipart (jpg, gif, bmp), and that has made the backup process much slower. Space is also a big issue: I no longer have enough room for a week-long set of daily backups. I think I want to change from backing up everything to backing up only non-image data. How can I exclude jpg, gif, and bmp files from the backup? It's quite easy with the DOS XCOPY command, but I really have no idea how to do that in bash. Thanks
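
    A hedged sketch of one way to do this with rsync; the dated destination layout and the case-insensitive extension matching are assumptions, not from the question:

        #!/bin/bash
        # copy everything under /STORAGE except image files into a dated backup directory
        DEST="/BACKUP-STORAGE/$(date +%Y-%m-%d)"
        rsync -a \
            --exclude='*.[jJ][pP][gG]' \
            --exclude='*.[gG][iI][fF]' \
            --exclude='*.[bB][mM][pP]' \
            /STORAGE/ "$DEST/"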

    Read the article

  • Mac OS X bash: uninstall all apache and PHP occurrences

    - by fireeyedboy
    Hi all, How do I find all installed apache and PHP occurrences on a Mac OS X system from bash, and uninstall them? My motivation: I've managed to install multiple apache and PHP packages (I believe they're called packages in Unix terms, correct?) at some point, and I'd like to start out fresh without completely re-installing Mac OS X. Also, I'd like to install suPHP along with apache this time, and I believe I need to compile apache with some additional options for that. But that is a challenge I'll deal with later, and probably ask a question about then. :) Thank you in advance for your info.
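
    A hedged starting point for locating every copy before removing anything; these are standard commands, but which copies exist on a given machine is of course unknown:

        which -a httpd php                        # every httpd/php binary on the current PATH
        httpd -V                                  # compile-time paths: HTTPD_ROOT, SERVER_CONFIG_FILE
        php -i | grep -i 'configure command'      # where this PHP build was configured to live
        pkgutil --pkgs | grep -iE 'apache|php'    # receipts left by third-party installer packages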

    Read the article

  • execute a command in all subdirectories bash

    - by Luigi R. Viggiano
    I have a directory structure composed of:

        iTunes/Music/${author}/${album}/${song.mp3}

    I implemented a script to strip my mp3 bitrate down to 128 kbps using lame (which works on a single file at a time). My script, 'normalize_mp3.sh', looks like this:

        #!/bin/bash
        SAVEIFS=$IFS
        IFS=$(echo -en "\n\b")
        for f in *.mp3
        do
            lame --cbr $f __out.mp3
            mv __out.mp3 $f
        done
        IFS=$SAVEIFS

    This works fine if I go folder by folder and execute the command. But I'd like to have a "global" command, like in 4DOS, so I can run:

        $ cd iTunes/Music
        $ global normalize_mp3.sh

    and the global command would traverse all subdirectories and execute normalize_mp3.sh to strip all my mp3s in all subfolders. Does anyone know if there is a Unix equivalent to the 4DOS global command? I tried to play with find -exec but I just managed to get a headache.
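
    A hedged sketch of the find-based equivalent; the path to normalize_mp3.sh is an assumption:

        cd iTunes/Music
        # run the script inside every subdirectory; the inner shell cd's there first so the
        # script's "for f in *.mp3" glob sees that directory's files
        find . -type d -exec bash -c 'cd "$1" && /path/to/normalize_mp3.sh' _ {} \;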

    Read the article

  • Implementing dry-run in bash scripts

    - by Apikot
    How would one implement a dry-run option in a bash script? I can think of wrapping every single command in an if and echoing the command instead of running it when the script is in dry-run mode. Another way would be to define a function and then pass each command call through that function. Something like:

        function _run () {
            if [[ "$DRY_RUN" ]]; then
                echo $@
            else
                $@
            fi
        }

        _run mv /tmp/file /tmp/file2
        DRY_RUN=true _run mv /tmp/file /tmp/file2

    Is this just wrong, and is there a much better way of doing it?
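
    A hedged refinement of the same idea, with the arguments quoted so multi-word parameters survive; the printf %q quoting is an addition, not part of the original question:

        _run() {
            if [[ -n "$DRY_RUN" ]]; then
                printf '%q ' "$@"; echo    # print the command, shell-quoted, instead of running it
            else
                "$@"                       # quoting "$@" keeps arguments containing spaces intact
            fi
        }

        _run mv /tmp/file /tmp/file2               # executes the mv
        DRY_RUN=1 _run mv /tmp/file /tmp/file2     # only prints: mv /tmp/file /tmp/file2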

    Read the article

  • How to interpret the bash command "usage" syntax?

    - by raoulsson
    How exactly do you interpret a command's "usage" output in bash? For example, on my OS X, cp gives me:

        usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file target_file
               cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file ... target_directory

    What do the nested options, like -H within -R, indicate? Do upper and lower case have any meaning? When is an argument optional, and when is it required? I need to implement a telnet command line against a program of mine and I would like to get this straight.
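
    For reference, the usual utility-synopsis conventions, annotated on a made-up example (the tool name and options below are hypothetical):

        usage: mytool [-v] [-o outfile] [-R [-H | -L]] source_file ... target
        # [-v]             square brackets mark optional items
        # [-o outfile]     an optional flag that takes an argument
        # [-R [-H | -L]]   -H and -L are only meaningful when -R is also given
        # -H | -L          items separated by | are mutually exclusive alternatives
        # source_file ...  "..." means the preceding operand may be repeated
        # target           bare (unbracketed) words are required
        # case carries no special meaning; -H and -h are simply different option letters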

    Read the article

  • Files listed by bash but inaccessible

    - by Cerin
    What would cause the following behavior on an Ubuntu 12.04 system? I've SSHed into a machine as the "ubuntu" user. Running ls -lah /data/* shows dozens of non-empty files (e.g. file1.txt, file2.txt, etc.), all owned by the "ubuntu" user/group and with full read/write access. But if I try to cat /data/file1.txt, bash gives me the error:

        cat: /data/file1.txt: No such file or directory

    In short, ls lists the files, but in every other way the files essentially don't exist. I can't cat them or read them in any way. Even giving all the files 777 permissions doesn't change anything. This is really bizarre. What's going on here?
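
    A few hedged checks for this symptom; dangling symlinks or a stale/missing mount are the usual suspects, though that diagnosis is a guess rather than part of the question:

        file /data/file1.txt       # reports "broken symbolic link to ..." if the target is gone
        ls -la /data | head        # in a long listing, symlinks show an "->" target
        stat /data/file1.txt       # fails the same way cat does if the path is genuinely absent
        mount | grep -w /data      # confirm /data is actually mounted where you expect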

    Read the article

  • VIM zsh, bash and colors in command line on Ubuntu

    - by Jacek Wysocki
    I have a problem with the VIM command line when calling system commands, e.g. :!ls; none of the command output colors are interpreted by VIM. My system is Ubuntu 12.04 LTS with VIM 7.3.429 from the Ubuntu repositories. Is there any workaround for this problem?

    EDIT: My vimrc file. :!echo $TERM in VIM returns: dumb

    EDIT2: I found a simple workaround, but it's not perfect:

        if [ "$VIM" ] && [ "$TERM" = "dumb" ]
        then
            # For gvim's monochromatic :shell
            PS1='\n\u@\h \w\n\$ '
            unalias ls
            unalias grep
        fi

    (This works in bash.)

    Read the article

  • sudo displays typed password in bash script

    - by Andy
    Hullo, I have a bash script that uses sudo a few times. There are a couple of strange points about it, though. It asks me for my password a few seconds after I've already entered it for a previous command, and the second time I enter my password it's echoed to the display. Here are the relevant bits of the script:

        sudo service apache2 stop
        drush sql-dump --root="$SITE_DIR" --structure-tables-key=svn --ordered-dump | grep -iv 'dump completed on' | sudo tee "$DB_DIR/${SITE_NAME}.sql" > /dev/null
        sudo svn diff "$DB_DIR" | less
        sudo svn commit -m "$MESSAGE" "$DB_DIR"
        sudo service apache2 start

    The first password is to stop apache, and it works as expected. As mentioned, the sudo tee doesn't 'remember' that I have elevated privileges, asks for the password again, and echoes it to the screen. Given that tee is all about echoing to screen, I've played around a little with simple scripts which have | sudo tee, and they all work as expected. Ideas?! TIA Andy
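
    A hedged workaround rather than an explanation of the echo itself: validate sudo once up front so the sudo inside the pipeline never has to prompt while other processes are competing for the terminal. The sudo -v line is the only addition to the commands from the question:

        #!/bin/bash
        sudo -v || exit 1            # prompt once, cache the sudo timestamp for the later commands
        sudo service apache2 stop
        drush sql-dump --root="$SITE_DIR" --structure-tables-key=svn --ordered-dump | grep -iv 'dump completed on' | sudo tee "$DB_DIR/${SITE_NAME}.sql" > /dev/null
        sudo svn diff "$DB_DIR" | less
        sudo svn commit -m "$MESSAGE" "$DB_DIR"
        sudo service apache2 start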

    Read the article

  • How to avoid tilde ~ in Bash prompt?

    - by Jirka
    Hello! I have set my prompt in bash in such a way that I can use it directly in an scp command. My current PS1 string:

        PS1="\h:\w\n$"

    And the prompt looks like this:

        lnx-hladky:/tmp/plugtmp $

    What I don't like at all is the fact that the $HOME directory is displayed as a tilde. Can this be avoided? It's causing problems when switching between different users. Example:

        lnx-hladky:~/DOC $

    The documentation says:

        \w : the current working directory, with $HOME abbreviated with a tilde
        \W : the basename of the current working directory, with $HOME abbreviated with a tilde

    Is there any possibility to avoid $HOME being abbreviated with a tilde? I have found one way around it, but I feel like it's overcomplicated:

        PROMPT_COMMAND='echo -ne "\e[4;35m$(date +%T)\e[24m$(whoami)@$(hostname):$(pwd)\e[m\n"'
        PS1=$

    Can anyone propose a better solution? I have a feeling it's not quite OK to run so many commands (date, whoami, hostname, pwd) just to get a prompt. Thanks a lot! Jirka
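
    A hedged alternative: let the prompt expand $PWD at display time instead of using \w. The single quotes matter, so $PWD is not frozen at assignment time (this relies on bash's default promptvars behaviour):

        PS1='\h:$PWD\n\$ '
        # shows e.g.  lnx-hladky:/home/jirka/DOC  instead of  lnx-hladky:~/DOC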

    Read the article

  • Unexpected behavior in Bash

    - by cYrus
    From man bash:

        A simple command is a sequence of optional variable assignments followed by blank-separated words and redirections, and terminated by a control operator. The first word specifies the command to be executed, and is passed as argument zero. The remaining words are passed as arguments to the invoked command.

    So it's perfectly legal to write:

        foo=bar echo $foo

    but it doesn't work as I expect (it prints just a newline). It's quite strange to me, since:

        $ foo=bar printenv
        foo=bar
        TERM=rxvt-unicode
        [...]

    Could someone please explain where I'm going wrong?
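
    A hedged illustration of what is happening: the shell expands $foo before it applies the temporary foo=bar assignment, so the assignment is visible in the child's environment but not in the already-expanded word:

        foo=bar echo "$foo"            # prints an empty line: the parent expanded $foo first
        foo=bar bash -c 'echo "$foo"'  # prints "bar": single quotes defer expansion to the child
        foo=bar printenv foo           # prints "bar": printenv reads it from the environment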

    Read the article

  • bash: "set -e" and checking if a user exists makes the script exit

    - by Dahmad Boutfounast
    I am writing a script to install a program with bash. I want it to exit on error, so I added "set -e" at the beginning of my script. The problem is that I have to check whether a user exists inside the script; to do this I am using:

        grep "^${USER}:" /etc/passwd

    If the user exists, the script runs normally, but if the user doesn't exist, this command exits with a non-zero status and the whole script aborts, which I don't want in this case (I have to create the user and continue the installation). So what's the solution to keep my script running? I tried to redirect the output of "grep" to a variable, but I still have the same problem :( Thanks.
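
    Two hedged ways to keep set -e from aborting on that grep; the useradd call is a hypothetical stand-in for whatever the install script does next:

        # 1) put the grep in a condition: commands tested by if/&&/|| do not trigger set -e
        if grep -q "^${USER}:" /etc/passwd; then
            echo "user ${USER} already exists"
        else
            useradd "${USER}"        # hypothetical follow-up action
        fi

        # 2) or explicitly neutralize the non-zero exit status
        grep -q "^${USER}:" /etc/passwd || true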

    Read the article

  • bash process uses 90% CPU, comes back on computer restart

    - by Sano
    I've replaced the old HDD of my late 2008 unibody MacBook (8 GB of RAM, running OS X 10.7.4) with an OCZ Vertex 3 SSD. After doing this, I installed Lion and restored my data from a Time Machine backup. Everything is fine, except for a process named "bash" that permanently uses about 90% CPU. If I kill it via Activity Monitor, everything goes back to normal, but unfortunately the process comes back every time I restart the computer. I've tried zapping the PRAM, reinstalling 10.7.4 from the combo package, and even simply waiting for more than 2 hours, but the problem is still there.
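
    Some hedged diagnostics to identify what that bash process is actually running before killing it again; the PID is a placeholder to fill in from Activity Monitor or ps:

        ps -ef | grep '[b]ash'                # full command line and parent PID of the runaway shell
        lsof -p <PID>                         # scripts and files it has open (replace <PID>)
        launchctl list | grep -vi com.apple   # third-party launch agents that restart things at login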

    Read the article

  • Mac OS X bash prompt bug?

    - by Memo
    I am trying to set my bash prompt to display the time and current directory in bold:

        export PS1="\[\e[1m\][\A] \w \$ \[\e[0m\]"

    This does apparently work, but when I use the command history (ctrl-r), after finding the command I was searching for and pressing enter, this line is not displayed correctly. Here is an example:

        [21:58] ~/Wyona/svn-repos/zwischengas $ (reverse-i-search)`ta': tail -F logs/log4j-cnode1.log

    becomes, after pressing enter:

        [21:58] ~/Wyona/svn-repos/zwischengas $ -F logs/log4j-cnode1.log

    Of course, this is not "really" a problem, since the command does work correctly, but it is still annoying. Does anybody know why this happens? And, more importantly, how to prevent/fix it? Thanks, Memo

    Read the article

  • Bash - Program is writing directly to terminal

    - by Salis
    Valve's dedicated server for the Source Engine (srcds_run) on Linux writes directly to the terminal, not stdout. I want to run it as an /etc/init.d daemon on Debian 6, and I'd like to redirect/capture the output to a file. How can I do that? And better yet, why would they write directly to the terminal? Is there any benefit in doing that? I suppose I could start another bash instance just for srcds_run, but that seems like a dirty solution, and I still don't know how to redirect the output.
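
    A hedged approach for programs that write to the controlling terminal rather than stdout: give them a pseudo-terminal whose output is recorded. The script(1) utility does exactly that; the working directory and log path below are assumptions:

        # run the server under a recorded pseudo-terminal and log everything it prints
        script -qfc "./srcds_run" /var/log/srcds.log
        #      -q quiet, -f flush after each write, -c run this command instead of a shell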

    Read the article

  • bash and arithmetic comparison: double quotes or not?

    - by Martin
    When comparing two integers in bash, do we have to use double quotes? In the official document http://tldp.org/LDP/abs/html/comparison-ops.html I read that double quotes should appear every time... But what are the differences between the following examples?

        [ "$VAR" -eq "1" ]
        [ $VAR -eq "1" ]
        [ "$VAR" -eq 1 ]
        [ $VAR -eq 1 ]

    Out of curiosity, I took a look at the Ubuntu init scripts in /etc/init.d and there are many uses of arithmetic comparison in them; at least [ "$VAR" -eq "1" ] and [ $VAR -eq 1 ] are used... but it seems no one really "knows" what the official way to do it is. Thanks!
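
    A hedged demonstration of why quoting the variable (not the literal 1) is what matters: with an empty or unset VAR the unquoted word disappears entirely and the test breaks:

        VAR=""
        [ "$VAR" -eq 1 ]    # bash: [: : integer expression expected   (still a well-formed test)
        [ $VAR -eq 1 ]      # bash: [: -eq: unary operator expected    (the empty word vanished)
        [ "$VAR" -eq "1" ]  # quoting the 1 changes nothing; only quoting $VAR makes a difference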

    Read the article

  • bash/sed/awk/etc remove every other newline

    - by carillonator
    A bash command outputs this:

        Runtime Name: vmhba2:C0:T3:L14
        Group State: active
        Runtime Name: vmhba3:C0:T0:L14
        Group State: active unoptimized
        Runtime Name: vmhba2:C0:T1:L14
        Group State: active unoptimized
        Runtime Name: vmhba3:C0:T3:L14
        Group State: active
        Runtime Name: vmhba2:C0:T2:L14
        Group State: active

    I'd like to pipe it to something to make it look like this:

        Runtime Name: vmhba2:C0:T1:L14 Group State: active
        Runtime Name: vmhba3:C0:T3:L14 Group State: active unoptimized
        Runtime Name: vmhba2:C0:T2:L14 Group State: active
        [...]

    i.e. remove every other newline. I tried ... | tr "\nGroup" " " but it removed all newlines and ate up some other letters as well. Thanks
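
    A few hedged one-liners that do this pairwise join; "your_command" is a placeholder for whatever produces the output above:

        your_command | paste -d' ' - -                # paste merges the stream two lines at a time
        your_command | sed 'N;s/\n/ /'                # sed: append the next line, replace the newline with a space
        your_command | awk 'ORS = NR%2 ? " " : "\n"'  # awk: space after odd lines, newline after even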

    Read the article

  • Bash Script to Compress / Transfer / Remove Log Files

    - by Jason
    I am currently using chronolog to set Apache's log file names by date. They are in the following format:

        /WEB/LOGS/APACHE_ACCESS_YYYY-MM-DD.log
        /WEB/LOGS/APACHE_ERROR_YYYY-MM-DD.log

    I would like to have a script that runs on the first of every month, compresses the log files from the previous month, transfers them to another host (via SCP) and then deletes the compressed file.

        find . -name '*.log' -mtime +1 -type f

    I've found several examples like the one above that let you select files x days old, but I need all files from the previous month. I am the first to admit my bash scripting skills are weak, so I would really appreciate any help and guidance.
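
    A hedged sketch that selects last month's files by name rather than by mtime, using the date pattern from the question; the remote host, destination path and use of GNU date syntax are assumptions:

        #!/bin/bash
        set -e
        # previous month as YYYY-MM, e.g. 2012-05 (GNU date)
        PREV=$(date -d "$(date +%Y-%m-01) -1 month" +%Y-%m)
        ARCHIVE="/tmp/apache-logs-${PREV}.tar.gz"

        tar -czf "$ARCHIVE" /WEB/LOGS/APACHE_*_"${PREV}"-*.log
        scp "$ARCHIVE" backupuser@backuphost:/backups/       # hypothetical destination
        rm -f "$ARCHIVE" /WEB/LOGS/APACHE_*_"${PREV}"-*.log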

    Read the article

  • Where are variables sourced from in bash/redhat?

    - by Derek
    I am getting something weird in my environment. I have a .bash_profile that only checks for .bashrc and then sources it. I have a JAVA_HOME in that file that correctly sets and exports the variable. However, if I comment out the JAVA_HOME line in .bashrc, another JAVA_HOME still shows up in my environment, different from the one I was setting in .bashrc. Where is this other JAVA_HOME coming from? Thanks. As it turns out, it seems like any shell I run is pulling in a JAVA_HOME from somewhere; I don't know what could be setting it for csh, sh, bash, etc.
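
    A hedged way to hunt for the stray setting across the usual system-wide and per-user startup files on a Red Hat style box; the file list is a guess at common locations, not exhaustive:

        grep -s 'JAVA_HOME' /etc/profile /etc/profile.d/*.sh /etc/bashrc /etc/environment \
            ~/.bash_profile ~/.bashrc ~/.bash_login ~/.profile 2>/dev/null

        # or trace a login shell and see exactly which file assigns it:
        bash -lxc 'true' 2>&1 | grep 'JAVA_HOME='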

    Read the article

  • Adding git branch to bash prompt on snow leopard

    - by crayment
    I am using this:

        $(__git_ps1 '(%s)')

    It works; however, it does not update when I change directories or check out a new branch. I also have this alias:

        alias reload='. ~/.bash_profile'

    Sample run:

        user@machine:~/dev/rails$cd git_folder/
        user@machine:~/dev/rails/git_folder$reload
        user@machine:~/dev/rails/git_folder(test)$git checkout master
        Switched to branch 'master'
        user@machine:~/dev/rails/git_folder(test)$reload
        user@machine:~/dev/rails/git_folder(master)$

    As you can see, it is being set correctly, but only if I reload bash_profile. I have wasted way too much time on this. I am using bash on Snow Leopard. Please help!
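
    The usual cause, offered here as a hedged guess: when PS1 is assigned inside double quotes, the $(__git_ps1 ...) substitution runs once, at assignment time. Single quotes defer it to every prompt redraw:

        # evaluated once, when .bash_profile is sourced -- the branch gets frozen:
        PS1="\u@\h:\w$(__git_ps1 '(%s)')\$ "

        # evaluated every time the prompt is drawn:
        PS1='\u@\h:\w$(__git_ps1 "(%s)")\$ '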

    Read the article

  • BASH function not escaping control characters

    - by ehime
    Hey guys, I have a function that I'm using to find stuff, but unfortunately any time I pass it an argument containing special characters ($intVal or "testing :" etc.) it chokes. I was wondering what the fix is? I can understand that using $ or % or : etc. in grep without escaping causes this issue, but since I'm passing it in as a parameter I'm not sure how to escape it... Anyway, here's the code:

        function ffind() {
            if [ $1 ] ; then
                find -type f | grep -ir '$1' * | grep -v '.svn'
            else
                echo "'$1' is not a valid resource"
            fi
        }

    Examples:

        $ ffind $intVal
        '' is not a valid resource
        $ ffind "testing :"
        bash: [: testing: unary operator expected
        'testing :' is not a valid resource
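
    A hedged rewrite: the failures come from the unquoted [ $1 ] test and from the single-quoted '$1' that never expands; quoting fixes both. Dropping the find pipe in favour of a single recursive grep and the --exclude-dir flag are my additions, assuming GNU grep:

        ffind() {
            if [ -n "$1" ]; then
                # -I skip binaries, -r recurse, -n line numbers, -- stops option parsing
                grep -Irn --exclude-dir=.svn -- "$1" .
            else
                echo "'$1' is not a valid resource"
            fi
        }

        ffind "testing :"    # multi-word and special-character arguments now pass through intact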

    Read the article

  • bash code in rc.local not executing after bootup

    - by mrTomahawk
    Does anyone know why a system would not execute the script code in rc.local on bootup? I have a post-configuration bash script that I want to run after the initial install of VMware ESX (Red Hat), and for some reason it doesn't seem to execute. I have it set up to log the start of its execution, and even its progress, so that I can see how far it gets in case it fails at some point; but even when I look at that log, I find that it didn't even start executing the script code. I've already checked that the script has execute permissions (755); what else should I be looking at? Here are the first few lines of my code:

        #!/bin/sh
        echo >> /tmp/configLog ""
        echo >> /tmp/configLog "Entering maintenance mode"
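
    A few hedged things to check on a Red Hat style init system; whether all of these apply to a given ESX build is an assumption:

        ls -l /etc/rc.d/rc.local                  # rc.local itself must be executable
        ls -l /etc/rc.d/rc3.d/ | grep -i local    # the S99local symlink must exist for the boot runlevel
        head -1 /etc/rc.d/rc.local                # a missing or wrong interpreter line fails silently
        grep -i local /var/log/messages | tail    # boot-time messages; errors from rc scripts often land here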

    Read the article

  • Last parameter of last command in bash in vi-mode

    - by Mo
    I have been convinced (over at stackoverflow) to use my beloved bash in vi mode. So far I got used to it quite well and I like it. However I really do miss one feature: In emacs-mode, you can enter the last parameter of the previous command by pressing "ESC ." (That is, press escape followed by the .) Is there a default binding to insert the last parameter in vi-mode? I wasn't able to find one and I really miss this command... Thanks a lot
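
    A couple of hedged options: history expansion works regardless of editing mode, and readline lets you add the emacs-style binding to vi mode yourself (the ~/.inputrc snippet is an assumption about what you want bound, not a built-in default):

        echo !$          # history expansion: last word of the previous command
        echo "$_"        # $_ also expands to the last argument of the previous command

        # ~/.inputrc: recreate "ESC ." in vi insert mode
        set keymap vi-insert
        "\e.": yank-last-arg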

    Read the article

  • Bash Shell Hangs on ?+Tab-complete

    - by michaelmichael
    I often use tab completion in Bash when completing directories, but I find that it hangs for an unacceptable amount of time if I accidentally include a question mark in the directory. I'd like to know why, and how to prevent it if possible. Here's the scenario. I start a command and use the ~ key to represent home:

        ls ~?Desktop/co

    Oops! I held down the Shift for a split-second too long. I had intended for ? to be /. But (oh no!) muscle memory has already kicked in. I've hit Tab before I noticed the mistake. Now I'm stuck waiting for the shell to beep angrily at me, usually for a minute or two. What happened? Why did the question mark cause it to hang and eventually beep? Any way to stop it from hanging?

    Read the article
