Search Results

Search found 7607 results on 305 pages for 'bash profile'.

Page 28 of 305

  • Bash script to open, read, and write then save....

    - by Alex Vo
    I'm new to this Bash scripting thing. Can you show me some examples of writing a Bash script? I want to write a script that reads a value from a file into a variable, increments the value of the variable, and writes that variable back to the file and saves it. This is what I have started, and where I'm stuck so far:

      #!/bin/bash
      # if file exists
      #echo "Testing \"$1\""
      if [ -f "$1" ]; then
          echo "$1 does exist"
      else
          echo "$1 does not exist!"
          echo "Creating $1"
          touch $1
          echo "This is test" > $1
          exit 1
      fi
      #echo "Testing \"$2\""
      if [ "$2" == "" ]; then
          echo "Enter the filename"
      elif [ -f "$2" ]; then
          echo "$2 File does exist"
      else
          echo "$2 File doesn't exist"
          echo "Creating $2"
          touch $2
          exit 1
      fi
      counter=1
      echo -n "Enter a file name : "
      read file
      if [ ! -f $file ]
      then
          echo "$file not a file!"
          exit 1
      fi
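    A minimal sketch of the read-increment-write idea, assuming the file holds a single integer (the file name counter.txt and the variable names are only illustrative):

      #!/bin/bash
      # Usage: ./counter.sh [file]   -- "counter.txt" is just an example default
      file="${1:-counter.txt}"

      # Create the file with an initial value if it does not exist yet
      [ -f "$file" ] || echo 0 > "$file"

      # Read the current value into a variable, increment it, write it back
      count=$(cat "$file")
      count=$((count + 1))
      echo "$count" > "$file"

      echo "New value in $file: $count"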

  • Has anyone found a (eg bash) shell terminal for Windows as good as the OS X one?

    - by Anentropic
    I mostly use 'git-bash', which came with the Windows install of the Git VCS; I think it is the same one that comes with Cygwin. It works fine technically, but the UI is poor: most annoyingly, you have to right-click the title bar and go to Properties just to change the window width, and copy, paste and mouse selection are equally cumbersome. In contrast, the Terminal app that comes with OS X manages these with aplomb and is much more comfortable to work with. You can even drag and drop a file onto it and it pastes the file path in at your cursor! I have also tried:
    http://sourceforge.net/projects/win-bash
    http://www.steve.org.uk/Software/bash/
    http://www.hamiltonlabs.com/cshell.htm
    None of these do copy and paste of text without cumbersome right-clicking. I am specifically looking for a Unix-flavoured shell on Windows so I don't have to switch between different shells for my home dev machine (Windows), the live server (Linux) and dev at the office (Mac). Yes, I have Googled and haven't found one yet...

  • nvcc not found, but only when using sudo

    - by dsp_099
    I can't get ANYTHING working on Linux. I'm trying to compile CudaMiner. sudo make:

      ypt-jane.o `test -f 'scrypt-jane.cpp' || echo './'`scrypt-jane.cpp
      mv -f .deps/cudaminer-scrypt-jane.Tpo .deps/cudaminer-scrypt-jane.Po
      nvcc -g -O2 -Xptxas "-abi=no -v" -arch=compute_10 --maxrregcount=64 --ptxas-options=-v -I./compat/jansson -o salsa_kernel.o -c salsa_kernel.cu
      /bin/bash: nvcc: command not found
      make[2]: *** [salsa_kernel.o] Error 127
      make[2]: Leaving directory `/var/progs/CudaMiner'
      make[1]: *** [all-recursive] Error 1
      make[1]: Leaving directory `/var/progs/CudaMiner'
      make: *** [all] Error 2

    So, kind of interesting: plain nvcc gives

      nvcc fatal : No input files specified; use option --help for more information

    whereas sudo nvcc gives

      sudo: nvcc: command not found

    Huh?? I have identical exports listed in ~/.bashrc AND /etc/bash.bashrc. (nvcc is located in /usr/local/cuda-5.0/bin/nvcc.) I also tried changing the current PATH, to no avail:

      $ sudo bash -c 'echo $PATH'
      /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
      $ PATH=$PATH:/usr/local/cuda-5.0/bin/nvcc
      $ sudo bash -c 'echo $PATH'
      /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

    Thanks in advance!
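    A sketch of one common explanation and workaround, assuming the default sudoers secure_path behaviour: sudo normally resets PATH, so exports in ~/.bashrc or /etc/bash.bashrc are never seen by it (the CUDA path shown is the one from the question):

      # One-off workaround: pass the caller's PATH through explicitly
      sudo env "PATH=$PATH:/usr/local/cuda-5.0/bin" make

      # Or make it permanent: run "sudo visudo" and extend secure_path, e.g.
      #   Defaults secure_path="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/cuda-5.0/bin"

      # Note: PATH entries should point at the bin directory, not at the nvcc binary itself.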

  • How to get shared bash history among different tabs

    - by Luca Cerone
    I used the answer in http://unix.stackexchange.com/a/1292/41729 to enable real-time shared history among separate bash terminals. As explained in that answer, this is achieved by adding:

      # avoid duplicates..
      export HISTCONTROL=ignoredups:erasedups
      # append history entries..
      shopt -s histappend
      # After each command, save and reload history
      export PROMPT_COMMAND="history -a; history -c; history -r; $PROMPT_COMMAND"

    This works fine if the bash shells are separate (e.g. opening different terminal windows with CTRL+ALT+T). However, it doesn't work if I use tabs (CTRL+SHIFT+T from an open terminal) rather than new windows. Why this difference in behaviour? How can I share the bash history among the tabs as well?
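    A minimal sketch of where the snippet usually needs to live, assuming new tabs are opened as interactive non-login shells (the common GNOME Terminal behaviour) while new windows may start login shells:

      # ~/.bash_profile (sketch): make login shells read ~/.bashrc too
      if [ -f ~/.bashrc ]; then
          . ~/.bashrc
      fi

      # ~/.bashrc (sketch): the same history settings as in the question
      export HISTCONTROL=ignoredups:erasedups
      shopt -s histappend
      export PROMPT_COMMAND="history -a; history -c; history -r; $PROMPT_COMMAND"

      # Quick check in a new tab: this should print the PROMPT_COMMAND above
      echo "$PROMPT_COMMAND"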

  • sudo: source: command not found

    - by HorusKol
    I've been updating some of the default profile for bash, and saw from the tutorials I was following that I could reload the new profile with the new environment settings by using:

      source /etc/bash.bashrc

    The only thing is - the new environment variables were only available to my current user - and were ignored when I used sudo. They only became available to sudo when I closed my terminal session and rejoined. When I try to use:

      sudo source /etc/bash.bashrc

    I get the error:

      sudo: source: command not found

    Is there a simple way to load in the new bash profile settings for sudo without having to close the terminal and restart?
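    A sketch of the usual workarounds, assuming the goal is simply to run a command under sudo with the updated environment; source is a shell builtin, so sudo cannot execute it directly (some-command and MY_VAR below are illustrative names):

      # Option 1: run everything inside one root shell, where the builtin is available
      sudo bash -c 'source /etc/bash.bashrc && some-command'

      # Option 2: start an interactive root login shell, which re-reads the profile files
      sudo -i

      # Option 3: pass a specific variable through for a single invocation
      sudo env MY_VAR="$MY_VAR" some-command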

  • Tab completion COMP_WORDS bad array subscript

    - by Senthil Kumaran
    I have upgraded my Ubuntu to 10.04 and I am facing a "COMP_WORDS: bad array subscript" error when I press TAB for certain completions. I thought it was a bug in the bash-completion package and purged it, but even after that I still see the error. If it is a bug in the bash package, how can I resolve it?
    https://bugs.launchpad.net/ubuntu/+source/bash-completion/+bug/366446
    It is difficult for a developer to live with this bug in the system.
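    A small diagnostic sketch, assuming the error is triggered by one specific completion function (scp below is only an example command):

      # See which completion function is bound to the command that triggers the error
      complete -p scp

      # Temporarily remove that completion to confirm it is the culprit
      complete -r scp

      # Or fall back to plain filename completion for that command
      complete -o default -o filenames scp

      # If that helps, the bug is in the completion script rather than in bash itself,
      # and reinstalling the package is the usual next step:
      #   sudo apt-get install --reinstall bash-completion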

  • how to escape white space in bash loop list

    - by MCS
    I have a bash shell script that loops through all child directories (but not files) of a certain directory. The problem is that some of the directory names contain spaces. Here are the contents of my test directory:

      $ ls -F test
      Baltimore/  Cherry Hill/  Edison/  New York City/  Philadelphia/  cities.txt

    And the code that loops through the directories:

      for f in `find test/* -type d`; do
          echo $f
      done

    Here's the output:

      test/Baltimore
      test/Cherry
      Hill
      test/Edison
      test/New
      York
      City
      test/Philadelphia

    Cherry Hill and New York City are treated as 2 or 3 separate entries. I tried quoting the filenames, like so:

      for f in `find test/* -type d | sed -e 's/^/\"/' | sed -e 's/$/\"/'`; do
          echo $f
      done

    but to no avail. There's got to be a simple way to do this. Any ideas?

    The answers below are great. But to make this more complicated - I don't always want to use the directories listed in my test directory. Sometimes I want to pass in the directory names as command-line parameters instead. I took Charles' suggestion of setting the IFS and came up with the following:

      dirlist="${@}"
      (
          [[ -z "$dirlist" ]] && dirlist=`find test -mindepth 1 -type d` && IFS=$'\n'
          for d in $dirlist; do
              echo $d
          done
      )

    and this works just fine unless there are spaces in the command line arguments (even if those arguments are quoted). For example, calling the script like this:

      test.sh "Cherry Hill" "New York City"

    produces the following output:

      Cherry
      Hill
      New
      York
      City

    Again, I know there must be a way to do this - I just don't know what it is...
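    A sketch of the usual space-safe patterns covering both cases in the question, assuming bash and GNU find (the test directory is the one from the example):

      #!/bin/bash
      # Case 1: iterate over directories found by find, using NUL delimiters
      while IFS= read -r -d '' d; do
          echo "$d"
      done < <(find test -mindepth 1 -type d -print0)

      # Case 2: iterate over command-line arguments; quoting "$@" preserves spaces
      for d in "$@"; do
          echo "$d"
      done

      # Combined: use the arguments if given, otherwise fall back to find
      if [ "$#" -gt 0 ]; then
          dirs=("$@")
      else
          dirs=()
          while IFS= read -r -d '' d; do
              dirs+=("$d")
          done < <(find test -mindepth 1 -type d -print0)
      fi
      for d in "${dirs[@]}"; do
          echo "$d"
      done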

  • Bash Shell Script: Nested Select Statements

    - by CCG121
    I have a script with a main select statement that branches into several sub select statements; however, once in a sub-menu, I cannot figure out how to get it to go back to the main script. Also, if possible, I would like it to re-list the options.

      #!/bin/bash
      PS3='Option = '
      MAINOPTIONS="Apache Postfix Dovecot All Quit"
      APACHEOPTIONS="Restart Start Stop Status"
      POSTFIXOPTIONS="Restart Start Stop Status"
      DOVECOTOPTIONS="Restart Start Stop Status"
      select opt in $MAINOPTIONS; do
          if [ "$opt" = "Quit" ]; then
              echo Now Exiting
              exit
          elif [ "$opt" = "Apache" ]; then
              select opt in $APACHEOPTIONS; do
                  if [ "$opt" = "Restart" ]; then
                      sudo /etc/init.d/apache2 restart
                  elif [ "$opt" = "Start" ]; then
                      sudo /etc/init.d/apache2 start
                  elif [ "$opt" = "Stop" ]; then
                      sudo /etc/init.d/apache2 stop
                  elif [ "$opt" = "Status" ]; then
                      sudo /etc/init.d/apache2 status
                  fi
              done
          elif [ "$opt" = "Postfix" ]; then
              select opt in $POSTFIXOPTIONS; do
                  if [ "$opt" = "Restart" ]; then
                      sudo /etc/init.d/postfix restart
                  elif [ "$opt" = "Start" ]; then
                      sudo /etc/init.d/postfix start
                  elif [ "$opt" = "Stop" ]; then
                      sudo /etc/init.d/postfix stop
                  elif [ "$opt" = "Status" ]; then
                      sudo /etc/init.d/postfix status
                  fi
              done
          elif [ "$opt" = "Dovecot" ]; then
              select opt in $DOVECOTOPTIONS; do
                  if [ "$opt" = "Restart" ]; then
                      sudo /etc/init.d/dovecot restart
                  elif [ "$opt" = "Start" ]; then
                      sudo /etc/init.d/dovecot start
                  elif [ "$opt" = "Stop" ]; then
                      sudo /etc/init.d/dovecot stop
                  elif [ "$opt" = "Status" ]; then
                      sudo /etc/init.d/dovecot status
                  fi
              done
          elif [ "$opt" = "All" ]; then
              sudo /etc/init.d/apache2 restart
              sudo /etc/init.d/postfix restart
              sudo /etc/init.d/dovecot restart
          fi
      done
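    A small sketch of the pattern that returns control to the main menu, assuming the same service layout: break leaves the inner select, and wrapping the outer select in a while loop makes the main options re-list (only the Apache sub-menu is shown):

      #!/bin/bash
      PS3='Option = '
      while true; do
          select opt in Apache Quit; do
              case "$opt" in
                  Apache)
                      select sub in Restart Start Stop Status Back; do
                          case "$sub" in
                              Back)
                                  break    # leave the inner select only
                                  ;;
                              Restart|Start|Stop|Status)
                                  action=$(echo "$sub" | tr '[:upper:]' '[:lower:]')
                                  sudo /etc/init.d/apache2 "$action"
                                  ;;
                          esac
                      done
                      # leave the outer select; the while loop restarts it,
                      # which re-prints the main options
                      break
                      ;;
                  Quit)
                      echo "Now Exiting"
                      exit 0
                      ;;
              esac
          done
      done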

  • BASH if conditions

    - by Daniil
    Hi, I asked a question about this before. The answer made sense, but I could never get it to work, and now I have to get it working. I cannot figure out Bash's if statements. What am I doing wrong below?

      START_TIME=9
      STOP_TIME=17
      HOUR=$((`date +"%k"`))
      if [[ "$HOUR" -ge "9" ]] && [[ "$HOUR" -le "17" ]] && [[ "$2" != "-force" ]] ; then
          echo "Cannot run this script without -force at this time"
          exit 1
      fi

    The idea is that I don't want this script to continue executing, unless forced to, during the hours of 9am to 5pm. But it always evaluates the condition to true and thus won't allow me to run the script. It is called like this:

      ./script.sh [action] (-force)

    Thx

    Edit: The output of set -x:

      $ ./test2.sh restart
      + START_TIME=9
      + STOP_TIME=17
      ++ date +%k
      + HOUR=11
      + [[ 11 -ge 9 ]]
      + [[ 11 -le 17 ]]
      + [[ '' != \-\f\o\r\c\e ]]
      + echo 'Cannot run this script without -force at this time'
      Cannot run this script without -force at this time
      + exit 1

    and then with -force:

      $ ./test2.sh restart -force
      + START_TIME=9
      + STOP_TIME=17
      ++ date +%k
      + HOUR=11
      + [[ 11 -ge 9 ]]
      + [[ 11 -le 17 ]]
      + [[ '' != \-\f\o\r\c\e ]]
      + echo 'Cannot run this script without -force at this time'
      Cannot run this script without -force at this time
      + exit 1
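    A sketch of a more defensive check, assuming the empty '' in the trace means the test is not seeing -force as $2 (for example because the check runs inside a function); scanning every argument avoids relying on its position:

      #!/bin/bash
      START_TIME=9
      STOP_TIME=17
      HOUR=$((10#$(date +"%H")))   # force base 10 so "08"/"09" are not read as octal

      # Look for -force anywhere in the argument list instead of assuming it is $2
      force=0
      for arg in "$@"; do
          [ "$arg" = "-force" ] && force=1
      done

      if [ "$HOUR" -ge "$START_TIME" ] && [ "$HOUR" -le "$STOP_TIME" ] && [ "$force" -eq 0 ]; then
          echo "Cannot run this script without -force at this time"
          exit 1
      fi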

  • Bash: how to simply parallelize tasks?

    - by NoozNooz42
    I'm writing a tiny script that calls the "PNGOUT" util on a few hundred PNG files. I simply did this:

      find $BASEDIR -iname "*png" -exec pngout {} \;

    And then I looked at my CPU monitor and noticed only one of the cores was used, which is quite sad. In this day and age of dual, quad, octo and hexa (?) core desktops, how do I simply parallelize this task with Bash? (It's not the first time I've had such a need, since quite a lot of these utils are mono-threaded... I already had the same problem with mp3 encoders.) Would simply running all the pngout processes in the background do? What would my find command look like then? (I'm not too sure how to mix find and the '&' character.) If I have three hundred pictures, this would mean swapping between three hundred processes, which doesn't seem great anyway!? Or should I copy my three hundred files or so into "nb dirs", where "nb dirs" would be the number of cores, then run "nb finds" concurrently? (Which would be close enough.) But how would I do this?
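    A sketch of the usual xargs-based approach, assuming GNU xargs (for the -P flag) and the same pngout invocation; $BASEDIR is the variable from the question:

      #!/bin/bash
      BASEDIR=${1:-.}                       # directory to scan; defaults to the current one
      CORES=$(nproc 2>/dev/null || echo 4)  # number of parallel jobs; falls back to 4

      # -print0 / -0 keep file names with spaces intact,
      # -P runs that many pngout processes at a time,
      # -n 1 passes one file per invocation.
      find "$BASEDIR" -iname '*.png' -print0 | xargs -0 -n 1 -P "$CORES" pngout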

  • BASH echo write mysql input

    - by jmituzas
    I have a bash menu where variables are written to a file for MySQL input. Here's what I have:

      echo "CREATE DATABASE '$mysqldbn';
      #GRANT ALL PRIVILEGES ON *.* TO '$mysqlu'@'$myhost' IDENTIFIED BY '$mysqlup' WITH GRANT OPTION;
      GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, INDEX, ALTER, CREATE TEMPORARY TABLES, LOCK TABLES ON '$mysqldbn'.* TO '$mysqlu'@'$myhost' IDENTIFIED BY '$mysqlup';
      GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, INDEX, ALTER, CREATE TEMPORARY TABLES, LOCK TABLES ON '$mysqldbn'.* TO '$mysqlu'@'$myip' IDENTIFIED BY '$mysqlup';
      GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, INDEX, ALTER, CREATE TEMPORARY TABLES, LOCK TABLES ON '$mysqldbn'.* TO '$mysqlu'@'localhost' IDENTIFIED BY '$mysqlup';
      GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, INDEX, ALTER, CREATE TEMPORARY TABLES, LOCK TABLES ON '$mysqldbn'.* TO '$mysqlu'@'$rip' IDENTIFIED BY '$mysqlup';" > nmysql.db

      mysql -u root -p$mypass < nmysql.db

    The problem is that to get the variables to show I had to put them in single quotes. The single quotes show up where I want them in cases like '$mysqlu'@'localhost', but how can I remove the quotes and still get to use the variable in a statement like CREATE DATABASE '$mysqldbn'? Double quotes won't work either; I am at a loss. Thanks in advance, Joe
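    A sketch of one way to handle this: keep the single quotes for string literals (passwords, hosts) but quote identifiers such as the database name with backticks, which must be backslash-escaped inside double quotes so bash does not treat them as command substitution (variable names are the ones from the question; only one GRANT is shown):

      #!/bin/bash
      {
          echo "CREATE DATABASE \`$mysqldbn\`;"
          echo "GRANT SELECT, INSERT, UPDATE, DELETE, CREATE, DROP, INDEX, ALTER,"
          echo "      CREATE TEMPORARY TABLES, LOCK TABLES"
          echo "  ON \`$mysqldbn\`.* TO '$mysqlu'@'localhost' IDENTIFIED BY '$mysqlup';"
      } > nmysql.db

      mysql -u root -p"$mypass" < nmysql.db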

  • linux bash script: set date/time variable to auto-update (for inclusion in file names)

    - by user1859492
    Essentially, I have a standard format for file naming conventions. It breaks down to this: target_dateUTC_timeUTC_tool. So, for instance, if I run tcpdump on a target of 'foo', then the file would be foo_dateUTC_timeUTC_tcpdump. Simple enough, but a pain for everyone to constantly (and consistently) enter... so I've tried to create a bash script which sets system variables like so:

      FILENAME=$TARGET\_$UTCTIME\_$TOOL

    Then, I can just call the variable at runtime, like so:

      tcpdump -w $FILENAME.lpc

    All of this works like a champ. I've got a menu-driven .sh which gives the user the options of viewing the current variables as well as setting them... file generation is a breeze. Unfortunately, by setting the date/time variable, it is locked to the value at the time of creation (naturally). I set the variable like so:

      UTCTIME=$(/bin/date --utc +"%Y%m%d_%H%M%Z")

    What I really need is either a way to create a variable which updates at runtime, or (more likely) another way to skin this cat. While scouring for solutions, I came across similar issues, but to be honest I'm stumped on how to marry the two approaches and create a simple, distributable solution. I can post the entire .sh if anyone cares to review it (about 120 lines).
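    A sketch of the usual function-based approach, so the timestamp is evaluated each time a name is generated rather than once at assignment (the TARGET and TOOL values are just examples; the date format is the one from the question):

      #!/bin/bash
      TARGET="foo"
      TOOL="tcpdump"

      # A function body is re-evaluated on every call, unlike a variable
      make_filename() {
          local utctime
          utctime=$(/bin/date --utc +"%Y%m%d_%H%M%Z")
          echo "${TARGET}_${utctime}_${TOOL}"
      }

      # Each capture gets a fresh timestamp at the moment it starts
      tcpdump -w "$(make_filename).lpc"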

  • Bash script to find a directory, list its contents and sub-folder info

    - by lithiumion
    Hi, I want to write a script that will:

    1- locate the folder "store" on a *nix filesystem
    2- move into that folder
    3- print a list of its contents with last modification dates
    4- calculate sub-folder sizes

    This folder's absolute path changes from server to server, but the folder name always remains the same. There is a config file that contains the correct path to that folder, though it doesn't give the absolute path to it. Sample config:

      Account ON
      DIR-Store /hdd1
      Scheduled YES

    According to the config file, the absolute path would be "/hdd1/backup/store/". I need the script to grep the "/hdd1" (or whatever follows "DIR-Store"), add "/backup/store/" to it, move into the folder "store", print a list of its contents, and calculate sub-folder sizes. Until now I have manually edited the script on each server to reflect the path to the "store" folder. Here is a sample script:

      #!/bin/bash
      echo " "
      echo " "
      echo "Moving Into Directory"
      cd /hdd1/backup/store/
      echo "Listing Directory Content"
      echo " "
      ls -alh
      echo "*******************************"
      sleep 2
      echo " "
      echo "Calculating Backup Size"
      echo " "
      du -sh store/*
      echo "********** Done! **********"

    I know I could use grep:

      cat /etc/store.conf | grep DIR-Store

    I just don't know how to get around selecting the path, adding "/backup/store/", and moving ahead. Any help will be appreciated.
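    A sketch of one way to pull the base path out of the config and build on it, assuming the config line looks like the sample above (the file name /etc/store.conf and the /backup/store suffix are the ones from the question):

      #!/bin/bash
      CONF=/etc/store.conf

      # Take the second field of the "DIR-Store" line, e.g. "/hdd1"
      base=$(awk '/^DIR-Store/ {print $2}' "$CONF")
      store="$base/backup/store"

      if [ ! -d "$store" ]; then
          echo "Store directory not found: $store" >&2
          exit 1
      fi

      echo "Moving Into Directory: $store"
      cd "$store" || exit 1

      echo "Listing Directory Content"
      ls -alh

      echo "Calculating Backup Size"
      du -sh ./*/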

  • Not able to find scripts present in /etc/profile.d directory

    - by priya
    I am using Red Hat Linux 6.0 on a DaVinci board. I have to change the system clock resolution, so I am changing the HZ environment variable. For this I have written a script that sets HZ=1000 and put it in /etc/profile.d, and added a loop to /etc/profile so that, as usual, /etc/profile can load the scripts present in /etc/profile.d. But when I log into the system as root, it shows the error

      -bash: ./etc/profile.d/resolution.sh: No such file or directory

    (resolution.sh is my script name). Also, why is it showing ./etc and not /etc here? Is it something related to that? I also tried to add the script to /etc/init.d, but still no change in the value of HZ takes place. Please tell me where to make the change so that this environment variable gets changed. The script (resolution.sh) contains:

      #!/bin/bash
      export HZ=1000

    The content of /etc/profile which I entered is:

      if [ -d /etc/profile.d ]; then
          for i in /etc/profile.d/*.sh; do
              if [ -r $i ]; then
                  .$i
              fi
          done
          unset i
      fi

    And the output of the grep command is:

      -rw-r--r-- 1 root root  535 Feb  4  2004 profile
      -rwxr-xr-x 2 root root 4096 Feb  2  2004 profile.d
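    A sketch of the corrected loop; `.$i` is parsed as a single command name, which would explain the "./etc/profile.d/resolution.sh: No such file or directory" message, so the dot (source) needs a space after it:

      # /etc/profile fragment (sketch)
      if [ -d /etc/profile.d ]; then
          for i in /etc/profile.d/*.sh; do
              if [ -r "$i" ]; then
                  . "$i"    # note the space after the dot
              fi
          done
          unset i
      fi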

  • Multi valued profile property mapping to AD in Sharepoint

    - by keeno
    I'm trying to map my skills and responsibilities profile sections to one of the custom properties in Active Directory (extensionattribute1, 2, etc.). I'm entering comma-separated values in AD and it imports the values fine, but it sees the comma-separated values as one value on import, i.e. 'C#,asp.net,javascript' rather than 'C#', 'asp.net', 'Javascript'. Any ideas? I'm almost there; it's just not splitting the values correctly on import. Thanks in advance.

  • How do I ask screen to behave like a standard bash shell?

    - by thornomad
    I just learned about the screen command on Linux - it is genius. I love it. However, the actual terminal/prompt in screen looks and behaves differently from my standard bash prompt. That is, the colors aren't the same, tab completion doesn't seem to work, etc. Is there a way I can tell screen to behave just like a normal (at least, normal as in what I am used to) bash prompt?
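    A sketch of common ~/.screenrc tweaks, assuming the differences come from screen starting a non-login shell that skips your profile files (the shell path below is an example):

      # ~/.screenrc (sketch)

      # Start windows with a login shell (the leading "-" does that),
      # so ~/.bash_profile / ~/.profile are read as in a normal terminal
      shell -/bin/bash

      # Advertise 256-colour support so prompt and ls colours match the outer terminal
      term screen-256color

      # Skip the startup copyright message
      startup_message off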

  • How could I make a bash script to execute apt-get?

    - by poz2k4444
    I'm trying to automate some of my configuration with a bash script. I've never done this before, so I tried something easy like a Hello World! first and everything worked just fine, but then I tried something like this:

      #!/bin/bash
      sudo su
      apt-get purge postfix

    and it doesn't do anything. I checked and postfix is still installed, and it never asks me for any input. I've only tried it with apt-get so far, but I'll want to do things like ssh-keygen, or even write files, I guess with cat or something. How can I get the script working and also see what's going on?
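    A sketch of the usual fix: `sudo su` starts a separate root shell, and the remaining lines of the script run in the original, unprivileged shell only after that root shell exits, so prefixing the commands with sudo (or running the whole script with sudo) avoids this:

      #!/bin/bash
      set -x    # echo each command as it runs, so you can see what's going on

      # Either prefix individual commands with sudo...
      sudo apt-get -y purge postfix

      # ...or require that the whole script is started as root:
      # if [ "$(id -u)" -ne 0 ]; then
      #     echo "Please run this script with sudo" >&2
      #     exit 1
      # fi
      # apt-get -y purge postfix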

  • Entity Framework Code-First to Provide Replacement for ASP.NET Profile Provider

    - by Ken Cox [MVP]
    A while back, I coordinated a project to add support for the SQL Table Profile Provider in ASP.NET 4 Web Applications.  We urged Microsoft to improve ASP.NET’s built-in Profile support so our workaround wouldn’t be necessary. Instead, Microsoft plans to provide a replacement for ASP.NET Profile in a forthcoming release. In response to my feature suggestion on Connect, Microsoft says we should look for something even better using Entity Framework: “When code-first is officially released the final piece of a full replacement of the ASP.NET Profile will have arrived. Once code-first for EF4 is released, developers will have a really easy and very approachable way to create any arbitrary class, and automatically have the .NET Framework create a table to provide storage for that class. Furthermore developer will also have full LINQ-query capabilities against code-first classes. “ The downside is that there won’t be a way to retrofit this Profile replacement to pre- ASP.NET 4 Web applications. At least there’ll still be the MVP workaround code. It looks like it’s time for me to dig into a CTP of EF Code-First to see what’s available.   Scott Guthrie has been blogging about Code-First Development with Entity Framework 4. It’s not clear when the EF Code-First is coming, but my guess is that it’ll be part of the VS 2010/.NET 4 service pack.

  • Can you make a Windows network default user profile NOT apply to a certain operating system?

    - by Jordan Weinstein
    I would like to create a network Default User profile for Windows 7 only. This is on a Windows 2003 domain with servers from Windows 2000 to 2008 R2 and Windows XP on the workstation side. We're about to do a full migration to Windows 7 and I'd like to start using the network default user profile functionality, as we're not migrating user profiles over; I want everyone to start clean. I followed the simple steps from this page: http://support.microsoft.com/kb/973289 under the heading "How to turn the default user profile into a network default user profile in Windows 7 and in Windows Server 2008 R2", but the problem is that the profile would then also apply to a new user\admin logging into a 2008 server. That's no good. Anyone have any ideas on how to limit what actually uses that network profile? I was thinking about setting deny permissions for all my admin\service accounts on that "\\dcserver\netlogon\Default User.v2" folder, but then it might time out and cause other problems. I haven't tried it yet, as that seems like a bad way of making this work.

  • Utility for notifying a user that their roaming profile is getting too large to copy before shutdown?

    - by leeand00
    My users are having an issue with their roaming profiles getting too large, and then their roaming profile is lost. I believe this is because they are storing too much in their roaming profiles. Is there a program that can be installed in Windows that will listen for a logoff event, check the size of the user's roaming profile against a size limit I set, and, if the roaming profile is too big, notify the user that they have to decrease the size of the profile? Does a program like this exist or does it need to be written?
