Search Results

Search found 18592 results on 744 pages for 'basic unix commands'.

Page 8/744 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • Understanding Unix "Expect"

    - by zchtodd
    I don't think I properly understand the "expect" utility. While searching for a way to automate a build process that involves jar signing, I came across expect and thought I could use it to supply a password to jarsigner (I understand the risks of keeping a password in a shell script). My script contains the line:

        expect "Enter Passphrase for keystore:"

    Instead of catching this prompt, jarsigner just sat waiting at that line. Am I completely misunderstanding the point of "expect", and if I am, what can I use to achieve this effect? (a sketch follows after this entry)

    Read the article
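
    A minimal sketch of the usual pattern; the keystore, jar, alias and passphrase below are placeholders. The key point is that expect only sees prompts from processes it spawns itself, so a bare expect line inside an ordinary shell script has nothing to watch.

        #!/usr/bin/expect -f
        # Sketch only: expect must spawn jarsigner itself in order to see its
        # prompt; keystore, jar file, alias and passphrase are all placeholders.
        spawn jarsigner -keystore my.keystore app.jar myalias
        expect "Enter Passphrase for keystore:"
        send "s3cret\r"
        expect eof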

  • Determining the Source of a Given File System Mount on Unix [migrated]

    - by phobos51594
    Background: Recently I have run into a bit of a snag on my home FreeBSD server. I recently upgraded it to the latest stable release, and I have noticed some strange behavior with the /var partition. Originally, I had the system configured so that /var had its own partition, with /var/run and /var/log in memory disks (/tmp, too). After the upgrade, I noticed a new, fourth memory disk mounting directly onto /var that I had not set up manually and that is not in my fstab. It is only 28 MB or so in size and is causing problems when I try to update my ports collection. The ramdisk mounts automagically at boot and cannot be unmounted while in multi-user mode. If I drop to single-user mode, I am able to unmount it without issue; however, rebooting causes it to pop right back up. System specifications are included at the end of the post.

    Question: Is there any way to determine exactly what is mounting a given memory disk (or any filesystem, for that matter) after it has been mounted? Alternatively, does anybody have any ideas what might have caused the new /var ramdisk to pop up? (a sketch of where to look follows below)

    System Specification:

        # uname -a
        FreeBSD sarge 9.1-PRERELEASE FreeBSD 9.1-PRERELEASE #0: Thu Nov 22 14:02:13 PST 2012
            donut@sarge:/usr/obj/usr/src/sys/GENERIC  i386

        # df
        Filesystem  1K-blocks    Used   Avail Capacity  Mounted on
        /dev/da0s1a    515612  410728   63636    87%    /
        devfs               1       1       0   100%    /dev
        /dev/da0s1d    515612  287616  186748    61%    /var
        /dev/da0s1e   6667808 2292824 3841560    37%    /usr
        /dev/md0        63004      32   57932     0%    /tmp
        /dev/md1         3484       8    3200     0%    /var/run
        /dev/md2        31260       8   28752     0%    /var/log
        /dev/md3        31260     512   28248     2%    /var        <-- This

        # cat /etc/fstab
        # Device      Mountpoint  FStype  Options           Dump  Pass#
        /dev/da0s1a   /           ufs     rw,noatime        1     1
        /dev/da0s1d   /var        ufs     rw,noatime        2     2
        /dev/da0s1e   /usr        ufs     rw,noatime        2     2
        md            /tmp        mfs     rw,-s64M,noatime  0     0
        md            /var/run    mfs     rw,-s4M,noatime   0     0
        md            /var/log    mfs     rw,-s32M,noatime  0     0

    Thank you in advance for any assistance.

    Read the article
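
    For what it's worth, on FreeBSD an extra md like this is usually created by the rc framework rather than by fstab, so that is one place to look; a rough sketch of where to hunt (the rc.conf variable names are the standard ones on 9.x, but worth verifying on your release):

        # List the md devices and how each was created (swap/vnode/malloc backed)
        mdconfig -lv

        # Memory-disk knobs that the boot scripts honour
        grep -E 'varmfs|tmpmfs|populate_var' /etc/defaults/rc.conf /etc/rc.conf

        # Find the rc.d script that actually runs mdmfs for /var
        grep -rl mdmfs /etc/rc.d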

  • unix script problem

    - by Darie Nicolae
    Hello everyone, I have a simple script which runs on a FreeBSD machine with the following code:

        #!/bin/sh
        `sed -i .bak '\:#start 172.0.0.3:,\:#end 172.0.0.3:d' /usr/local/etc/racoon/racoon.conf`
        echo $?

    It should delete a block of text between the two patterns. The problem is that if I run the sed command directly from the shell it works, but when I run it from the script it doesn't, even though the return code is 0. Why is that? (see the sketch below)

    Read the article
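
    One thing worth noting: the backticks make the shell run the sed command via command substitution and then try to execute its (empty) output, which can hide what actually happened. A plainer sketch of the same script, with the path and patterns taken from the question:

        #!/bin/sh
        # BSD sed wants the -i backup suffix as a separate word (use '' for no backup).
        sed -i .bak '\:#start 172.0.0.3:,\:#end 172.0.0.3:d' /usr/local/etc/racoon/racoon.conf
        echo "sed exit status: $?"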

  • strange behaviour of grep in UNIX

    - by Happy Mittal
    When I type the command

        $ grep \\h junk

    the shell should interpret \\h as \h (each pair of backslashes becomes a single backslash), and grep in turn should interpret \h, so grep should search for the pattern \h in junk, which it does successfully. But the same reasoning doesn't seem to work for \\$. Please explain why? (a quoting sketch follows below)

    Read the article
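
    One way to take the shell's own escaping out of the picture is to single-quote the pattern, so grep receives exactly what is written; a small sketch (the file junk is the one from the question):

        grep '\$' junk    # grep sees \$  -> matches lines containing a literal "$"
        grep '$'  junk    # grep sees $   -> end-of-line anchor, matches every line
        grep \\$  junk    # the shell turns \\$ into \$, so this is the literal-"$" case again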

  • Troubleshoot telnet connection from Windows 7 to UNIX

    - by Sujay Ghosh
    I am trying to connect to an Asterisk server in the USA. I am using telnet <IP address> 5038 from India to the USA. The person in the USA is able to telnet to the IP address and port from the USA, but I am not able to do it from India. We are on different networks. I am using Windows 7 Ultimate and have enabled the Telnet client. I have also used PuTTY without any success. Can someone suggest what the problem might be and how it can be resolved? (a few server-side checks are sketched below)

    Read the article
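
    If the same address and port work from one network but not from another, the usual suspects are the manager interface's bind address and a firewall in between; a couple of server-side checks one might run on the Asterisk box (Linux tooling assumed):

        # Is the Asterisk manager listening on a public address or only on 127.0.0.1?
        netstat -ltn | grep 5038

        # Is a host firewall restricting which source networks may connect?
        iptables -L -n | grep 5038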

  • Unix tool for splitting archives

    - by Richo
    I'm dumping an svn repository to a giant USB disk that is formatted FAT out of necessity (treat this as unchangeable). It conks out when you try to create a file larger than 4 GB. I need a tool that I can pipe data to that will create files of a size I choose which, when catted together, reproduce the original file. I can write a tool to do this, but if one already exists I'd rather use it. Cheers. EDIT: A second look at the split man page suggests it might do exactly this (a sketch is below).

    Read the article
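
    A sketch of the split approach; the repository path and mount point are placeholders, and 2000m keeps each piece safely under FAT's 4 GB limit:

        # Dump the repository and split the stream into pieces FAT can hold;
        # output names come out as repo.dump.aa, repo.dump.ab, ...
        svnadmin dump /path/to/repo | split -b 2000m - /mnt/usb/repo.dump.

        # Reassemble later:
        cat /mnt/usb/repo.dump.* > repo.dump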

  • Server monitoring for medium scale UNIX network

    - by nbartolomeo
    I'm looking for suggestions for a good monitoring tool, or tools, to handle a mixed Linux (RedHat 4-5) and HP-UX environment. Currently we are using Hobbit, which is working reasonably well, but it is becoming harder to keep track of what alerts are sent out for which servers. Features I'd like to see: easy configuration of servers, and the ability to monitor CPU, network, memory, and specific processes. I've looked into Nagios, but from what I have seen it won't be easy to set up the configuration for all of our ~200 servers, and without installing a plugin into each agent I won't be able to monitor processes.

    Read the article

  • UNIX "find" command, match literal "dot"

    - by Robottinosino
    I need files ending with ".pdf" or ".png"; here's my attempt:

        find /Users/robottinosino/Desktop/_PublishMe_ -type f -regex '.*[pdf|png]'

    This incorrectly includes files ending with "Apdf", "Zpdf", etc. (missing the literal dot before the file extension). I tried adjusting the pattern to:

        find /Users/robottinosino/Desktop/_PublishMe_ -type f -regex '.*\.[pdf|png]'

    but then no results are returned. Escaping the . with a backslash does not work. Why? (see the sketch below)

        $ uname -a
        Darwin Robottinosino.local 10.8.0 Darwin Kernel Version 10.8.0: Tue Jun 7 16:33:36 PDT 2011; root:xnu-1504.15.3~1/RELEASE_I386 i386

    Thanks!

    Read the article
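
    In find's regex, [pdf|png] is a bracket expression (any single one of those characters), not alternation, and the default regex flavour differs between BSD and GNU find. A sketch of variants that should behave as intended, using the same path as in the question:

        # macOS / BSD find: -E switches to extended regexes, where (a|b) works
        find -E /Users/robottinosino/Desktop/_PublishMe_ -type f -regex '.*\.(pdf|png)'

        # GNU find equivalent, selecting the regex flavour explicitly
        find /Users/robottinosino/Desktop/_PublishMe_ -regextype posix-extended -type f -regex '.*\.(pdf|png)'

        # Or avoid regexes altogether
        find /Users/robottinosino/Desktop/_PublishMe_ -type f \( -name '*.pdf' -o -name '*.png' \)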

  • Where to put unix sockets

    - by James Willson
    I am new to this, so sorry if it's obvious. I am running a Debian server and installing the likes of uWSGI, Nginx, etc. on there. The configurations keep talking about pointing to "sockets". In the build options I seem to be able to specify where the sockets for each program go. By default it looks like most of them go in /tmp/ (not all of them). Is this a good place for them to go? I'm trying to keep things as organised as possible, but just bunging them in my tmp directory doesn't seem like the best option. (a sketch of a common layout is below)

    Read the article
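
    A common convention is a per-application directory under /run (or /var/run, which points at the same place on current Debian) rather than /tmp; a sketch, with the application name invented for illustration:

        # Create a dedicated socket directory owned by the web-server user
        mkdir -p /run/uwsgi
        chown www-data:www-data /run/uwsgi

        # Then point both sides at the same path, e.g.
        #   uwsgi:  --socket /run/uwsgi/myapp.sock --chmod-socket=660
        #   nginx:  uwsgi_pass unix:/run/uwsgi/myapp.sock;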

  • After changing a file from Unix to DOS format it has empty lines: how to handle this?

    - by user2814717
    After converting a file with the unix2dos command it has some empty lines. Please help me handle this. I tried to delete the empty lines as follows, but it didn't work:

        $ sed '/^$/d' /tmp/data.txt

    The other variations I tried didn't work either; please help. This is the source data before using unix2dos:

        ID NAME   DATE
        1  BALA   09/23/2013
        2  KRISHH 09/24/2013
        3  billy  09/24/2013

    After using unix2dos it comes out as:

        ID NAME   DATE
        1  BALA   09/23/2013

        2  KRISHH 09/24/2013
        3  billy  09/24/2013

    Between the first and second records an empty line shows up, and maybe in between other data as well. Thanks. (see the sketch below)

    Read the article
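
    One thing that trips this up: after unix2dos every line ends in a carriage return, so a "blank" line still contains a \r and /^$/ no longer matches it. A sketch that normalizes back to LF, removes the empty lines, and converts again (path taken from the question):

        tr -d '\r' < /tmp/data.txt > /tmp/data.unix    # back to plain LF endings
        sed '/^$/d' /tmp/data.unix > /tmp/data.clean   # now truly-empty lines match
        unix2dos /tmp/data.clean                       # convert the cleaned file to CRLF in place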

  • Limit unix users file access

    - by Michael
    Hello, I just created a new user on my server, but I only want this user to have access to /var/www/ and all the files/folders inside it. They should be able to access no other files on the server except those. How would I do this? Thanks! (a sketch of one approach follows below)

    Read the article
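
    A minimal sketch of the group-based approach (the group and account names are placeholders). Note that this limits write access to /var/www but does not stop the user from reading other world-readable files, which would need a chroot or similar:

        groupadd webedit
        usermod -aG webedit newuser      # add the new account to the group
        chgrp -R webedit /var/www
        chmod -R g+rwX /var/www          # group can read, write, and traverse directories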

  • Understanding Unix Permissions (w/ ACL)

    - by Dr. DOT
    I am trying to set permissions on my server properly. Currently I have a number of directories and files chmod'd at 0777, but I am not comfortable with it being this way. So at the advice of a serverfault specialist, I had my hosting provider install ACL on my shared virtual server.

    When I FTP to the server as my FTP user account "abc", I can do everything I need to do (and rightfully so) because all my dirs and files are owned by "abc", the group is "abc", and the 1st octet is set to 7 (rwx). That much I get. But here's where it gets dark gray for me. PHP is set to user "nobody", so when someone browses one of my web pages that either ends in .php or has some embedded PHP, I assume the last octet controls the access. Because all my dirs and files are owned by "abc" and assigned to group "abc", if the last octet were a 4 (r--) then the server would let the browser read the file. If it were a 6 (rw-) then the server would let the browser also write to the file or directory, correct? What if the web document does not end in .php or does not have any PHP embedded? What is the user then? How can I use ACL so I don't have to set the permission to 6 (rw-) or even 7 (rwx)? [not sure what execute does or means]

    I'm just looking for some sort of policy settings to best lock down my dirs and files while allowing my PHP scripts to do uploads and write to files (so my users don't call me to tell me "permission denied"). OK, thanks to anyone out there willing to lend me a hand. It is greatly appreciated. (a setfacl sketch follows below)

    Read the article
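
    A minimal setfacl sketch for this kind of setup, assuming PHP really does run as "nobody" and using made-up paths; the idea is to grant that one user access to the directories PHP must write, instead of widening the last octet for everyone:

        # Let "nobody" write to the uploads tree (capital X = execute on directories only)
        setfacl -R -m u:nobody:rwX /home/abc/public_html/uploads
        setfacl -R -d -m u:nobody:rwX /home/abc/public_html/uploads   # default ACL for new files

        # Everything else stays closed to "other"
        chmod -R o-rwx /home/abc/public_html
        getfacl /home/abc/public_html/uploads   # verify what actually applies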

  • Complex (?) Unix Text Replace Command

    - by Matrym
    What's the command line equivalent of: for every file that contains "AAA" within its contents, find "BBB" and replace it with "CCC"? Thus, the command would match and replace BBB in a file like this:

        <html>
        <head></head>
        <body>
        AAA
        Hello world!
        BBB
        </body>
        </html>

    But not in a file like this:

        <html>
        <head></head>
        <body>
        Don't match me!
        BBB
        </body>
        </html>

    Thanks in advance! (see the sketch below)

    Read the article
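
    A common recipe is to let grep pick the files and sed do the substitution; a sketch (GNU tools assumed, and filenames without spaces):

        # Edit, in place, only the files that contain AAA
        grep -rl 'AAA' . | xargs sed -i 's/BBB/CCC/g'

        # Same idea without xargs, one file at a time (BSD sed would need -i '')
        find . -type f -exec grep -q 'AAA' {} \; -exec sed -i 's/BBB/CCC/g' {} \;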

  • Unix sort 10x slower with keys specified

    - by KenFar
    My data: it's a 71 MB file with 1.5 million rows. It has 6 fields, four of which are strings of avg. 15 characters and two of which are integers. Three of the fields are sometimes empty. All six fields combine to form a unique key, and that's what I need to sort on.

    Sort statement:

        sort -t ',' -k1,1 -k2,2 -k3,3 -k4,4 -k5,5 -k6,6 -o a_out.csv a_in.csv

    The problem: if I sort without keys, it takes 30 seconds. If I sort with keys, it takes 660 seconds. I need to sort with keys to keep this generic and useful for other files that have non-key fields as well. The 30-second timing is fine, but the 660 is a killer. I could theoretically move the temp directory to SSD, and/or split the file into 4 parts, sort them separately (in parallel), then merge the results, etc. But I'm hoping for something simpler since these results are so bad as-is. Any suggestions? (a sketch of one thing to try follows below)

    Read the article
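
    One thing that often accounts for much of such a gap is locale-aware string comparison; forcing the C locale makes the key comparisons byte-wise. Not guaranteed to close the whole difference, but cheap to try:

        LC_ALL=C sort -t ',' -k1,1 -k2,2 -k3,3 -k4,4 -k5,5 -k6,6 -o a_out.csv a_in.csv
        # GNU sort can also be given more memory and threads, e.g. -S 1G --parallel=4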

  • Unix 'find' command to include/exclude subdirectories

    - by Stan
    Say the folder structure looks like this:

        .
        |--folder1
        |  |--subfolder1
        |  |--subfolder2
        |  |--subfolder2
        |--folder2
        |  |--subfolder1
        |  |--subfolder2
        |  |--subfolder2
        |--folder3
           |--subfolder1
           |--subfolder2

    I would like to find all files in subfolder2 only. I know I can just do this:

        $ find . -type f | grep subfolder2

    But I was wondering if find comes with an option to include/exclude given directories? (see the sketch below)

    Read the article
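
    find can do both with -path and -prune; a sketch against the layout above:

        # Only files that live under a directory named subfolder2
        find . -type f -path '*/subfolder2/*'

        # The opposite: skip subfolder2 directories entirely
        find . -type d -name subfolder2 -prune -o -type f -print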

  • Remove all files except for a few, from a folder in Unix

    - by nikhil
    I often face this problem. I have a set of files in a folder and would like to delete all of them except a few. For example: I have files named according to the date of creation (like 11-1-11.tar, 10-1-11.tar and so on). Now I would like to delete files like 10-1-11, 9-1-11 and so on, but not some other files. Basically I would like to specify which files should be deleted and which should be retained. How would I do this? (two approaches are sketched below)

    Read the article
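
    Two common approaches, with the names to keep obviously placeholders:

        # bash: enable extended globbing and delete everything except the listed names
        shopt -s extglob
        rm -- !(11-1-11.tar|keepme.conf)

        # Alternative with find, restricted to the current directory
        find . -maxdepth 1 -type f ! -name '11-1-11.tar' ! -name 'keepme.conf' -delete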

  • Limiting memory usage and minimizing swap thrashing on Unix / Linux

    - by camelccc
    I have a few machines that I use for running large numbers of jobs, where I try to limit the number of jobs so as not to exceed the available RAM of the machine. Occasionally I mis-estimate how much memory some of the jobs will take, and the machine starts thrashing the swap file. I resolve this by sending kill -s STOP to one of the jobs so that it can get swapped out. Does anyone know of a utility that will monitor a server for processes with a specific name, and then pause the one with the smallest memory footprint if the total memory consumption reaches a desired threshold, so that the larger ones can run and complete with a minimum of swap-file thrashing? Paused processes then need to be resumed once some existing processes have completed. (a rough sketch of such a loop is below)

    Read the article
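
    A rough sketch of the kind of watchdog loop such a utility would implement (Linux procps ps assumed; the process name and 12 GB threshold are placeholders, and resuming via kill -CONT is left out):

        #!/bin/sh
        NAME=myjob
        LIMIT_KB=$((12 * 1024 * 1024))
        while true; do
            # Total resident memory of all processes with that name, in kB
            total=$(ps -C "$NAME" -o rss= | awk '{s+=$1} END {print s+0}')
            if [ "$total" -gt "$LIMIT_KB" ]; then
                # Pause the instance with the smallest footprint
                victim=$(ps -C "$NAME" -o pid=,rss= | sort -k2 -n | head -n 1 | awk '{print $1}')
                [ -n "$victim" ] && kill -STOP "$victim"
            fi
            sleep 30
        done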

  • Utility to record IO statistics (random/sequential, block sizes, read/write ratio) in Unix

    - by Michael Pearson
    As part of provisioning our new server (see other SF) I'd like to find out the following: the ratio of random to sequential reads & writes, and the amount of data read & written at a time (preferably in histogram form). I can already figure out our reads/writes on a per-operation and overall data level using iostat & dstat, but I'd like to know more. For example, I'd like to know that we're mostly doing random 16 kB reads, or a lot of sequential 64 kB reads with random writes. We're (currently) on an Ubuntu 10.04 VM. Is there a utility that I can run that will record and present this information for me? (a sketch using blktrace follows below)

    Read the article
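
    blktrace/blkparse/btt are one stack that records this level of detail; a sketch, with /dev/sda standing in for the real device (needs root and the blktrace package):

        # Capture block-layer events for 60 seconds
        blktrace -d /dev/sda -w 60 -o trace

        # Human-readable event stream, plus a binary dump for btt
        blkparse -i trace -d trace.bin

        # btt summarises the dump: I/O sizes, seek distances, queue behaviour
        btt -i trace.bin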

  • Unix: Search for file contents

    - by Svish
    I find the find . -name "some-file" command very useful for listing all files matching some file name in a folder. Is there anything similar I can use to list all files that contain a given string? If you needed to find all files in a directory that had a certain string of text in them, what would you use? (a couple of examples are sketched below)

    Read the article
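
    A couple of sketches of the usual answers ("some string" and the *.conf filter are placeholders):

        # Recursive content search; -l prints only the names of matching files
        grep -rl "some string" .

        # Restrict the search to particular files via find
        find . -name '*.conf' -exec grep -l "some string" {} +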
