Search Results

Search found 3548 results on 142 pages for 'unix'.

Page 24/142 | < Previous Page | 20 21 22 23 24 25 26 27 28 29 30 31  | Next Page >

  • subdomain/virtualhost problem on unix + apache

    - by Aaron
    Hello, I'm having a strangely difficult time setting up a subdomain (x.example.com). The main site works fine, but I get 404 errors attempting to hit x.example.com no matter how I set up the VirtualHost config.

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName www.example.com
            DocumentRoot /var/www/example.com/htdocs
            ServerAlias example.com
        </VirtualHost>

        <VirtualHost *:80>
            ServerName x.example.com
            ErrorLog /var/logs/x-error-log
            CustomLog /var/logs/x-access-log common
            DocumentRoot /var/www/x/htdocs
        </VirtualHost>

    As far as I can tell, this is a vanilla set up. Any suggestions would be appreciated.
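
    One thing worth checking is how Apache actually parsed the name-based virtual hosts, and whether x.example.com resolves at all. A quick hedged check (standard commands; nothing beyond the poster's hostnames is assumed):

        apachectl -S          # dump the parsed VirtualHost blocks and their ServerNames
        host x.example.com    # confirm the subdomain actually has a DNS record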

    Read the article

  • execute a command in all subdirectories bash

    - by Luigi R. Viggiano
    I have a directory structure composed of: iTunes/Music/${author}/${album}/${song.mp3}. I implemented a script to strip my mp3 bitrate to 128 kbps using lame (which works on a single file at a time). My script, 'normalize_mp3.sh', looks like this:

        #!/bin/bash
        SAVEIFS=$IFS
        IFS=$(echo -en "\n\b")
        for f in *.mp3
        do
            lame --cbr $f __out.mp3
            mv __out.mp3 $f
        done
        IFS=$SAVEIFS

    This works fine if I go folder by folder and execute this command. But I'd like to have a "global" command, like in 4DOS, so I can run:

        $ cd iTunes/Music
        $ global normalize_mp3.sh

    and the global command would traverse all subdirs and execute normalize_mp3.sh to strip all my mp3s in all subfolders. Does anyone know if there is a unix equivalent to the 4DOS global command? I tried to play with find -exec but I just managed to get a headache.
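
    A hedged sketch of one way to get the "global" behaviour with standard tools, assuming normalize_mp3.sh is on the PATH (otherwise give its full path): find walks every subdirectory and the loop runs the script inside each directory that contains at least one .mp3, which also copes with the spaces in typical iTunes paths:

        #!/bin/bash
        # Run normalize_mp3.sh inside every subdirectory holding .mp3 files
        find . -type d -print0 | while IFS= read -r -d '' dir; do
            if ls "$dir"/*.mp3 >/dev/null 2>&1; then
                (cd "$dir" && normalize_mp3.sh)
            fi
        done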

    Read the article

  • Open source command line tools for indexing a large number of text files

    - by ergosys
    I'm looking for any open source command line tool or tools which will allow me to index and search a large number of plain text files. Approximate search would be a plus. The tool only needs to print the files that match, although some match context would be useful. A GUI tool isn't useful for my application, nor is anything that searches files one by one (grep for example). I'm basically targeting unix platforms (osx, linux, bsd). EDIT: I'm not interested in any sort of tool that is system-wide, or needs to run in the background. Basically, I want to build an index for a directory tree full of text files and then later be able to search against it. Preferably the index is one or a few files that I can specify the location of. Any ideas?

    Read the article

  • gnu screen - mouse does not work in nested screen session

    - by Matthew
    I started a screen session inside another screen session, both on my local machine. This is using cygwin, but I don't think it matters. I have tried via ssh to a real unix machine but the behaviour is the same. Mouse works great in the first screen session, I'm able to open vim with :set mouse=a and I can click to move the cursor or switch tabs, and the mouse wheel scrolls. But in the nested session it does not work, mouse is only useful for selecting terminal text that gets put in the clipboard, but is not able to interact with vim. I want this to work because I usually work with a local screen session, then ssh to a remote server and have a remote screen session running too (hence the nesting) and I like to scroll swiftly in vim by using the mouse wheel. Can anyone tell me why the mouse works in the first layer of screen but not in the second, nested screen session, and how I can make it work? Thanks in advance, Matthew

    Read the article

  • What do these "Cron Daemon" email errors mean?

    - by Meltemi
    Anyone know what this means? Getting one of these every minute in one user's inbox:

        From: Cron Daemon <[email protected]>
        Subject: Cron <joe@mail> /tmp/.d/update >/dev/null 2>&1
        To: [email protected]
        Received: from murder ([unix socket])
                  by mail.domain.com (Cyrus v2.2.12-OS X 10.3) with LMTPA;
                  Tue, 04 May 2010 10:35:00 -0700

        shell-init: could not get current directory: getcwd: cannot access parent directories: Permission denied
        job-working-directory: could not get current directory: getcwd: cannot access parent directories: Permission denied
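
    The text after the headers is the job's own output: the shell that cron spawns for joe's job cannot even read its working directory. A hedged first step is to look at what is installed and what it points at (the username "joe" is taken from the Subject line; the commands themselves are standard):

        sudo crontab -l -u joe    # list joe's cron entries, including the /tmp/.d/update job
        ls -la /tmp/.d/           # inspect the script that the job actually runs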

    Read the article

  • LPR Printing InputSlot Issues

    - by Jimmy
    I am printing from a Unix system to a Xerox Phaser 4510 (although I have also had issues printing to a Xerox WorkCentre 7765). I am using LPR and trying to choose the InputSlot. lpoptions shows the following options:

        InputSlot/Paper Tray: Tray1 *Tray2 Tray3 Tray4 ManualFeed Tray6

    And here is a sample command that I am using:

        lpr -P printer -o InputSlot=Tray1 test.pdf.ps

    Here is the problem:

        Tray1 = gives me Tray2
        Tray2 = gives me Tray3
        Tray3 = gives me Tray4
        Tray4, ManualFeed, Tray6 = Tray2 (the default tray)

    If I change the default tray in LPR, in the printer settings, or in both, LPR still sends Tray2 as the default, whereas printing from a Windows machine or a Mac uses the new default tray. I have also tried Tray0 and several other things, but I have not found a way to get it to print on tray 1. Any ideas?
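
    A hedged way to double-check what the PPD really calls each tray and what CUPS thinks the defaults are (the queue name "printer" is the poster's):

        lpoptions -p printer -l    # list every option with its allowed values, default marked with *
        lpstat -l -p printer       # show the queue's details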

    Read the article

  • PowerShell not displaying Unix colors

    - by Paul Nathan
    I use various Linux programs on my machine; some of them have colorized output. However, Windows PowerShell does not interpret the ANSI color escape sequences they emit; I get output like ?[0m31m (the raw color control code) rendered instead of the color. Is there a way around this?

    Read the article

  • Unix copy command that has a progress bar, but not as heavy as rsync

    - by Rory McCann
    I need to copy lots of files. Usually I use rsync because I can pass it the -aP options and see (a) how many files are left to process and (b) how much of each individual file has been copied. However, rsync also does a lot of checksum work to verify that each file was copied, and I don't really need that here. Normal cp, on the other hand, doesn't show the count of files left, which is very helpful. Is there anything like cp that shows how many files are left, but isn't as heavy as rsync?
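
    Two hedged alternatives, assuming GNU userland and pv installed: rsync's -W/--whole-file keeps the per-file counter but skips the delta-transfer step, and a tar pipe through pv gives a single running progress readout for the whole copy:

        rsync -aP --whole-file src/ dst/                # per-file progress, no delta-transfer work
        tar -C src -cf - . | pv | tar -C dst -xf -      # one overall progress meter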

    Read the article

  • Reasonable automatic HTML to PDF conversion (in UNIX/Linux environment)

    - by Alex Balashov
    Is there a way to generate PDF documents from HTML files automatically in Linux where the PDF offers some kind of reasonable level of resemblance to the input file? A command-line tool - as opposed to an interactive GUI of some kind - is key. I have tried htmldoc and some related cousins, of course. But these tools are hopelessly stone-age; htmldoc doesn't support CSS at all. You won't find a lot of HTML documents these days that don't have at least some CSS styling. I don't really care about stupid effects or minor embellishments, but the issue is that CSS is at the core of most layouts these days; not many folks are using 6 layers of nested tables anymore. So, if the conversion tool has no grasp of CSS whatsoever, it's not just a matter of "the document doesn't look quite right"; it is likely to not meet the minimum standard of usability at all. It has been suggested to me by some folks to try to use the Gecko rendering engine to generate images that can be converted to PDFs, but I have no idea how one would go about doing this, let alone easily. I have no trouble believing that there are good commercial tools that do this, but I'm really looking for an open-source package if possible, as the endeavour itself is an open-source one and doesn't pay. Thanks in advance!
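
    One open-source candidate worth trying is wkhtmltopdf, which drives the WebKit rendering engine from the command line and therefore copes with CSS far better than htmldoc; a minimal hedged invocation:

        wkhtmltopdf input.html output.pdf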

    Read the article

  • Give apache write access to DocumentRoot on dev server

    - by Abhi Beckert
    I've got apache running on my mac workstation (OS X 10.7, with the pre-installed copy of apache), and our web applications require write access to certain sections of the filesystem to run (usually just a tmp dir, but sometimes more than that). We have (literally) thousands of clients, and I want to be able to quickly grab a copy of any website's code, and have it "just work", however I always need to manually modify the unix permissions of specific directories after pulling a client's website out of source control (the list of directories varies from one client to another, as it has changed over the years). Since it's a dev server, firewalled off from the general internet, I would like to give apache/php write access to the entire DocumentRoot. How can I do this? I tried chmod 777 on the DocumentRoot, but if I create a directory inside it, the permissions are still 755 (owner: me, group: wheel). I think there should be a way to force all files created inside DocumentRoot to be 777 or perhaps 775, with the _www user added to the wheel group?
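
    A hedged sketch of the usual group-writable approach, assuming Apache runs as _www (the stock OS X setup) and using /Library/WebServer/Documents as a stand-in for the actual DocumentRoot: give the tree to a group containing _www, make it group-writable, and set the setgid bit on directories so new subdirectories inherit the group:

        DOCROOT=/Library/WebServer/Documents            # placeholder; substitute the real DocumentRoot
        sudo chgrp -R _www "$DOCROOT"
        sudo chmod -R g+rwX "$DOCROOT"
        sudo find "$DOCROOT" -type d -exec chmod g+s {} +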

    Read the article

  • Performance monitoring on Linux/Unix

    - by ervingsb
    I run a few Windows servers as well as (Debian and Ubuntu) Linux and AIX servers. I would like to continuously monitor performance on these systems in order to easily identify bottlenecks and get an overview of the general activity on the servers. On Windows, I use Windows Performance Monitor (perfmon) for this. I set up these counters:

    For bottlenecks:
        Processor utilization : System\Processor Queue Length
        Memory utilization    : Memory\Pages Input/Sec
        Disk Utilization      : PhysicalDisk\Current Disk Queue Length\driveletter
        Network problems      : Network Interface\Output Queue Length\nic name

    For general activity:
        Processor utilization : Processor\% Processor Time_Total
        Memory utilization    : Process\Working Set_Total (or per specific process)
        Memory utilization    : Memory\Available MBytes
        Disk Utilization      : PhysicalDisk\Bytes/sec_Total (or per process)
        Network Utilization   : Network Interface\Bytes Total/Sec\nic name

    (More information on the choice of these counters: http://itcookbook.net/blog/windows-perfmon-top-ten-counters)

    This works really well. It allows me to look in one place and identify the most common bottlenecks. So my question is, how can I do something equivalent (or just very similar) on Linux servers? I have looked a bit at nmon (http://www.ibm.com/developerworks/aix/library/au-analyze_aix/), which is a free performance monitoring tool developed for AIX but also available for Linux. However, I am not sure whether nmon lets me set up the above counters; maybe Linux and AIX simply do not expose these exact same measures. If so, which ones should I choose and why? And if nmon is not the tool to use for this, what do you recommend?
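
    A hedged mapping onto standard Linux tools (procps and the sysstat package, so nmon is not strictly required): vmstat's "r" column is the run queue, "si"/"so" are pages swapped in and out, iostat -x shows per-device queue size and throughput, and sar -n DEV covers the NICs:

        vmstat 5         # run queue (r), memory, swap in/out (si/so)
        iostat -x 5      # per-device utilisation, average queue size, bytes/sec
        sar -n DEV 5     # per-interface traffic counters
        free -m          # memory overview in MB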

    Read the article

  • What is the standard place for static library files on Unix/Ubuntu

    - by Max
    Hi, I am trying to install a library manually; well, actually just put it in a sensible location, preferably in my LIB path. I have a lib[...].a file and a bunch of headers pertaining to that static library file. If I look under /usr/lib/ I see only .so files, likewise for /lib/, /lib32/ etc. I figure I could chuck it in there, but is there any place where it can get cozy with other .a files, or is that as good a place as any? I'm not a library expert, but I'm pretty sure it won't matter functionally; I'd just like to learn conventional best practice. Also, where is the standard place to put the headers? Thanks!
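
    The conventional home for a locally installed, hand-built static library is under /usr/local, which the package manager leaves alone; a hedged sketch with the file names as placeholders:

        sudo cp libfoo.a /usr/local/lib/        # libfoo.a stands in for the real lib[...].a
        sudo cp foo.h /usr/local/include/       # likewise for the headers
        # link against it with: gcc main.c -lfoo    (add -L/usr/local/lib if the linker misses it)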

    Read the article

  • Suppress EXT3-fs warning on mount

    - by STM
    I am familiar with output suppression on Unix machines, i.e.:

        cat /file/that/doesnt/exist > /dev/null 2>&1

    However I can't seem to suppress the output of mount when an ext3 filesystem is mounted for the nth time and it recommends an fsck. As it happens, fscks are run regularly by another machine, so these warning messages are needlessly interrupting the flow of output to my pretty bash script. These are the errors:

        # mount -t ext3 /dev/sda1 /mnt > /dev/null 2>&1
        kjournald starting.  Commit interval 5 seconds
        EXT3-fs warning: maximal mount count reached, running e2fsck is recommended
        EXT3 FS 2.4-0.9.19, 19 August 2002 on sd(8,1), internal journal
        EXT3-fs: mounted filesystem with ordered data mode.

    Can anyone shed some light on this? I'm clearly blocking both fds, but somehow output is still getting through. This is GNU Bash v2.05a.
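
    A hedged explanation: those lines are not mount's stdout or stderr at all; they are kernel printk messages echoed straight to the console, which is why the shell redirections have no effect. One workaround is to lower the console log level around the mount (the exact levels are a judgement call):

        dmesg -n 1                                        # only emergency messages reach the console
        mount -t ext3 /dev/sda1 /mnt > /dev/null 2>&1
        dmesg -n 7                                        # restore a more verbose console level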

    Read the article

  • Perform shell operation through secure shell

    - by Ben
    Is it possible to perform a shell operation from a bash script through a secure shell? Here is an example of why you might want to do this. Let's say you have a simple unix operating system that you only need to build and run on, but you want to do all of the development on another machine. I want to write a bash script that does the following:

        1. scp a file to a location on the other machine
        2. ssh to the other machine
        3. cd into the correct directory
        4. make
        5. run the program
        6. scp the results back to a file on the original computer
        7. exit ssh

    Is this remotely possible? (Pardon the pun :p)
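
    Entirely possible; a hedged sketch of the whole round trip, with the host name, paths and program name as placeholders:

        #!/bin/bash
        set -e
        scp program.c builduser@buildhost:/home/builduser/src/                          # 1. copy the file over
        ssh builduser@buildhost 'cd /home/builduser/src && make && ./program > out.txt' # 2-5. build and run remotely
        scp builduser@buildhost:/home/builduser/src/out.txt .                           # 6. fetch the results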

    Read the article

  • Grant HTTP access based on unix user group

    - by Sander Marechal
    Is it possible to grant network access or HTTP access based on a user's group? At my company we want to set up an internal composer server using Satis to manage packages for the projects we write (e.g. on repository.mycompany.com), with the packages themselves in our SVN server (svn.mycompany.com). We have several webservers with many different users on them. Some users should be able to reach the composer and SVN server. Some should not. Users that should be able to reach these servers all belong to the same group. How can I set up Apache on the Composer and SVN server to only grant access to those users in that group? Alternatively, can I set up the webservers in such a way that only users from that group are able to make a connection to our Composer and SVN servers? The best thing we have come up with so far is using SSL client certificates. We simply place a client certificate on all servers which can be used to access Composer and SVN. Only the right usergroup will have read access to the certificate. A bit clunky but it may work. But I'm looking for something better.
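
    If the group can be expressed as (or synced into) an Apache group file, stock HTTP authentication already covers this; a hedged sketch using mod_auth_basic with mod_authz_groupfile (paths, realm and group name are placeholders). Third-party modules such as mod_authz_unixgroup exist for checking real unix group membership, but that is a separate install:

        <Location />
            AuthType Basic
            AuthName "Company repositories"
            AuthUserFile /etc/apache2/htpasswd
            AuthGroupFile /etc/apache2/htgroups
            Require group developers
        </Location>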

    Read the article

  • How to send SMS Using SMS Email Gateway internationally?

    - by user35259
    Hi all, I don't know if this is the right place to post my question, but anyway, I will just ask. I'm trying to send SMS messages from a unix-based machine by sending mail to [email protected]. This works fine and the messages are received normally. What I want to ask is: am I able to use this service to send SMS messages to an international number? What about the country code? How do I add it when the format only accepts ten digits? Thanks

    Read the article

  • Determine/resolve filepath/alias of a certain command in the Windows command prompt

    - by porg
    How can I find out which filepath (or alias) a certain command will resolve to in the Windows command prompt? Specifically Windows XP; info on other versions is also appreciated! On Unix systems I simply use:

        $ which commandname
        /a/commandname

    Or:

        $ type -a commandname
        commandname is aliased to `/b/commandname'
        commandname is /a/commandname
        commandname is /b/commandname

    And I am simply looking for the equivalent in the Windows shell (specifically Win XP). I came to this general question from a specific issue: I had installed robocopy.exe (version 026), but the command "robocopy" always triggers version 010, and I would like to determine where this command points in order to correct this mistake.
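
    On a stock Windows XP cmd.exe (no where.exe out of the box), the usual trick is to let the shell's own path-search substitution do the work; a hedged one-liner, typed at the prompt (double the percent signs inside a .bat file):

        for %i in (robocopy.exe) do @echo %~$PATH:i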

    Read the article

  • Tail the filename, not the file

    - by Craig Walker
    In UNIX (OS X BSD to be precise), I have a "tail -f" command on a log file. From time to time I want to delete this log file so I can more easily review it in my text editor. I delete the file, and then my program recreates it after new activity. However, my tail command (and anything else that was watching the old log file) doesn't update; it's still watching the old, deleted log file. I think I understand why this is (file names simply being pointers to blocks of file data). I'd like to know how I can work around this. Ideally, my tail command (and anything else I point to the file) would be able to read the data from the new file when the file name has been deleted and recreated. How would I do this?
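
    The usual fix is to follow the file by name rather than by file descriptor; on both BSD/OS X tail and GNU tail that is the -F flag, which re-opens the path when the file is deleted and recreated:

        tail -F /var/log/myapp.log    # path is a placeholder; -F keeps following across delete/recreate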

    Read the article

  • File doesn't exist in Linux although locate finds it in Terminal

    - by Mazen Ayman
    I'm a bit new to the unix/linux environment, but I have a small problem. I'm using "locate" to find the path of a file I need; it gives me a path, but the file doesn't exist at that path, like this:

        locate test1.txt
        /home/user/test files/text1.txt
        /home/user/test1.txt~

    The "test files" directory is where I was keeping the file. I copied it to the home directory once but then deleted it, and I have no idea why locate keeps telling me there is still a tmp file for it. It's worth mentioning that I used the command:

        locate test1.txt~ | xargs -n1 rm

    to remove that tmp file, but maybe that is what caused the problem. I tried showing hidden files and checking for temp files, but didn't find it either. Any clue what happened?
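
    A hedged note: locate only searches a periodically rebuilt database, so it will happily list files deleted since the last rebuild. Refreshing the database should clear the stale entry; which command does that depends on the system:

        sudo updatedb                          # most Linux systems
        sudo /usr/libexec/locate.updatedb      # OS X and some BSDs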

    Read the article

  • Incremental backup services with change only charges?

    - by wowowewah
    I'm looking for online backup services that provide incremental, change-only backups. I want to transfer as little data as possible and would like to find a service that provides full backups every week along with incremental backups every day. Are there any specialist companies that deal with this, or do I just use standard backup ones? Any recommendation appreciated. To expand on this: I'm looking for software/services which work on Unix. I guess Linux is fine as well, as FreeBSD's Linux compatibility layer should run it. Oh, and command line would be ideal, not requiring the use of X Window. Thanks.
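
    If rolling your own over rsync is acceptable, the --link-dest pattern gives exactly this behaviour: each day looks like a full backup, but unchanged files are hard links against the previous snapshot, so only changes consume bandwidth and space. A hedged sketch, with the host and paths as placeholders:

        #!/bin/sh
        # Daily snapshot; unchanged files are hard-linked against the previous one
        TODAY=$(date +%F)
        rsync -a --delete --link-dest=/backups/latest /data/ user@backuphost:/backups/$TODAY/
        ssh user@backuphost "ln -sfn /backups/$TODAY /backups/latest"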

    Read the article

  • Disk space profiling in Unix

    - by user1677770
    I'm looking for a tool to summarize how disk space is being used on very large partitions. Our file system is around 950TB, mostly broken up into 20TB partitions. There are some really nice graphical tools for visualising these file spaces:

        http://www.disksavvy.com/disksavvy_screenshots.html
        http://methylblue.com/filelight/

    But I'm really not sure how well they will scale. Does anybody have any experience of these tools and can make any recommendations? Even something that parses and summarises a really big du output would be a good start.
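
    For a purely text-based first pass that scales to partitions of any size, a sorted du summary per directory level is hard to beat (GNU du assumed for --max-depth; /partition stands in for the mount point; ncdu is a curses front end over the same idea):

        du -x --max-depth=2 /partition 2>/dev/null | sort -rn | head -40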

    Read the article

  • Run script when shutting down ubuntu before the logged in user is logged out

    - by Travis
    I'm writing a script to back up some local directories on a unix machine (Ubuntu) to a samba drive. The script works fine and I've got it running at shutdown and restart using the method described at http://en.kioskea.net/faq/3348-ubuntu-executing-a-script-at-startup-and-shutdown It works by placing the backup script into the /etc/rc6.d and /etc/rc0.d directories. However, there is a problem. Looking at the script's logfile, it seems to be run after the user is logged out. We are using LDAP authentication, and once the user has logged out the system cannot back up to their samba share. Does anyone know of any way to run the script before the user is logged out?

    Read the article

  • Linux Security/Sysadmin Courses in London?

    - by mister k
    Hi, my employer has offered to send me on a couple of training courses and I'm just looking for some recommendations. I'm mainly looking to improve my security and general sysadmin skills. I would like to do something focused on UNIX as I mainly work with Linux boxes (but also a couple of FreeBSD boxes). I don't want to do a study-from-home course, so I would need to find somewhere based in London. It would be great to hear from anyone who has some experience with this kind of course. The courses I've found so far are:

        www.learningtree.co.uk/courses/uk433.htm
        www.city.ac.uk/cae/cfa/computing/systems_it/linux.html
        www.city.ac.uk/cae/cfa/computing/systems_it/unix_tools_ss.html

    I'm not sure the City University courses are advanced enough as I already have experience... Thanks!

    Read the article

  • Where to download unix command?

    - by person
    I tried to run mvdir earlier and it said "command not found". I then ran a search for it and it still wasn't found. Is there a place I can download the script for the command, and is there anything I should know post-download to get it to work?
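
    A hedged note: mvdir comes from older commercial Unixes and is simply not present on most modern Linux and BSD systems; plain mv moves directories, so there is usually nothing to download:

        mv old_directory_name /new/location/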

    Read the article
