Search Results

Search found 134 results on 6 pages for 'piping'.

Page 3/6 | < Previous Page | 1 2 3 4 5 6  | Next Page >

  • Lightweight HTTP application/server for static content

    - by PartlyCloudy
    Hi, I am in need of a scalable and performant HTTP application/server that will be used for static file serving/uploading, so I only need support for GET and PUT operations. However, there are a few extra features that I need:
    Custom authentication: I need to check credentials against a database for each request, so I must be able to integrate proprietary database interaction.
    Support for signed access keys: access to resources via PUT should be signed with a key, like http://uri/?key=foo. The key contains information about the request, like md5(user + path + secret), which allows me to block unwanted requests; the application/server should let me check for this.
    Performance: I'd like to avoid piping content as much as possible; otherwise the whole application could be implemented in a few lines of Perl/etc. as CGI.
    Perlbal (in webserver mode) looks nice, but the single-threaded model does not fit my database lookup, and it also does not support query strings. Lighttpd/Nginx/… have modules for these tasks, but it is not feasible to put everything together without ending up writing my own extensions/modules. So how would you solve this? Are there other lightweight webservers available for this? Should I implement an application inside a webserver (i.e. CGI)? How can I avoid/speed up piping content between the webserver and my application? Thanks in advance!
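
    The signing scheme described above can be sketched in shell; REQ_USER, RESOURCE_PATH and SECRET are hypothetical placeholders, and md5sum is assumed to be the GNU/Linux tool:

        # build the request signature the question describes: md5(user + path + secret)
        key=$(printf '%s%s%s' "$REQ_USER" "$RESOURCE_PATH" "$SECRET" | md5sum | cut -d' ' -f1)
        # the server recomputes the same hash for each incoming request and rejects it on mismatch
        echo "http://uri${RESOURCE_PATH}?key=${key}"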

    Read the article

  • how to check open ports of bunch of website at once with nmap/linux?

    - by austin powers
    Hi, I want to use nmap in such a way that I can check a bunch of servers at once, to see whether a particular port is open on each of them. Right now I have 10 IP addresses, but in future this could be more. I know the very basic Linux commands like cat/nano and piping, but I don't know how to feed nmap the list of my servers so it checks them one by one and returns the results.
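
    nmap can read its targets from a file with -iL, so no piping loop is needed; a minimal sketch, assuming a hypothetical servers.txt with one host or IP per line:

        # scan port 22 on every host listed in servers.txt, reporting only hosts where it is open
        nmap -p 22 -iL servers.txt --open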

    Read the article

  • Multi level or progressive or incremental file search

    - by iraSenthil
    I am looking for a GUI tool for Windows where I can run a search, pass the result to the next search, and continue. I know I can do this on the command line by piping one search result into another, but I am looking for a GUI tool. Here is a sample search: I would like to find all files that have the extension ".java". From that result, find all files that contain a specific word. From that result, select a few files and search only those files for another keyword.
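
    For reference, the command-line equivalent of the chained search above looks like this (WORD1 and WORD2 are placeholders for the keywords):

        # stage 1: all .java files; stage 2: those containing WORD1; stage 3: of those, the ones also containing WORD2
        # (use find -print0 and xargs -0 instead if file names contain spaces)
        find . -name '*.java' | xargs grep -l 'WORD1' | xargs grep -l 'WORD2'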

    Read the article

  • Why some user functions don't get recognised by bash?

    - by strapakowsky
    I can define a function like:
        myfunction () { ls -R "$1" ; }
    and then myfunction . just works. But if I do
        echo "myfunction ." | sh
        echo "myfunction ." | bash
    the messages are:
        sh: myfunction: not found
        bash: line 1: myfunction: command not found
    Why? And how can I call a function that comes from a string, if not by piping it to sh or bash? I know there is this command source, but I am confused about when I should use source and when sh or bash. Also, I cannot pipe through source. To add to the confusion, there is this command . that seems to have nothing to do with the "." that means "current directory".
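
    Piping into sh or bash starts a brand-new shell process that knows nothing about functions defined in the current shell, whereas source (and its synonym .) runs commands in the current shell. One bash-specific way to make the piped example work is to export the function; a sketch:

        myfunction () { ls -R "$1" ; }
        export -f myfunction          # bash only: pass the function to child bash processes via the environment
        echo "myfunction ." | bash    # the child bash now finds the function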

    Read the article

  • Interviewing someone for general unix skills

    - by Christophe Vanfleteren
    How would you test a developer who claims to have *nix shell experience? (Just to be clear, we don't want to test whether someone can develop on *nix, only that they know their way around the command line.) I was thinking about making them solve a problem of getting information out of log files, which would involve some basics like cat, grep, cut, ... combined with piping. What other basic knowledge would you ask for? Once again, this isn't for interviewing someone who will develop for *nix systems, and also not for *nix system admins, but just for regular developers who sometimes need to do some work on a *nix system.
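
    A hedged example of the kind of log-file task described above (the file name app.log and the field positions are assumptions, not from the question):

        # show the most frequent ERROR messages in a log, most common first
        grep 'ERROR' app.log | cut -d' ' -f4- | sort | uniq -c | sort -rn | head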

    Read the article

  • Attempting to GREP details of a Java error

    - by BOMEz
    I'm running Ubuntu 11 and I'm having some issues with grep. I have a shell script (see below) which essentially checks whether a certain Java program of mine is running and, if not, runs it. That part works out great! If my Java application throws any kind of exception, however, I would like to capture that information and email it to myself. How can I go about checking whether the call to java -jar /bin/MyApp.jar fails? I tried piping it to grep, but that doesn't seem to work. Below is the full script that I've written:

        #Check if MyApp.jar is running, if not run it.
        if [ $(ps aux | grep 'java' | grep -v grep | wc -l | tr -s "\n") -eq 0 ]
        then
            echo "PacketCapture Starting...\n"
            java -jar /bin/MyApp.jar
            echo "PacketCapture Started.\n"
        else
            echo "PacketCapture already running.\n"
        fi
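
    One reason the grep approach fails is that Java writes exception stack traces to stderr, not stdout, so a pipe on stdout never sees them. A minimal sketch of capturing stderr and mailing it on failure; the log path, address and the availability of a mail command are assumptions:

        # run the program keeping stderr; if it exits non-zero, mail the captured output
        if ! java -jar /bin/MyApp.jar 2> /tmp/myapp-err.log
        then
            mail -s "MyApp failed" you@example.com < /tmp/myapp-err.log
        fi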

    Read the article

  • pipe multiple files (gz) into a C program

    - by monkeyking
    I've written a C program that works when I pipe data into it via stdin, like:
        gunzip -c IN.gz | ./a.out
    If I want to run my program on a list of files, I can do something like:
        for i in `cat list.txt`
        do
            gunzip -c $i | ./a.out
        done
    But this will start my program once per file. I'm interested in piping all the files into the same process run, like doing:
        for i in `cat list.txt`
        do
            gunzip -c $i >> tmp
        done
        cat tmp | ./a.out
    Thanks.
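
    The temporary file can be skipped by letting one pipeline decompress everything; a sketch, assuming list.txt holds one file name per line:

        # expand every name in list.txt as arguments to gunzip -c and feed one continuous stream to a single ./a.out
        xargs gunzip -c < list.txt | ./a.out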

    Read the article

  • How do I check if my program has data piped into it?

    - by monkeyking
    I'm writing a program that should read input via stdin, so I have the following construct:
        FILE *fp = stdin;
    But this just hangs if the user hasn't piped anything into the program. How can I check whether the user is actually piping data into my program, like:
        gunzip -c file.gz | ./a.out    #should work
        ./a.out                        #should exit program with nice msg.
    thanks
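
    In C the usual test is isatty(fileno(stdin)) from unistd.h: it returns non-zero when stdin is an interactive terminal rather than a pipe or file. The same check done from a wrapper shell script, as a hedged sketch:

        if [ -t 0 ]; then
            echo "no data piped in - usage: gunzip -c file.gz | ./a.out" >&2
            exit 1
        fi
        exec ./a.out    # stdin is a pipe or a file, so hand it straight to the program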

    Read the article

  • Using terminal to record/save a data stream

    - by jonhurlock
    I want to be able to save a data stream which I am retrieving with the curl command. I have tried using the cat command and piping the curl command into it, but I'm doing it wrong. The code I'm currently using is:
        cat > file.txt | curl http://datastream.com/data
    Any help would be appreciated.
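
    curl already writes the response to stdout, so it can be redirected or saved directly; two hedged variants:

        # write the response straight to a file
        curl http://datastream.com/data -o file.txt
        # or save it while still watching it scroll past
        curl http://datastream.com/data | tee file.txt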

    Read the article

  • [MS-DOS] Read command-line parameters to .bat from file

    - by John
    I have a build.bat file which uses %1 internally... so you might call:
        build 1.23
    I wanted it to read the parameter from a separate file, so I tried putting "1.23" in version.txt and doing:
        build < version.txt
    But it doesn't work. Isn't this how piping works? Is what I want possible and if so how?

    Read the article

  • How can I change the color of build output in a shell window?

    - by Tim Gradwell
    I have a build process which runs from a batch file. It produces a large volume of text. Sometimes it prints the word "Error" or "Warning" followed by a message. The errors and warnings are getting lost among a sea of text. Can I highlight those words in a different color, maybe in a dos window, or a cygwin shell window, possibly by piping them through some string manipulation program before posting them to the screen? Thanks.
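
    One common trick is to run the batch file from a shell (e.g. cygwin) and pipe it through grep, which can colour the interesting words while letting every other line through; build.bat stands in for the actual batch file name, which the question does not give:

        # 'Error|Warning|$' matches every line, but only Error/Warning get highlighted
        cmd /c build.bat 2>&1 | grep --color=always -E 'Error|Warning|$'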

    Read the article

  • disparity between `top`'s given CPU % and process CPU usage total

    - by intuited
    I've noticed that there are sometimes (large) differences between the reported total CPU usage and a summation of the per-process CPU utilization given by apps like top and wmtop. As an example: I recently ran a git filter-branch --index-filter on a fairly large repo, with the index-filter command piping git ls-files through a grep filter and into xargs git rm --cached. This took a few minutes to run; while it was going I noticed that both wmtop and top were displaying a high (above 50% on my 2-core machine) total CPU usage, but that neither showed any individual processes which were using a significant amount of CPU time. Are some processes not shown in the process list? What sorts of processes are these, and is there a way to find out how much CPU time they are using?
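
    Much of that "missing" CPU time is typically spent in kernel threads (shown by ps with names in square brackets) or in many short-lived child processes, each too brief for top's sampling to catch - a filter-branch spawns a fresh pipeline per commit. A hedged way to look closer:

        # list the heaviest CPU consumers with threads broken out; kernel threads appear as [bracketed] names
        ps -eLo pcpu,pid,lwp,comm --sort=-pcpu | head -20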

    Read the article

  • v4l - capture and watch at the same time

    - by John Barrett
    Capturing v4l and line-in audio using mencoder works very well, but I would like to record real-time gameplay video from consoles plugged into the video card. I've used xawtv for this (works quite well, can preview and record in real time), but when I enable any deinterlacing or aspect ratio options the video fails to record; I have to record raw and re-encode the video with the appropriate filters later to get something workable. Other things I have tried: tvtime with xvidcap and jack audio capture - xvidcap drops frames and muxing the audio is impossible, as it goes out of sync (I have not found muxer options that force a correct frame rate); mencoder capture to file, then attempting to pipe the tail of the file to mplayer - mencoder works great, but piping the file is far too heavy to attempt gameplay. Soooo, v4l capture and preview simultaneously - recommendations?
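
    One approach worth sketching is to capture with ffmpeg (rather than mencoder), send the encoded stream to stdout, and split it with tee so one copy goes to disk and another to a player. The device path, audio input and codec settings below are assumptions, not from the question:

        ffmpeg -f v4l2 -i /dev/video0 -f alsa -i hw:0 \
               -c:v libx264 -preset ultrafast -c:a aac \
               -f matroska - | tee capture.mkv | mplayer -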

    Read the article

  • How can I use scp without providing a password

    - by Tim
    I asked a question here before. My question was that I tried to give my password to scp by piping:
        echo mypassword | scp [email protected]:project/* ~/project/
    However it still asks me to input the password manually. How should I supply mypassword to scp on the command line? I still don't understand one of the replies: what is an input stream from a TTY, for example? What are the common kinds of input stream? How do I know what type of input stream a command's stdin is - for example, that of ssh/scp?
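
    scp reads the password from the controlling terminal (the TTY), not from stdin, which is why the echo pipe is ignored. The usual way to avoid typing a password at all is public-key authentication; a sketch with a placeholder host name:

        ssh-keygen -t rsa                            # create a key pair; leave the passphrase empty for unattended use
        ssh-copy-id user@remotehost                  # install the public key on the remote machine (asks for the password once)
        scp user@remotehost:project/* ~/project/     # now authenticates with the key, no password prompt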

    Read the article

  • Netcat server output with multiple greps

    - by Sridhar-Sarnobat
    I'm trying to send some data from my web browser to a txt file on another computer. This works fine:
        echo 'Done' | nc -l -k -p 8080 | grep "GET" >> request_data.txt
    Now I want to do some further processing before writing the HTTP request data to my txt file (involving regex manipulation). But if I try to do something like this, nothing is written to the file:
        echo 'Done' | nc -l -k -p 8080 | grep "GET" | grep "HTTP" >> request_data.txt
    (For simplicity of explanation I've used another grep instead of, say, awk.)
    Why does the second grep not get any data from the output of the first grep? I'm guessing piping with netcat works differently to what I've assumed to get this far. How do I perform a second grep before writing to my txt file? My debugging so far suggests: it is nothing to do with stderr vs stdout; parentheses don't help.
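
    The likely culprit is output buffering rather than netcat: when grep's stdout is a pipe instead of a terminal, it buffers several kilobytes before writing, so the second grep (and the file) see nothing until the buffer fills. GNU grep can be told to flush per line:

        echo 'Done' | nc -l -k -p 8080 | grep --line-buffered "GET" | grep "HTTP" >> request_data.txt
        # stdbuf -oL <command> is an alternative for filters that have no buffering flag of their own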

    Read the article

  • Is there a serious issue with setting the SUID bit on tcpdump?

    - by Dean
    I'm running tcpdump on a remote machine, and piping the output to Wireshark on my local machine over SSH. In order to do this, I had to set the SUID bit on tcpdump. For background, the remote machine is an Amazon EC2 running "Amazon Linux AMI 2012.09". On this image, there is no root password, and it is not possible to log in as root. You can't use sudo without a TTY, and therefore you have to set the SUID. What are the practical risks of setting this bit on tcpdump? Is there any need to be paranoid? Should I unset it whenever I'm not capturing?
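
    A common alternative to the SUID bit is to grant the binary only the capabilities packet capture needs, which avoids handing it full root; the path below is the usual one on Linux but is an assumption for this AMI:

        # allow tcpdump to open raw sockets without being SUID root
        sudo setcap cap_net_raw,cap_net_admin+eip /usr/sbin/tcpdump

    Either way, any local user who can execute the binary can then sniff all traffic; the capability approach mainly removes the full-root escalation surface if tcpdump itself has a vulnerability.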

    Read the article

  • Mapping a Piped Shell Command in Vim

    - by michaelmichael
    In a previous question I asked about mapping evaluated code to a new window in MacVim. I got a great solution, but it presented another question: how can I map a key command in my .vimrc that involves piping output in the shell? As a simple example, let's say I wanted to pipe the results of ls -a to a new MacVim window. From the Vim command line I can enter !ls -a | mvim -, and the results will appear in a new window. Great! Now, I add that to my .vimrc:
        nmap <Leader>r :w !ls | mvim<CR>
    Vim now throws an error every time I try to source my .vimrc, which reads as follows:
        E492: Not an editor command: mvim<CR>
    Any ideas on how to overcome this?
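
    Inside a mapping definition, a bare | ends the :nmap command itself, so Vim tries to run mvim<CR> as a separate Ex command - which is exactly the E492 above. Writing the pipe as <Bar> (or escaping it as \|) keeps it inside the mapping. A sketch that appends a corrected line, based on the command the question says works interactively:

        # append the mapping to the vimrc with the pipe written as <Bar>
        printf '%s\n' 'nmap <Leader>r :!ls -a <Bar> mvim -<CR>' >> ~/.vimrc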

    Read the article

  • Colour output piped to less

    - by mmacaulay
    Operating system: Mac OS 10.6.2. I'd like to be able to see colour output when piping certain commands through less. Two examples: I've got ls aliased to ls --color=auto, so I'd like to be able to see colour when I do this:
        ls -l | less
    I've also got the color extension turned on in Mercurial, so I'd like to see colour output from:
        hg diff | less
        hg st | less
    After some googling, it seems like some versions of less support either -r or -R to make this work, but no dice for me. I can't see anything in the man page that looks like what I need. (-r or -R SEEM to be the right options, but again, they don't seem to work.)
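
    less -R is half of the answer; the other half is that with --color=auto the programs themselves switch colour off as soon as their output is a pipe, so colour has to be forced. A hedged sketch (the Mercurial flag comes from the color extension and may vary by version):

        ls --color=always -l | less -R
        hg diff --color=always | less -R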

    Read the article

  • bash disable line wrap without truncation

    - by Eric Huang
    I am using a template-heavy library in C++ and need to understand the template errors. Reading line-wrapped template errors is a serious pain. Is there a way to disable line wrapping in bash without also truncating the output? Additionally, is there a way to do horizontal scrolling on the output? I have seen this answer, how to make bash not to wrap output?, but the output is truncated. The solution doesn't have to be bash targeted; if there is a method for this using another shell, tmux, piping make output to another program, compiling from within vim, etc., I'll use it. (Except for copy-pasting into gedit.)
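
    A pager does both things at once: less -S chops long lines instead of wrapping them and lets you scroll sideways with the arrow keys. The build command below (make) is an assumption:

        # -S chops long lines; use the left/right arrow keys inside less to scroll horizontally
        make 2>&1 | less -S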

    Read the article

  • Power cut during Ubuntu upgrade to 10.04 - boots to command line, apt-get and dpkg do not work.

    - by Macha
    I was upgrading Ubuntu to 10.04 when a trip switch tripped, cutting power to the computer. When it was restarted, it booted into a command-line prompt. Google tells me to try:
        sudo dpkg --configure -a
    This gets me a lot of output that ends with a list of packages. I can't tell you what the output is, as piping the output to more/less does not work (it all just scrolls by and moves to the next prompt), and redirecting it to a file just results in an empty file. Google also suggested:
        sudo apt-get install -f
    This also didn't work. Is a fresh install the only solution at this point?
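
    The scrolling and the empty file both suggest dpkg is writing its messages to stderr, which neither the pipe nor the > redirection captures by default; merging the two streams makes the output pageable and saveable:

        sudo dpkg --configure -a 2>&1 | less
        sudo dpkg --configure -a > dpkg-output.txt 2>&1     # or keep a copy to read or post later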

    Read the article

  • How can I sort du -h output by size

    - by Tom Feiner
    I need to get a list of human-readable du output. However, du does not have a "sort by size" option, and piping to "sort" doesn't work with the human-readable flag. For example, running:
        du | sort -n -r
    outputs disk usage sorted by size (descending):
        65108   .
        61508   ./dir3
        2056    ./dir4
        1032    ./dir1
        508     ./dir2
    However, running it with the human-readable flag does not sort properly:
        du -h | sort -n -r
        508K    ./dir2
        64M     .
        61M     ./dir3
        2.1M    ./dir4
        1.1M    ./dir1
    Does anyone know of a way to sort du -h by size?
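
    Newer GNU coreutils give sort a human-numeric mode made for exactly this; a sketch:

        # sort's -h understands the K/M/G suffixes that du -h emits
        du -h | sort -rh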

    Read the article

  • Modern open source NIDS/HIDS and consoles?

    - by MattC
    Years back we set up an IDS solution by placing a tap in front of our exterior firewall, piping all the traffic on our DS1 through an IDS box and then sending the results off to a logging server running ACiD. This was around 2005-ish. I've been asked to revamp the solution and expand on it and looking around, I see that the last release of ACiD was from 2003 and I can't seem to find anything else that seems even remotely up-to-date. While these things may be feature complete, I worry about library conflicts, etc. Can anyone give me suggestions for a Linux/OpenBSD based solution using somewhat modern tools? Just to be clear, I know that Snort is still actively developed. I guess I'm more in the market for a modern open-source web console to consolidate the data. Of course if people have great experiences with IDS' other than Snort I'm happy to hear about it.

    Read the article
