Search Results

Search found 5456 results on 219 pages for 'named pipes'.


  • Difference between piping a file to sh and calling a shell file

    - by Peter Coulton
    This is what I was trying to do: $ wget -qO- www.example.com/script.sh | sh which quietly downloads the script and prints it to stdout, which is then piped to sh. Unfortunately this doesn't quite work, failing to wait for user input at various points, as well as hitting a few syntax errors. This is what actually works: $ wget -qO script www.example.com/script.sh && chmod +x ./script && ./script But what's the difference? I'm thinking maybe piping the file doesn't execute the file, but rather executes each line individually, but I'm new to this kind of thing so I don't know.
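
    The failure mode is that sh reads the piped-in script from its stdin, so any read inside the script competes with the script text itself for the same stream. A minimal sketch of this in Python, driving sh through subprocess (the two-line script is invented for illustration):

    ```python
    import subprocess

    # The script expects to `read` from the keyboard...
    script = 'read n\necho "got: $n"\n'

    # ...but when the script is piped in, sh's stdin IS the pipe, so `read`
    # consumes the next line of the script itself instead of user input.
    piped = subprocess.run(["sh"], input=script, capture_output=True, text=True)
    print(repr(piped.stdout))  # '': the echo line was eaten by read
    ```

    Running the same script from a file leaves stdin free for read, which is why the downloaded-then-executed variant behaves.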

    Read the article

  • Email pipe to php script working only sometimes

    - by Rixius
    I have a PHP pipe script that takes an attached *.csv from an email and saves and parses it. When the email is sent from where it is supposed to be coming from, it silently errors; however, when I take that same email and resend it from my address, it goes through just fine. Is there any simple reason it could be doing this?

    Read the article

  • how to feed a file to telnet

    - by knittl
    Hello community, to understand HTTP and headers I played around with telnet to send requests. To avoid typing everything again and again, I thought I'd write a small text file with all the commands I need. My file is as simple as follows: GET /somefile.php HTTP/1.1 Host: localhost I then try to feed it to telnet with I/O redirection: $ telnet localhost 80 < telnet.txt but all the output I get is Trying ::1... Connected to localhost. Escape character is '^]'. Connection closed by foreign host. What am I doing wrong?
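
    Two things typically bite here: the request needs a blank line after the headers to terminate them, and telnet exits as soon as its redirected stdin hits EOF, usually before the server has replied. A hedged sketch of the same request done with a raw socket in Python, assuming a server on localhost:80:

    ```python
    import socket

    # Send the request bytes, then keep the connection open and read until
    # the server closes it (the part telnet skips once stdin is exhausted).
    # Note the blank line (\r\n\r\n) ending the headers.
    request = b"GET /somefile.php HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n"

    with socket.create_connection(("localhost", 80)) as s:
        s.sendall(request)
        while True:
            chunk = s.recv(4096)
            if not chunk:
                break
            print(chunk.decode(errors="replace"), end="")
    ```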

    Read the article

  • Writing/Reading struct w/ dynamic array through pipe in C

    - by anrui
    I have a struct with a dynamic array inside of it: typedef struct mystruct{ int count; int *arr; } mystruct_t; and I want to pass this struct down a pipe in C and around a ring of processes. When I alter the value of count in each process, it is changed correctly. My problem is with the dynamic array. I am allocating the array as such: mystruct_t x; x.arr = malloc( howManyItemsDoINeedToStore * sizeof( int ) ); Each process should read from the pipe, do something to that array, and then write it to another pipe. The ring is set up correctly; there's no problem there. My problem is that all of the processes, except the first one, are not getting a correct copy of the array. I initialize all of the values to, say, 10 in the first process; however, they all show up as 0 in the subsequent ones. for( j = 0; j < howManyItemsDoINeedToStore; j++ ){ x.arr[j] = 10; } Initially: 10 10 10 10 10 After Proc 1: 9 10 10 10 15 After Proc 2: 0 0 0 0 0 After Proc 3: 0 0 0 0 0 After Proc 4: 0 0 0 0 0 After Proc 5: 0 0 0 0 0 After Proc 1: 9 10 10 10 15 After Proc 2: 0 0 0 0 0 After Proc 3: 0 0 0 0 0 After Proc 4: 0 0 0 0 0 After Proc 5: 0 0 0 0 0 Now, if I alter my code to, say, typedef struct mystruct{ int count; int arr[10]; } mystruct_t; everything is passed correctly down the pipe, no problem. I am using read and write in C: write( STDOUT_FILENO, &x, sizeof( mystruct_t ) ); read( STDIN_FILENO, &x, sizeof( mystruct_t ) ); Any help would be appreciated. Thanks in advance!
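
    Writing &x copies count and the pointer arr itself, not the ints arr points to, and a raw address is meaningless in the receiving process; that is why the inline-array version works and the pointer version doesn't. The usual fix is to serialize: send the count, then the count ints. The same length-prefix idea sketched in Python over an anonymous pipe:

    ```python
    import os, struct

    r, w = os.pipe()
    values = [10, 10, 10, 10, 10]

    # Header: how many ints follow. Payload: the ints themselves.
    os.write(w, struct.pack("i", len(values)))
    os.write(w, struct.pack(f"{len(values)}i", *values))

    # Receiver: read the count first, then exactly that many ints.
    count = struct.unpack("i", os.read(r, 4))[0]
    data = struct.unpack(f"{count}i", os.read(r, count * 4))
    print(count, data)  # 5 (10, 10, 10, 10, 10)
    ```

    In C this corresponds to two writes, write( fd, &x.count, sizeof( int ) ) followed by write( fd, x.arr, x.count * sizeof( int ) ), and two mirrored reads on the other side.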

    Read the article

  • Pipe multiple files (gz) into a C program

    - by monkeyking
    I've written a C program that works when I pipe data into it via stdin, like gunzip -c IN.gz | ./a.out If I want to run my program on a list of files I can do something like for i in `cat list.txt`; do gunzip -c $i | ./a.out; done But this will start my program 'number of files' times. I'm interested in piping all the files into the same process run, like doing for i in `cat list.txt`; do gunzip -c $i >> tmp; done cat tmp | ./a.out Thanks.
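
    The temp file can be skipped by piping the whole loop's output at once: for i in `cat list.txt`; do gunzip -c $i; done | ./a.out The same idea sketched in Python, decompressing every listed file into a single child process (list.txt and ./a.out are taken from the question):

    ```python
    import gzip, subprocess

    # One ./a.out process; each decompressed file is streamed into its
    # stdin back to back, so the program sees one continuous input.
    with open("list.txt") as lst, \
         subprocess.Popen(["./a.out"], stdin=subprocess.PIPE) as proc:
        for name in lst:
            with gzip.open(name.strip(), "rb") as f:
                for chunk in iter(lambda: f.read(65536), b""):
                    proc.stdin.write(chunk)
        proc.stdin.close()
        proc.wait()
    ```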

    Read the article

  • How do I check if my program has data piped into it?

    - by monkeyking
    I'm writing a program that should read input via stdin, so I have the following construct: FILE *fp = stdin; But this just hangs if the user hasn't piped anything into the program. How can I check whether the user is actually piping data into my program, like gunzip -c file.gz | ./a.out # should work ./a.out # should exit the program with a nice msg. Thanks
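
    The standard answer is isatty(): if stdin is a terminal, nothing is being piped in. In C that is isatty(fileno(stdin)) from unistd.h; the same check sketched in Python:

    ```python
    import sys

    # If stdin is a terminal, no data was piped in: exit with a message
    # instead of hanging on a read that will never complete.
    if sys.stdin.isatty():
        sys.exit("no input piped in; try: gunzip -c file.gz | ./a.out")

    for line in sys.stdin:
        sys.stdout.write(line)  # stand-in for the real processing
    ```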

    Read the article

  • QLocalSocket and QLocalServer in browser plugins

    - by kambamsu
    Hi, I have a simple doubt: does the IPC mechanism in Qt work when we use it for developing browser plugins? The reason I ask is that I can easily get QLocalSocket and QLocalServer communication to work in a Qt application, but when I write a similar piece of code in a browser plugin DLL, I see that the server does not accept a new connection at all. This is what I do in the server: server = new QLocalServer(this); if( !server->listen("myServer")) { writeFile("Listen failed"); } connect(server, SIGNAL(newConnection()), this, SLOT(handleConn()),Qt::QueuedConnection); and this is what I do in the client: client = new QLocalSocket(this); client->abort(); QObject::connect(client,SIGNAL(connected()),this,SLOT(connClient()),Qt::QueuedConnection); client->connectToServer("myServer"); After I call connectToServer, my client emits the connected() signal and the connClient() slot is called. But on the server side, no signal is emitted. It doesn't seem to be receiving any connection at all. Any help would be appreciated. Thanks

    Read the article

  • C check before writing to closed pipe

    - by Gary
    Is there an easy way to check whether a pipe is closed before writing to it in C? I have a child and parent process, and the parent has a pipe to write to the child. However, if the child closes its end of the pipe and the parent then tries to write, I get a broken pipe error. So how can I check that I can write to the pipe, so I can handle it as an error if I can't? Thanks!
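
    There is no race-free way to test first: the child can close its end between the check and the write. The conventional pattern is to ignore SIGPIPE (signal(SIGPIPE, SIG_IGN) in C) and treat an EPIPE error from write() as "the reader is gone". Python ignores SIGPIPE at startup, so the situation can be sketched directly:

    ```python
    import os

    # Write end of a pipe whose read end has been closed: with SIGPIPE
    # ignored, write() fails with EPIPE instead of killing the process.
    r, w = os.pipe()
    os.close(r)  # simulate the child closing its end

    try:
        os.write(w, b"data")
    except BrokenPipeError:
        print("reader closed the pipe; handle it as an error")
    ```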

    Read the article

  • Linux piping (convert -> pdf2ps -> lp)

    - by Bor
    OK, so I can print a PDF by doing: pdf2ps file.pdf - | lp -s But now I want to use convert to merge several PDF files, which I can do with: convert file1.pdf file2.pdf merged.pdf This merges file1.pdf and file2.pdf into merged.pdf; the target can be replaced with '-'. Question: how could I pipe convert into pdf2ps and then into lp, though?
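
    With convert writing the merged PDF to stdout as pdf:- (the explicit format prefix replaces the missing file extension) and pdf2ps taking - for both input and output, the chain would plausibly be convert file1.pdf file2.pdf pdf:- | pdf2ps - - | lp -s. The same plumbing wired up in Python, as a sketch (the pdf:- and - - conventions are assumptions about these tools, not taken from the post):

    ```python
    import subprocess

    # convert -> pdf2ps -> lp, each stdout feeding the next stdin.
    convert = subprocess.Popen(
        ["convert", "file1.pdf", "file2.pdf", "pdf:-"],
        stdout=subprocess.PIPE)
    pdf2ps = subprocess.Popen(
        ["pdf2ps", "-", "-"],
        stdin=convert.stdout, stdout=subprocess.PIPE)
    convert.stdout.close()  # convert gets EPIPE if pdf2ps exits early
    lp = subprocess.Popen(["lp", "-s"], stdin=pdf2ps.stdout)
    pdf2ps.stdout.close()
    lp.wait()
    ```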

    Read the article

  • Is there any tutorial on connecting a .NET 4 pipeline with a pipeline from some C/C++ program?

    - by Ole Jak
    Is there any tutorial on connecting a .NET 4 pipeline with a pipeline from some C/C++ program? For example, how do I get data from VLC's pipeline output? I mean, the VLC docs say the command-line args can take the name of a pipe, like YUV video output --yuv-file=<string> How do I give such a name to a pipe created in my .NET program, so as to be able to pass it to other programs or receive data from them?
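
    Named pipes are identified purely by a path-like name: on Windows (where .NET's System.IO.Pipes classes live) the full name takes the form \\.\pipe\<name>, and that is the string an external program would be given; on Unix a FIFO is just a filesystem path made with mkfifo. A rough Unix-side sketch in Python (the path and read size are invented for illustration):

    ```python
    import os

    # Create a named pipe at a plain filesystem path; any program that
    # can write to a file path can then write into the pipe.
    path = "/tmp/yuv_pipe"  # hypothetical; this is what you'd hand to VLC
    if not os.path.exists(path):
        os.mkfifo(path)

    # Reader side: open() blocks until a writer opens the other end.
    with open(path, "rb") as pipe:
        data = pipe.read(4096)
        print(len(data), "bytes received")
    ```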

    Read the article

  • Supporting Piping (A Useful Hello World)

    - by blastthisinferno
    I am trying to write a collection of simple C++ programs that follow the basic Unix philosophy by: Make each program do one thing well. Expect the output of every program to become the input to another, as yet unknown, program. I'm having an issue trying to get the output of one to be the input of another, including the output of one instance becoming the input of a separate instance of itself. Very briefly, I have a program add which takes arguments and spits out the summation. I want to be able to pipe the output to another add instance. ./add 1 2 | ./add 3 4 That should yield 6 but currently yields 10. I've encountered two problems: The cin waits for user input from the console. I don't want this, and haven't been able to find a simple example showing the use of the standard input stream without querying the user in the console. If someone knows of an example, please let me know. I can't figure out how to use standard input while supporting piping. Currently, it appears it does not work. If I issue the command ./add 1 2 | ./add 3 4 it results in 7. The relevant code is below: add.cpp snippet // ... COMMAND LINE PROCESSING ... std::vector<double> numbers = multi.getValue(); // using TCLAP for command line parsing if (numbers.size() > 0) { double sum = numbers[0]; double arg; for (int i=1; i < numbers.size(); i++) { arg = numbers[i]; sum += arg; } std::cout << sum << std::endl; } else { double input; // right now this is test code while I try and get standard input streaming working as expected while (std::cin) { std::cin >> input; std::cout << input << std::endl; } } // ... MORE IRRELEVANT CODE ... So, I guess my question is: does anyone see what is incorrect with this code in order to support piping standard input? Are there some well-known (or hidden) resources that explain clearly how to implement an example application supporting the basic Unix philosophy? @Chris Lutz I've changed the code to what's below. The problem is that cin still waits for user input on the console, and doesn't just take the standard input passed from the pipe. Am I missing something trivial for handling this? I haven't tried Greg Hewgill's answer yet, but don't see how that would help, since the issue is still with cin. // ... COMMAND LINE PROCESSING ... std::vector<double> numbers = multi.getValue(); // using TCLAP for command line parsing double sum = numbers[0]; double arg; for (int i=1; i < numbers.size(); i++) { arg = numbers[i]; sum += arg; } // right now this is test code while I try and get standard input streaming working as expected while (std::cin) { std::cin >> arg; std::cout << arg << std::endl; } std::cout << sum << std::endl; // ... MORE IRRELEVANT CODE ...
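
    One common shape for such tools: always sum the arguments, and additionally read stdin only when it is not a terminal, so interactive runs don't sit waiting for input. A sketch of that shape in Python; in the C++ above, the equivalent gate would be isatty(fileno(stdin)) around a while (std::cin >> input) loop:

    ```python
    import sys

    # Pipe-friendly add: sum argv, plus anything piped in on stdin.
    total = sum(float(a) for a in sys.argv[1:])
    if not sys.stdin.isatty():
        total += sum(float(tok) for tok in sys.stdin.read().split())
    print(total)
    ```

    Under this definition python add.py 1 2 | python add.py 3 4 prints 10.0, since every number reaches the final sum.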

    Read the article

  • UNIX FIFO: How to allow only one writer/reader pair to use a FIFO?

    - by Max Krug
    Hi! I've written two programs: the first, the "writer", creates a FIFO and writes data into it. The second, the "reader", runs in the background and looks for data in the FIFO. Once data is there, the reader reads it out. If I start e.g. two writers and two readers, they can all write/read into/from the same FIFO. How can I prevent a 3rd or 4th reader/writer from using the FIFO, and allow only one writer and one reader to use it? Thanks a lot. -- kind regards, Max Krug
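
    The FIFO itself won't enforce this, but each role can be guarded with an advisory lock: the first writer takes an exclusive lock and keeps it, and any later writer's non-blocking attempt fails immediately (readers get their own lock file so the two roles don't collide). A sketch of the writer side in Python, with file names and layout assumed:

    ```python
    import fcntl, os

    fifo = "/tmp/one_pair_fifo"  # hypothetical FIFO path

    # Writer-role lock: a sidecar file, distinct from the reader's lock.
    lock_fd = os.open(fifo + ".writer.lock", os.O_CREAT | os.O_RDWR)
    try:
        fcntl.flock(lock_fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except BlockingIOError:
        raise SystemExit("another writer already owns this FIFO")

    fd = os.open(fifo, os.O_WRONLY)  # blocks until a reader opens it
    os.write(fd, b"exclusive data\n")
    ```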

    Read the article

  • How do I redirect stdin/stdout when I have a sequence of commands in Bash?

    - by Tom
    I've currently got a Bash command being executed (via Python's subprocess::Popen) which is reading from stdin, doing something and outputting to stdout. Something along the lines of: pid = subprocess.Popen("cmd1 | cmd2", stdin=subprocess.PIPE, stdout=subprocess.PIPE, shell=True) output_data = pid.communicate( "input data\n" ) Now, what I want to do is to change that to execute another command in that same subshell that will alter the state before the next commands execute, so my shell command line will now (conceptually) be: cmd0; cmd1 | cmd2 Is there any way to have the input sent to cmd1 instead of cmd0 in this scenario? I'm assuming the output will include cmd0's output (which will be empty) followed by cmd2's output. cmd0 shouldn't actually read anything from stdin, does that make a difference in this situation? I know this is probably just a dumb way of doing this, I'm trying to patch in cmd0 without altering the other code too significantly. That said, I'm open to suggestions if there's a much cleaner way to approach this.
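
    Since cmd0 runs first and shares the subshell's stdin, it will consume the piped data if it ever reads stdin; if it truly never reads, cmd1 gets the data anyway. Redirecting cmd0's stdin from /dev/null makes that guarantee explicit. A sketch (cmd0/cmd1/cmd2 are the question's placeholders; the redirection is the suggested change, not code from the post):

    ```python
    import subprocess

    # cmd0's stdin is pointed away from the pipe, so the input data
    # is left intact for cmd1.
    pid = subprocess.Popen(
        "cmd0 </dev/null; cmd1 | cmd2",
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, shell=True,
    )
    output_data = pid.communicate(b"input data\n")
    ```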

    Read the article

  • How do you pipe output from a Ruby script to 'head' without getting a broken pipe error?

    - by dan
    I have a simple Ruby script that looks like this require 'csv' while line = STDIN.gets array = CSV.parse_line(line) puts array[2] end But when I try using this script in a Unix pipeline like this, I get 10 lines of output, followed by an error: ruby lib/myscript.rb < data.csv | head 12080450 12080451 12080517 12081046 12081048 12081050 12081051 12081052 12081054 lib/myscript.rb:4:in `write': Broken pipe - <STDOUT> (Errno::EPIPE) Is there a way to write the Ruby script in a way that prevents the broken pipe exception from being raised?
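
    head exits after ten lines, and the script's next write to the now-closed pipe raises EPIPE. In Ruby the usual fix is to rescue Errno::EPIPE around the writes and exit quietly. The same shape sketched in Python, following the broken-pipe recipe from the Python docs:

    ```python
    import os, sys

    # Once the downstream reader exits, writes raise BrokenPipeError.
    try:
        for line in sys.stdin:
            print(line.split(",")[2])  # naive stand-in for CSV parsing
    except BrokenPipeError:
        # Point stdout at /dev/null so the interpreter's exit-time
        # flush doesn't raise a second time, then leave.
        devnull = os.open(os.devnull, os.O_WRONLY)
        os.dup2(devnull, sys.stdout.fileno())
        sys.exit(1)
    ```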

    Read the article

  • Bash: Is it OK to use the same input file as output of a piped command?

    - by Amro
    Consider something like: cat file | command > file Is this good practice? Could this overwrite the input file at the same time as we are reading it, or is it always read into memory first and then piped to the second command? Obviously I can use temp files as an intermediary step, but I'm just wondering.. t=$(mktemp) cat file | command > ${t} && mv ${t} file
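
    It is not safe: the shell opens and truncates file for the > redirection before cat has a chance to read it, so the typical result is an empty or mangled file. The temp-file approach is the right instinct; the same pattern rendered in Python with an atomic rename (command stands in for the real filter):

    ```python
    import os, subprocess, tempfile

    # Filter "file" through a command, then atomically swap the result
    # into place: the reader and the writer never share a path.
    with open("file", "rb") as src, \
         tempfile.NamedTemporaryFile(dir=".", delete=False) as tmp:
        subprocess.run(["command"], stdin=src, stdout=tmp, check=True)
    os.replace(tmp.name, "file")  # atomic on POSIX
    ```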

    Read the article

  • Piping EOF problems with stdio and C++/Python

    - by yeus
    I've got some problems with EOF and stdio, and I have no idea what I am doing wrong. When I see an EOF in my program I clear stdin, and on the next round I try to read in a new line. The problem is: for some reason the getline function immediately (from the second run onward; the first works just as intended) returns an EOF instead of waiting for new input from my Python process... Any idea? Alright, here is the code: #include <string> #include <iostream> #include <iomanip> #include <limits> using namespace std; int main(int argc, char **argv) { for (;;) { string buf; if (getline(cin,buf)) { if (buf=="q") break; /*****///do some stuff with input //my actual filter program cout<<buf; /*****/ } else { if ((cin.rdstate() & istream::eofbit)!=0)cout<<"eofbit"<<endl; if ((cin.rdstate() & istream::failbit)!=0)cout<<"failbit"<<endl; if ((cin.rdstate() & istream::badbit)!=0)cout<<"badbit"<<endl; if ((cin.rdstate() & istream::goodbit)!=0)cout<<"goodbit"<<endl; cin.clear(); cin.ignore(numeric_limits<streamsize>::max()); //break;//I am not using break, because I //want more input when the parent //process puts data into stdin; } } return 0; } and in Python: from subprocess import Popen, PIPE import os from time import sleep proc=Popen(os.getcwd()+"/Pipingtest",stdout=PIPE,stdin=PIPE,stderr=PIPE); while(1): sleep(0.5) print proc.communicate("1 1 1") print "running"
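
    The immediate cause is on the Python side: communicate() writes its argument, closes the child's stdin, and waits for the process to exit, so from the C++ program's point of view stdin really is at EOF from then on, and clearing the stream state cannot bring data back. Talking to the pipes directly keeps stdin open between messages. A sketch, assuming the filter terminates each reply with a newline (the cout<<buf above would need an endl):

    ```python
    from subprocess import Popen, PIPE

    proc = Popen(["./Pipingtest"], stdin=PIPE, stdout=PIPE, text=True)
    for msg in ["1 1 1", "2 2 2"]:
        proc.stdin.write(msg + "\n")
        proc.stdin.flush()            # push the line through the pipe
        print(proc.stdout.readline(), end="")
    proc.stdin.write("q\n")           # the filter's quit command
    proc.stdin.close()
    proc.wait()
    ```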

    Read the article

  • Communicate multiple times with a process without breaking the pipe?

    - by Manux
    Hello, it's not the first time I'm having this problem, and it's really bugging me. Whenever I open a pipe using the Python subprocess module, I can only communicate with it once, as the documentation specifies: "Read data from stdout and stderr, until end-of-file is reached" proc = sub.Popen("psql -h darwin -d main_db".split(),stdin=sub.PIPE,stdout=sub.PIPE) print proc.communicate("select a,b,result from experiment_1412;\n")[0] print proc.communicate("select theta,zeta,result from experiment_2099\n")[0] The problem here is that the second time, Python isn't happy. Indeed, it decided to close the file after the first communicate: Traceback (most recent call last): File "a.py", line 30, in <module> print proc.communicate("select theta,zeta,result from experiment_2099\n")[0] File "/usr/lib64/python2.5/subprocess.py", line 667, in communicate return self._communicate(input) File "/usr/lib64/python2.5/subprocess.py", line 1124, in _communicate self.stdin.flush() ValueError: I/O operation on closed file So... multiple communications aren't allowed? I hope not ;) Please enlighten me.
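
    That is by design: communicate() sends the input, closes stdin, and reads until end-of-file, so it can run only once per process. For a fixed batch of queries the simplest route is a single call with everything joined together; for a genuinely interactive session you would write to proc.stdin and read proc.stdout yourself, or use a database driver rather than piping psql. A sketch of the batch form, reusing the question's command line:

    ```python
    from subprocess import Popen, PIPE

    proc = Popen("psql -h darwin -d main_db".split(),
                 stdin=PIPE, stdout=PIPE, text=True)
    out, _ = proc.communicate(
        "select a,b,result from experiment_1412;\n"
        "select theta,zeta,result from experiment_2099;\n"
    )
    print(out)
    ```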

    Read the article

  • Ruby: Read large data from stdout and stderr of an external process on Windows

    - by BinaryMuse
    Greetings, all, I need to run a potentially long-running process from Ruby on Windows and subsequently capture and parse the data from the external process's standard output and error. A large amount of data can be sent to each, but I am only necessarily interested in one line at a time (not capturing and storing the whole of the output). After a bit of research, I found that the Open3 class would take care of executing the process and giving me IO objects connected to the process's standard output and error (via popen3). Open3.popen3("external-program.bat") do |stdin, out, err, thread| # Step3.profit() ? end However, I'm not sure how to continually read from both streams without blocking the program. Since calling IO#readlines on out or err when a lot of data has been sent results in a memory allocation error, I'm trying to continuously check both streams for available input, but not having much luck with any of my implementations. Thanks in advance for any advice!
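
    On Windows select() only works on sockets, so the portable way to watch both pipes is one drain thread per stream, reading a line at a time; Ruby threads over Open3's IO objects follow the same pattern. A sketch of the idea in Python (external-program.bat is taken from the question):

    ```python
    import subprocess, threading

    # One thread per stream: bounded memory, and neither pipe can fill
    # up and stall the child while we're blocked on the other one.
    def drain(stream, label):
        for line in iter(stream.readline, b""):
            print(label, line.decode(errors="replace").rstrip())
        stream.close()

    proc = subprocess.Popen(["external-program.bat"],
                            stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    threads = [threading.Thread(target=drain, args=(proc.stdout, "OUT")),
               threading.Thread(target=drain, args=(proc.stderr, "ERR"))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    proc.wait()
    ```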

    Read the article

  • How to read piped input in Perl?

    - by Jenni
    I am trying to create something in Perl that is basically like the Unix "tee" command. I'm trying to read each line of STDIN, run a substitution on it, and print it. (And eventually, also print it to a file.) This works if I'm using console input, but if I try to pipe input to the command it doesn't do anything. Here's a simple example: print "about to loop\n"; while(<STDIN>) { s/2010/2009/; print; } print "done!\n"; I try to pipe the dir command to it like this: C:\perltest>dir | mytee.pl about to loop done! Why is it not seeing the piped input?
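
    The Perl itself is fine; on Windows the classic culprit is running the script through its .pl file association, which on some Windows versions does not hand the pipe's stdin to the script. Invoking the interpreter explicitly, as in dir | perl mytee.pl, usually cures it. For comparison, the same tee-with-substitution sketched in Python (it hits the same association pitfall if run as mytee.py instead of python mytee.py):

    ```python
    import sys

    # Read each piped line, substitute, echo.
    # Run as: dir | python mytee.py
    print("about to loop")
    for line in sys.stdin:
        sys.stdout.write(line.replace("2010", "2009"))
    print("done!")
    ```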

    Read the article

  • gzip several files and pipe them into one input

    - by Daniel
    I have this program that takes one argument for the source file and then parses it. I have several gzipped files that I would like to parse, but since the program only takes one input, I'm wondering if there is a way to create one huge file using gzip and then pipe it into the single input.
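
    Conveniently, concatenated gzip files are themselves a valid gzip stream, so cat *.gz | gunzip -c | yourprogram already yields one continuous input with no huge intermediate file. The same streaming done in Python (./parse stands in for the actual program):

    ```python
    import gzip, shutil, subprocess

    # Decompress each file back-to-back into the parser's single stdin.
    parser = subprocess.Popen(["./parse"], stdin=subprocess.PIPE)
    for name in ["a.gz", "b.gz", "c.gz"]:
        with gzip.open(name, "rb") as f:
            shutil.copyfileobj(f, parser.stdin)
    parser.stdin.close()
    parser.wait()
    ```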

    Read the article

  • Start two processes and connect them with a pipe in Delphi

    - by Steve
    I need to launch two external programs from my program and connect the STDOUT of the first one to the STDIN of the second program. How can you achieve this in Delphi (RAD Studio 2009, if it matters)? I'm operating in a Windows environment. As a command-line command my situation would look something like this: dumpdata.exe | encrypt.exe "mydata.dat"
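
    On Windows this comes down to CreatePipe plus two CreateProcess calls whose STARTUPINFO records point hStdOutput of the first program and hStdInput of the second at the same pipe, with the handles marked inheritable. The plumbing is easier to see in a short Python sketch than in the raw WinAPI calls a Delphi version would make:

    ```python
    import subprocess

    # dumpdata.exe | encrypt.exe "mydata.dat", wired up by hand.
    dump = subprocess.Popen(["dumpdata.exe"], stdout=subprocess.PIPE)
    encrypt = subprocess.Popen(["encrypt.exe", "mydata.dat"],
                               stdin=dump.stdout)
    dump.stdout.close()  # drop our copy of the handle; the children keep theirs
    encrypt.wait()
    ```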

    Read the article
