Search Results

Search found 1799 results on 72 pages for 'yahoo pipes'.


  • Trap SIGPIPE when trying to write without reader

    - by Matt
    I am trying to implement a named-pipe communication solution in Bash between two processes. The first process runs a script which echoes something into a named pipe: send(){ echo 'something' > $NAMEDPIPE } The second process is supposed to read the named pipe via another script which contains: while true;do if read line < $NAMEDPIPE;do someCommands fi done Note that the named pipe has been previously created with the traditional command mkfifo $NAMEDPIPE. My problem is that the reader script is not always running, so if the writer script tries to write into the named pipe it stays blocked until a reader connects to the pipe. I want to avoid this behavior, and a solution would be to trap a SIGPIPE signal. Indeed, according to man 7 signal, SIGPIPE is supposed to be sent when trying to write to a pipe with no reader. So I changed my send function to: send(){ trap 'echo "SIGPIPE received"' SIGPIPE echo 'something' > $NAMEDPIPE } But when I run the writer script, it stays blocked, and no "SIGPIPE received" message appears... Am I mistaken about the signal mechanism, or is there a better solution to my problem? Thank you for your help.
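
    A minimal Python sketch of one way around this blocking behaviour, assuming the FIFO already exists at a hypothetical path /tmp/testpipe: opening the write end with O_NONBLOCK fails immediately with ENXIO when no process has the read end open, so the writer can detect the missing reader instead of blocking (the same idea can be expressed in Bash with a timeout or a non-blocking redirect).

        import errno
        import os

        FIFO_PATH = "/tmp/testpipe"  # hypothetical path, created beforehand with mkfifo

        def send(message):
            """Try to write to the FIFO; return False if no reader is connected."""
            try:
                # O_NONBLOCK makes open() fail with ENXIO instead of blocking
                # when the read end of the FIFO is not open.
                fd = os.open(FIFO_PATH, os.O_WRONLY | os.O_NONBLOCK)
            except OSError as exc:
                if exc.errno == errno.ENXIO:
                    print("no reader connected, skipping write")
                    return False
                raise
            try:
                os.write(fd, (message + "\n").encode())
                return True
            finally:
                os.close(fd)

        if __name__ == "__main__":
            send("something")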


  • C check before writing to closed pipe

    - by Gary
    Is there an easy way to check if a pipe is closed before writing to it in C? I have a child and parent process, and the parent has a pipe to write to the child. However, if the child closes the pipe and the parent tries to write, I get a broken pipe error. So how can I check to make sure I can write to the pipe, so I can handle it as an error if I can't? Thanks!
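
    There is no reliable way to test the pipe first, because the reader can close it between the check and the write; the usual approach is to attempt the write and treat EPIPE as the error. A minimal Python sketch of that pattern (the C version would ignore SIGPIPE with signal()/sigaction() and check write() for EPIPE; the head child here is just a stand-in reader that closes its end early):

        import signal
        import subprocess

        # Ignore SIGPIPE so a write to a closed pipe reports an error (EPIPE /
        # BrokenPipeError) instead of killing the process, the same thing a C
        # parent does with signal(SIGPIPE, SIG_IGN).
        signal.signal(signal.SIGPIPE, signal.SIG_IGN)

        # 'head -n 1' stands in for a child that closes its stdin early.
        child = subprocess.Popen(["head", "-n", "1"], stdin=subprocess.PIPE)

        for i in range(1000):
            try:
                child.stdin.write(("line %d\n" % i).encode())
                child.stdin.flush()
            except BrokenPipeError:
                print("child closed the pipe; handling it as an error and stopping")
                break

        try:
            child.stdin.close()
        except BrokenPipeError:
            pass
        child.wait()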


  • linux piping (convert -> pdf2ps -> lp)

    - by Bor
    Ok, so I can print a PDF by doing: pdf2ps file.pdf - | lp -s. Now I want to use convert to merge several PDF files, which I can do with: convert file1.pdf file2.pdf merged.pdf. This merges file1.pdf and file2.pdf into merged.pdf, and the target can be replaced with '-'. Question: how could I pipe convert into pdf2ps and then into lp?


  • Is there any tutorial on connecting a .NET 4 pipeline with a pipeline from a C/C++ program?

    - by Ole Jak
    Is there any tutorial on connecting a .NET 4 pipeline with a pipeline from a C/C++ program? For example, how do I get data from VLC's pipe output? The VLC docs say its command-line arguments can take the name of a pipe, e.g. for YUV video output: --yuv-file=<string>. How do I give such a name to a pipe created in my .NET program, so that I can pass it to other programs or receive data from other programs?


  • Supporting Piping (A Useful Hello World)

    - by blastthisinferno
    I am trying to write a collection of simple C++ programs that follow the basic Unix philosophy by: Make each program do one thing well. Expect the output of every program to become the input to another, as yet unknown, program. I'm having an issue trying to get the output of one to be the input of the other, and getting the output of one to be the input of a separate instance of itself. Very briefly, I have a program add which takes arguments and spits out the summation. I want to be able to pipe the output to another add instance. ./add 1 2 | ./add 3 4 That should yield 6 but currently yields 10. I've encountered two problems: The cin waits for user input from the console. I don't want this, and haven't been able to find a simple example showing the use of the standard input stream without querying the user in the console. If someone knows of an example please let me know. I can't figure out how to use standard input while supporting piping. Currently, it appears it does not work. If I issue the command ./add 1 2 | ./add 3 4 it results in 7. The relevant code is below: add.cpp snippet // ... COMMAND LINE PROCESSING ... std::vector<double> numbers = multi.getValue(); // using TCLAP for command line parsing if (numbers.size() > 0) { double sum = numbers[0]; double arg; for (int i=1; i < numbers.size(); i++) { arg = numbers[i]; sum += arg; } std::cout << sum << std::endl; } else { double input; // right now this is test code while I try and get standard input streaming working as expected while (std::cin) { std::cin >> input; std::cout << input << std::endl; } } // ... MORE IRRELEVANT CODE ... So, I guess my question is: does anyone see what is incorrect with this code in order to support piping standard input? Are there any well-known (or hidden) resources that explain clearly how to implement an example application supporting the basic Unix philosophy? @Chris Lutz I've changed the code to what's below. The problem is that cin still waits for user input on the console, and doesn't just take the standard input passed from the pipe. Am I missing something trivial for handling this? I haven't tried Greg Hewgill's answer yet, but I don't see how that would help, since the issue is still with cin. // ... COMMAND LINE PROCESSING ... std::vector<double> numbers = multi.getValue(); // using TCLAP for command line parsing double sum = numbers[0]; double arg; for (int i=1; i < numbers.size(); i++) { arg = numbers[i]; sum += arg; } // right now this is test code while I try and get standard input streaming working as expected while (std::cin) { std::cin >> arg; std::cout << arg << std::endl; } std::cout << sum << std::endl; // ... MORE IRRELEVANT CODE ...
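
    A minimal sketch of the same idea in Python rather than the asker's C++ (the add name is just the example from the question): only consume standard input when it is not an interactive terminal, so a lone invocation doesn't sit waiting for keyboard input, while a piped invocation adds whatever the upstream instance printed. The C++ equivalent of the isatty check is isatty(fileno(stdin)).

        #!/usr/bin/env python3
        import sys

        def main():
            # Numbers given as command-line arguments.
            total = sum(float(a) for a in sys.argv[1:])

            # Only read stdin when it is a pipe or file, not an interactive console,
            # so "./add 3 4" on its own does not block waiting for keyboard input.
            if not sys.stdin.isatty():
                for line in sys.stdin:
                    for token in line.split():
                        total += float(token)

            print(total)

        if __name__ == "__main__":
            main()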


  • How do you pipe output from a Ruby script to 'head' without getting a broken pipe error?

    - by dan
    I have a simple Ruby script that looks like this require 'csv' while line = STDIN.gets array = CSV.parse_line(line) puts array[2] end But when I try using this script in a Unix pipeline like this, I get 10 lines of output, followed by an error: ruby lib/myscript.rb < data.csv | head 12080450 12080451 12080517 12081046 12081048 12081050 12081051 12081052 12081054 lib/myscript.rb:4:in `write': Broken pipe - <STDOUT> (Errno::EPIPE) Is there a way to write the Ruby script in a way that prevents the broken pipe exception from being raised?
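
    The same situation sketched in Python: when the downstream head exits after ten lines, further writes to standard output fail with a broken pipe, and one common fix is to catch that error and exit quietly (pointing stdout at devnull before exiting avoids a second broken-pipe error when the interpreter flushes its buffers; in Ruby the analogous move is rescuing Errno::EPIPE around the puts).

        #!/usr/bin/env python3
        import csv
        import os
        import sys

        def main():
            try:
                for row in csv.reader(sys.stdin):
                    print(row[2])   # mirrors the asker's array[2]; assumes three columns
            except BrokenPipeError:
                # The consumer (e.g. `head`) closed the pipe.  Point stdout at devnull
                # so the interpreter's final flush doesn't fail again, then exit quietly.
                devnull = os.open(os.devnull, os.O_WRONLY)
                os.dup2(devnull, sys.stdout.fileno())
                sys.exit(0)

        if __name__ == "__main__":
            main()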


  • UNIX FIFO: How to allow only one writer/reader pair to use a FIFO?

    - by Max Krug
    Hi! I've written two programs: the first, the "writer", creates a FIFO and writes data into it. The second one, the "reader", runs in the background and looks for data in the FIFO. Once data is there, the reader reads it out. If I start e.g. two writers and two readers, they can all write to and read from the same FIFO. How can I prevent a third or fourth reader/writer from using the FIFO and allow only one writer and one reader to use it? Thanks a lot. -- kind regards, Max Krug
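
    One common way to get this behaviour (a sketch, not necessarily how the asker's programs are structured): guard each role with an exclusive, non-blocking advisory lock on a companion lock file, so a second writer or second reader fails immediately instead of joining in. The FIFO and lock file names below are hypothetical.

        import fcntl

        FIFO = "/tmp/demo_fifo"          # hypothetical FIFO, created with mkfifo
        WRITER_LOCK = FIFO + ".writer"   # one companion lock file per role
        READER_LOCK = FIFO + ".reader"   # the reader process locks this one instead

        def acquire_role(lock_path):
            """Return an open lock file if the role is free, else raise BlockingIOError."""
            lock = open(lock_path, "w")
            # LOCK_NB makes this fail immediately when another process of the same
            # role already holds the lock, instead of waiting for it.
            fcntl.flock(lock, fcntl.LOCK_EX | fcntl.LOCK_NB)
            return lock   # keep it open (and therefore held) for the process lifetime

        # Writer side; the reader does the same with READER_LOCK before opening the FIFO.
        try:
            writer_lock = acquire_role(WRITER_LOCK)
        except BlockingIOError:
            raise SystemExit("another writer is already using the FIFO")

        with open(FIFO, "w") as fifo:    # blocks here until one reader opens the FIFO
            fifo.write("hello from the single allowed writer\n")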


  • How do I redirect stdin/stdout when I have a sequence of commands in Bash?

    - by Tom
    I've currently got a Bash command being executed (via Python's subprocess.Popen) which is reading from stdin, doing something and outputting to stdout. Something along the lines of: pid = subprocess.Popen( ["-c", "cmd1 | cmd2"], stdin = subprocess.PIPE, stdout = subprocess.PIPE, shell =True ) output_data = pid.communicate( "input data\n" ) Now, what I want to do is change that to execute another command in that same subshell that will alter the state before the next commands execute, so my shell command line will now (conceptually) be: cmd0; cmd1 | cmd2 Is there any way to have the input sent to cmd1 instead of cmd0 in this scenario? I'm assuming the output will include cmd0's output (which will be empty) followed by cmd2's output. cmd0 shouldn't actually read anything from stdin; does that make a difference in this situation? I know this is probably just a dumb way of doing this; I'm trying to patch in cmd0 without altering the other code too significantly. That said, I'm open to suggestions if there's a much cleaner way to approach this.
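
    One way to make sure the piped data reaches cmd1 (a sketch using the placeholder commands from the question): give cmd0 its own empty standard input so it cannot consume any of the data, e.g. cmd0 </dev/null; cmd1 | cmd2. Driven from Python that could look like:

        import subprocess

        # cmd0, cmd1 and cmd2 are the placeholders from the question.
        shell_line = "cmd0 </dev/null; cmd1 | cmd2"

        pid = subprocess.Popen(
            shell_line,                  # a single string when shell=True
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            shell=True,
        )
        # Everything written here goes to cmd1, because cmd0's stdin is /dev/null.
        output_data, _ = pid.communicate(b"input data\n")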


  • Bash: Is it ok to use same input file as output of a piped command?

    - by Amro
    Consider something like: cat file | command > file Is this good practice? Could this overwrite the input file at the same time as we are reading it, or is it always read into memory first and then piped to the second command? Obviously I can use temp files as an intermediary step, but I'm just wondering... t=$(mktemp) cat file | command > ${t} && mv ${t} file
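
    The `> file` redirection truncates the output file as the pipeline starts, so cat usually ends up reading an already-empty file; the temp-file-and-rename approach is the right instinct. A minimal Python sketch of that pattern (the command and file names are placeholders): write to a temporary file in the same directory, then atomically replace the original.

        import os
        import subprocess
        import tempfile

        def filter_in_place(path, command):
            """Run `command` with `path` as stdin, then atomically replace `path`."""
            directory = os.path.dirname(os.path.abspath(path))
            # Keep the temp file in the same directory so os.replace() stays on the
            # same filesystem and the final rename is atomic.
            fd, tmp_path = tempfile.mkstemp(dir=directory)
            try:
                with open(path, "rb") as src, os.fdopen(fd, "wb") as dst:
                    subprocess.run(command, stdin=src, stdout=dst, check=True)
                os.replace(tmp_path, path)   # atomic rename over the original
            except BaseException:
                os.unlink(tmp_path)
                raise

        # Equivalent of: command < file > "$t" && mv "$t" file
        # filter_in_place("file", ["sort"])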


  • Piping EOF problems with stdio and C++/Python

    - by yeus
    I've got some problems with EOF and stdio. I have no idea what I am doing wrong. When I see an EOF in my program I clear stdin and the next round I try to read in a new line. The problem is: for some reason the getline function immediately (from the second run always; the first works just as intended) returns an EOF instead of waiting for new input from my Python process... Any idea? All right, here is the code: #include <string> #include <iostream> #include <iomanip> #include <limits> using namespace std; int main(int argc, char **argv) { for (;;) { string buf; if (getline(cin,buf)) { if (buf=="q") break; /*****///do some stuff with input //my actual filter program cout<<buf; /*****/ } else { if ((cin.rdstate() & istream::eofbit)!=0)cout<<"eofbit"<<endl; if ((cin.rdstate() & istream::failbit)!=0)cout<<"failbit"<<endl; if ((cin.rdstate() & istream::badbit)!=0)cout<<"badbit"<<endl; if ((cin.rdstate() & istream::goodbit)!=0)cout<<"goodbit"<<endl; cin.clear(); cin.ignore(numeric_limits<streamsize>::max()); //break;//I am not using break, because I //want more input when the parent //process puts data into stdin; } } return 0; } and in Python: from subprocess import Popen, PIPE import os from time import sleep proc=Popen(os.getcwd()+"/Pipingtest",stdout=PIPE,stdin=PIPE,stderr=PIPE); while(1): sleep(0.5) print proc.communicate("1 1 1") print "running"


  • Communicate multiple times with a process without breaking the pipe?

    - by Manux
    Hello, it's not the first time I'm having this problem and it's really bugging me. Whenever I open a pipe using the Python subprocess module, I can only communicate with it once, as the documentation specifies: Read data from stdout and stderr, until end-of-file is reached proc = sub.Popen("psql -h darwin -d main_db".split(),stdin=sub.PIPE,stdout=sub.PIPE) print proc.communicate("select a,b,result from experiment_1412;\n")[0] print proc.communicate("select theta,zeta,result from experiment_2099\n")[0] The problem here is that the second time, Python isn't happy. Indeed, it decided to close the file after the first communicate: Traceback (most recent call last): File "a.py", line 30, in <module> print proc.communicate("select theta,zeta,result from experiment_2099\n")[0] File "/usr/lib64/python2.5/subprocess.py", line 667, in communicate return self._communicate(input) File "/usr/lib64/python2.5/subprocess.py", line 1124, in _communicate self.stdin.flush() ValueError: I/O operation on closed file So... multiple communications aren't allowed? I hope not ;) Please enlighten me.
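
    communicate() is indeed one-shot: it writes the input, closes the child's stdin and reads until end-of-file. A minimal sketch of talking to a child repeatedly instead, by writing to proc.stdin and reading from proc.stdout line by line (a trivial cat child stands in for the asker's psql session; with psql you would also need your own way of telling where each result ends):

        import subprocess

        # A line-echoing child stands in for an interactive process such as psql.
        proc = subprocess.Popen(
            ["cat"],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
            text=True,
            bufsize=1,                 # line-buffered on the Python side
        )

        def ask(line):
            proc.stdin.write(line + "\n")
            proc.stdin.flush()                       # push the line to the child now
            return proc.stdout.readline().rstrip("\n")

        print(ask("select a,b,result from experiment_1412;"))
        print(ask("select theta,zeta,result from experiment_2099;"))

        proc.stdin.close()                           # signal EOF only when finished
        proc.wait()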


  • Ruby: Read large data from stdout and stderr of an external process on Windows

    - by BinaryMuse
    Greetings, all, I need to run a potentially long-running process from Ruby on Windows and subsequently capture and parse the data from the external process's standard output and error. A large amount of data can be sent to each, but I am only interested in one line at a time (not capturing and storing the whole of the output). After a bit of research, I found that the Open3 class would take care of executing the process and giving me IO objects connected to the process's standard output and error (via popen3). Open3.popen3("external-program.bat") do |stdin, out, err, thread| # Step3.profit() ? end However, I'm not sure how to continually read from both streams without blocking the program. Since calling IO#readlines on out or err when a lot of data has been sent results in a memory allocation error, I'm trying to continuously check both streams for available input, but I'm not having much luck with any of my implementations. Thanks in advance for any advice!
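
    A common pattern for this, sketched in Python rather than Ruby since the idea carries over directly: give each stream its own reader thread that consumes one line at a time, so neither pipe can fill up and stall the child, and memory use stays bounded.

        import subprocess
        import threading

        def drain(stream, label):
            # Reading line by line keeps memory bounded even for very large outputs.
            for line in stream:
                print("[%s] %s" % (label, line.rstrip()))   # parse/handle one line here
            stream.close()

        proc = subprocess.Popen(
            ["external-program.bat"],        # the asker's placeholder program
            stdout=subprocess.PIPE,
            stderr=subprocess.PIPE,
            text=True,
        )

        # One reader thread per pipe, so a chatty stderr cannot stall stdout (or vice versa).
        threads = [
            threading.Thread(target=drain, args=(proc.stdout, "out")),
            threading.Thread(target=drain, args=(proc.stderr, "err")),
        ]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        proc.wait()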


  • How to read piped input in Perl?

    - by Jenni
    I am trying to create something in Perl that is basically like the Unix "tee" command. I'm trying to read each line of STDIN, run a substitution on it, and print it. (And eventually, also print it to a file.) This works if I'm using console input, but if I try to pipe input to the command it doesn't do anything. Here's a simple example: print "about to loop\n"; while(<STDIN>) { s/2010/2009/; print; } print "done!\n"; I try to pipe the dir command to it like this: C:\perltest> dir | mytee.pl about to loop done! Why is it not seeing the piped input?


  • gzip several files and pipe them into one input

    - by Daniel
    I have a program that takes one argument for the source file and then parses it. I have several gzipped files that I would like to parse, but since the program only takes one input, I'm wondering if there is a way to create one huge file using gzip and then pipe it into that single input.
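
    One property worth knowing here (a sketch; the file names are hypothetical and it assumes the parser can read the combined gzip file or its decompressed stream): the gzip format allows members to be concatenated, so simply concatenating the .gz files yields one valid gzip file whose decompressed content is the concatenation of the originals.

        import gzip
        import shutil

        parts = ["part1.gz", "part2.gz", "part3.gz"]   # hypothetical input files

        # Concatenating gzip files produces one valid multi-member gzip file.
        with open("combined.gz", "wb") as combined:
            for name in parts:
                with open(name, "rb") as part:
                    shutil.copyfileobj(part, combined)

        # Python's gzip module (like zcat) reads all members transparently.
        with gzip.open("combined.gz", "rt") as fh:
            for line in fh:
                pass  # hand each decompressed line to the parser here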


  • How to create a named pipe accessible only on your machine? (VS08 C++)

    - by Ole Jak
    Hello, I have created a program that writes a video stream to a named pipe on Windows, using Visual Studio C++ 2008. How can I be sure that no one except programs on this computer can access this pipe? npipe = CreateNamedPipe("\\\\.\\pipe\\TestChannel", PIPE_ACCESS_DUPLEX, PIPE_TYPE_MESSAGE | PIPE_WAIT, PIPE_UNLIMITED_INSTANCES , 1024, 1024,5000,NULL);


  • Start two processes and connect them with a pipe in Delphi

    - by Steve
    I need to launch two external programs in my program and connect the STDOUT of the first one to the STDIN of the second program. How can you achieve this in Delphi (RAD Studio 2009, if it matters)? I'm operating in a Windows environment. As a command-line command my situation would look something like this: dumpdata.exe | encrypt.exe "mydata.dat"
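
    For comparison, here is the same wiring sketched with Python's subprocess module rather than Delphi (in Delphi the rough equivalent is two CreateProcess calls sharing an anonymous pipe: the write end as the first child's standard output handle and the read end as the second child's standard input handle). The program names are the placeholders from the question.

        import subprocess

        # The first process writes to a pipe...
        producer = subprocess.Popen(
            ["dumpdata.exe"],
            stdout=subprocess.PIPE,
        )

        # ...and the second process reads that same pipe as its standard input.
        consumer = subprocess.Popen(
            ["encrypt.exe", "mydata.dat"],
            stdin=producer.stdout,
        )

        producer.stdout.close()   # let the consumer see EOF when the producer exits
        consumer.wait()
        producer.wait()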


  • piping to variables

    - by lego69
    cut -d" " -f2 ${2} | $callsTo Hello, can somebody please explain whether I can pipe the result of cut to the variable callsTo, and how it will be stored: as a string or as a list?


  • Best way to convert Stream (of unknown length) to byte array, in .NET?

    - by Frank Hamming
    Hello, I have the following code to read data from a Stream (in this case, from a named pipe) and into a byte array: // NPSS is an instance of NamedPipeServerStream int BytesRead; byte[] StreamBuffer = new byte[BUFFER_SIZE]; // defined elsewhere (less than total possible message size, though) MemoryStream MessageStream = new MemoryStream(); do { BytesRead = NPSS.Read(StreamBuffer, 0, StreamBuffer.Length); MessageStream.Write(StreamBuffer, 0, BytesRead); } while (!NPSS.IsMessageComplete); byte[] Message = MessageStream.ToArray(); // final data Could you please take a look and let me know if it can be done more efficiently or neatly? Seems a bit messy as it is, using a MemoryStream. Thanks!


  • Why do my websites have a first page rank on Bing and Yahoo but not Google? [closed]

    - by Linda Cullum
    I have 3 websites suffering from a drop in ranking with Google and hence a huge drop in traffic. The sudden drop occurred in September and I have not been able to remedy it. For the past 6-10 years my main website http://LearnToSail.Net has ranked from #3 to #1 on the 1st page of Google and all the other engines for the search term "learn to sail". Now it shows on the 1st page of Bing and Yahoo but does not show up on ANY page of Google. The only way it does come up is if I add "cd" to the "learn to sail" phrase. We sell a sailing CD on that website. The other websites are http://LearnToSailOnLine.com (search terms "learn to sail online" or "learntosailonline") and historyofthepilgrims.com (search terms "history of the pilgrims" or "historyofthepilgrims"). I get the same result: gone on Google but on the 1st pages of Bing and Yahoo. I have researched, edited and updated blogs, made sitemaps, prayed to the universe and used Google Webmaster Tools, but nothing is changing and I have lost a lot of business. I host with 1and1.com and have been back and forth with them, but to no avail and no change in traffic. I thought maybe some DNS mapping was off. I used to have a lot of traffic; now I have hardly any. Any advice would be greatly appreciated. I am still in the process of working on the issue, of course! This is a really great website here and I am glad I came across it. Thank you, LS Cullum Little Pines Multimedia


  • OpenID on Google not returning anything

    - by PlayKid
    Hi there, For some reason, the following code does not return anything: string alias = response.FriendlyIdentifierForDisplay; var sreg = response.GetExtension<ClaimsResponse>(); if (sreg != null && sreg.MailAddress != null) { alias = sreg.MailAddress.User; } if (sreg != null && !string.IsNullOrEmpty(sreg.Email)) { alias = sreg.Email; } if (sreg != null && !string.IsNullOrEmpty(sreg.FullName)) { alias = sreg.FullName; } I was hoping I could get the email from Yahoo or Google, but sreg just returns null whichever provider I choose. I saw in some other posts that this code should return an e-mail at least, but for me it does not. Please assist. Thanks a lot.

