Pipeline for Downloading and Processing Files in a Unix/Linux Environment with Perl

Posted by neversaint on Stack Overflow on 2010-04-16.

I have a list of file URLs that I want to download:

http://somedomain.com/foo1.gz
http://somedomain.com/foo2.gz
http://somedomain.com/foo3.gz

What I want to do, for each file, is:

  1. Download foo1, foo2, ... in parallel with wget and nohup.
  2. As soon as each download completes, process the file with myscript.sh.

What I have is this:

#!/usr/bin/perl
use strict;
use warnings;

# glob() only matches files that already exist locally,
# so this assumes the placeholder names are present.
my @files = glob("foo*.gz");

foreach my $file (@files) {
    my $downurls = "http://somedomain.com/" . $file;
    # nohup + & puts wget in the background, so control returns
    # immediately, before the download has actually finished.
    system("nohup wget $downurls &");
    system("./myscript.sh $file >> output.txt");
}

The problem is that the pipeline above has no way of knowing when a download has finished, so myscript.sh runs on files that are still incomplete and doesn't get executed properly.

What's the right way to achieve this?
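
For reference, one way to get the synchronization right is to fork one child process per file and run wget in the foreground inside the child, so myscript.sh only starts once its download has finished. This is just a minimal sketch, reusing the placeholder URL, file names, and script invocation from the question:

#!/usr/bin/perl
use strict;
use warnings;

# Placeholder names carried over from the question.
my @files = qw(foo1.gz foo2.gz foo3.gz);
my @pids;

foreach my $file (@files) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: wget runs in the foreground here, so this call
        # blocks until the download is complete.
        system("wget", "-q", "http://somedomain.com/$file") == 0
            or exit 1;
        # Only reached after a successful download.
        exec("./myscript.sh $file >> output.txt");
        exit 1;    # exec only returns if it failed
    }
    push @pids, $pid;    # parent loops on, so downloads run in parallel
}

# Block until every download-and-process pair has finished.
waitpid($_, 0) for @pids;

Each child handles one file end to end while the parent keeps spawning the rest, so the downloads still run in parallel. Note that several children appending to output.txt at once may interleave their output.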
