Pipeline for Downloading and Processing Files in a Unix/Linux Environment with Perl
- by neversaint
I have a list of file URLs that I want to download:
http://somedomain.com/foo1.gz
http://somedomain.com/foo2.gz
http://somedomain.com/foo3.gz
What I want to do is the following for each file:
Download foo1, foo2, ... in parallel with wget and nohup.
Each time a download completes, process that file with myscript.sh.
What I have is this:
#!/usr/bin/perl
use strict;
use warnings;

# File names taken from the URL list above (glob("foo*.gz") would only
# match files that already exist locally, which they don't yet).
my @files = qw(foo1.gz foo2.gz foo3.gz);
foreach my $file (@files) {
    my $downurl = "http://somedomain.com/" . $file;
    system("nohup wget $downurl &");              # returns immediately; download keeps running in the background
    system("./myscript.sh $file >> output.txt");  # runs right away, before the download has finished
}
The problem is that the script has no way of knowing when a download has finished: wget is pushed into the background, so the second system() call fires immediately and myscript.sh ends up operating on files that are still incomplete or missing.
What's the right way to achieve this?
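One possible approach, as a minimal sketch: fork one child process per file and, inside each child, run wget in the foreground so that myscript.sh starts only once that download has exited. The domain, file names, and myscript.sh come from the question above; everything else (the -q flag, the log layout) is illustrative.

#!/usr/bin/perl
use strict;
use warnings;

my @files = qw(foo1.gz foo2.gz foo3.gz);  # from the URL list above

foreach my $file (@files) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: system() blocks until wget exits, so myscript.sh
        # only runs after the file is completely downloaded.
        my $url = "http://somedomain.com/$file";
        if (system("wget", "-q", $url) == 0) {
            system("./myscript.sh $file >> output.txt");
        }
        exit 0;
    }
    # Parent: continue the loop immediately, so all downloads run in parallel.
}

# Reap every child, so the script exits only when all files are processed.
1 while wait() != -1;

Note that several children may append to output.txt at the same time, so lines from different files can interleave; having each child write to its own log (e.g. $file.out) avoids that. The same chaining also works as one backgrounded shell command per file, system("(wget -q $url && ./myscript.sh $file >> output.txt) &"), and CPAN's Parallel::ForkManager offers this fork-per-job pattern with a cap on how many jobs run at once.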