I would like to run several commands and capture all output to a logfile. I also want to print any errors to the screen (or optionally mail the output to someone).
Here's an example.
The following command will run three commands and write all output (STDOUT and STDERR) into a single logfile:
{ command1 && command2 && command3 ; } > logfile.log 2>&1
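For instance, with simple echo commands standing in for command1 through command3 (placeholder names, just to illustrate the behavior):

```shell
# Placeholder commands stand in for command1/command2/command3.
# Both streams land in the logfile; nothing appears on the terminal.
{ echo "step 1 ok"; echo "step 2 warning" >&2; echo "step 3 ok"; } > logfile.log 2>&1
cat logfile.log
# → step 1 ok
#   step 2 warning
#   step 3 ok
```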
Here is what I want to do with the output of these commands:
STDERR and STDOUT for all commands go to a logfile, in case I need it later. I usually won't look in here unless there are problems.
Print STDERR to the screen (or optionally, pipe it to /bin/mail), so that any error stands out and doesn't get ignored.
It would be nice if the return codes were still usable, so that I could do some error handling. Maybe I want to send email if there was an error, like this:
{ command1 && command2 && command3 ; } > logfile.log 2>&1 || mailx -s "There was an error" [email protected]
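To check that the exit status survives the redirection, here's a toy version with true/false standing in for the real commands (and echo standing in for mailx):

```shell
# false aborts the && chain, so the group exits with status 1,
# which triggers the || branch. echo stands in for mailx here.
{ true && false && echo "never reached"; } > steps.log 2>&1 \
  || echo "build failed with status $?"
# → build failed with status 1
```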
The problem I run into is that STDERR loses its identity during I/O redirection. A '2>&1' merges STDERR into STDOUT, and therefore I cannot capture errors separately if I do 2> error.log
Here are a couple of juicier examples. Let's pretend that I am running some familiar build commands, but I don't want the entire build to stop just because of one error, so I use the '--keep-going' flag.
{ ./configure && make --keep-going && make install ; } > build.log 2>&1
Or, here's a simple (and perhaps sloppy) build-and-deploy script, which will keep going in the event of an error.
{ ./configure && make --keep-going && make install && rsync -av /foo devhost:/foo ; } > build-and-deploy.log 2>&1
I think what I want involves some sort of Bash I/O Redirection, but I can't figure this out.
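For what it's worth, I suspect the answer involves bash process substitution, something like the sketch below (run_logged is just a name I made up), but I haven't been able to make it all hang together:

```shell
#!/usr/bin/env bash
# Sketch of what I'm after: stdout goes only to the log, while stderr
# goes to the log AND back to the screen, via bash process substitution.
run_logged() {
  local log=$1; shift
  "$@" > "$log" 2> >(tee -a "$log" >&2)
}

run_logged build.log sh -c 'echo "compiling..."; echo "warning: foo" >&2'
# "warning: foo" shows on the terminal; both lines end up in build.log
```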