I've pretty much completed the scheduler I have been developing - I just need to polish the web gui and integrate log4perl properly.
Anyway - one of the things it does is run commands either as Perl via do() or externally via system().
It captures STDERR and STDOUT by duplicating the current filehandle for each and reopening them to files, then reopening the saved filehandles again after the command has run. That bit is fairly simple, but it allows me to slurp in the stderr output after running the command, or if the whole process running the command dies unexpectedly. In a scheduler with dozens of jobs to run, many of them in development, this is incredibly useful:
###
open(OLD_STDERR,">&STDERR") or warn "Failed to save STDERR";
open(STDERR,">$stderr_filename") or warn "Failed to redirect STDERR";
open(OLD_STDOUT,">&STDOUT") or warn "Failed to save STDOUT";
open(STDOUT,">$stdout_filename") or warn "Failed to redirect STDOUT";
print "$$\n"; # don't want an empty file
warn "$$\n"; # don't want an empty file
# do stuff ...
open(STDOUT,">&OLD_STDOUT") or warn "Failed to restore STDOUT";
open(STDERR,">&OLD_STDERR") or warn "Failed to restore STDERR";
###
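Reading the captured output back afterwards is then just a slurp of those files; a minimal sketch (not from the original post), reusing the $stderr_filename above:
###
# slurp the captured stderr back in once the command has finished
my $stderr_output = '';
if ( open( my $fh, '<', $stderr_filename ) ) {
    local $/;                  # slurp mode - read the whole file at once
    $stderr_output = <$fh>;
    close($fh);
}
else {
    warn "Failed to read $stderr_filename: $!";
}
###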
The tricky bit this morning was handling scripts that 'exit' when executed within do. I want to capture that exit and continue, but any other exits should happen normally.
The only solution that actually worked was to copy the CORE::GLOBAL::exit glob somewhere safe, point the glob at a sub that warns/logs the attempted exit (with time and pid) and carries on, and then copy the original glob back:
###
eval {
    # save the current exit glob so it can be restored afterwards
    *real_exit = *CORE::GLOBAL::exit;
    # replace exit with a sub that just logs the attempt and carries on
    *CORE::GLOBAL::exit = sub {
        warn localtime() . " pid : $$ attempted to exit! caller : ",
             join(', ', caller()), "\n";
    };
    $ok = do "$filename";
    # check in increasing order of detail so the most specific message wins
    $error = "couldn't run $filename"       unless $ok;
    $error = "couldn't do $filename: $!"    unless defined $ok;
    $error = "couldn't parse $filename: $@" if $@;
    # put the original exit back
    *CORE::GLOBAL::exit = *real_exit;
};
###
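For illustration, a hypothetical job script run via do() might bail out early with exit; with the override in place that call is merely logged and execution carries on, rather than taking the whole scheduler down:
###
# hypothetical contents of $filename, e.g. jobs/cleanup.pl
my @stale_files = glob('/tmp/myapp_*.tmp');
exit 0 unless @stale_files;   # logged by the override instead of killing the process
unlink @stale_files;
print "removed ", scalar(@stale_files), " stale files\n";
1;                            # so do() returns a true value
###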
Getting it working was helped immensely by Nik and Davorg on #london.pm, following Nik's talk at the last tech meet where he demonstrated how to override stuff in a script that is being run.
local()
###
eval {
    local *CORE::GLOBAL::exit = sub {
        warn localtime() . " pid : $$ attempted to exit! caller : ",
             join(', ', caller()), "\n";
    };
    $ok = do "$filename";
    $error = "couldn't parse $filename: $@" if $@;
    $error = "couldn't do $filename: $!" unless defined $ok;
    $error = "couldn't run $filename" unless $ok;
};
###
-Dom
Re:local()
TeeJay on 2006-03-15T20:08:06
yes I tried it - despite being in an anon block inside an eval it didn't descope for some reason - possibly I fluffed it, but exits ended up getting caught in unexpected places elsewhere in the program.
Re:local()
broquaint on 2006-03-16T11:46:38
Dom's code won't work because core subroutines can only be overridden at compile-time. Also, you don't need to store a copy of *CORE::GLOBAL::exit as it can always be found in CORE::exit (although you can't take a reference to it, so just use a wee closure, e.g. *CORE::GLOBAL::exit = sub { CORE::exit $_[0] };).
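Applied to the snippet above, that means the saved *real_exit glob can be dropped entirely; a rough sketch along those lines (my wording, not from the thread):
###
# log attempted exits while the job runs ...
*CORE::GLOBAL::exit = sub {
    warn localtime() . " pid : $$ attempted to exit!\n";
};
$ok = do "$filename";
# ... then hand exit straight back to the core builtin via a closure
*CORE::GLOBAL::exit = sub { CORE::exit( defined $_[0] ? $_[0] : 0 ) };
###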
Re:local()
broquaint on 2006-03-16T11:51:49
I'm an idiot. Dom's code should work since the code in the context of the do will be evaluated at compile-time.
Re:local()
Dom2 on 2006-03-16T19:19:48
Oooh, crap, I didn't know that. :-( Thanks for pointing it out. -Dom
Re:system calls with filehandle magic...
TeeJay on 2006-03-16T09:43:57
It kind of evolved. I'd be tempted to replace it with IPC::Run, but only if it offers exactly the same stuff without adding complexity.
Although, to be honest - it's working very nicely now.
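For what it's worth, the external-command half of that could look roughly like this with IPC::Run (a sketch only - $command and @args are placeholders, and it wouldn't cover the do() case):
###
use IPC::Run qw(run);

# run an external command, capturing its stdout and stderr into scalars
my ( $in, $out, $err ) = ( '', '', '' );
run( [ $command, @args ], \$in, \$out, \$err )
    or warn "command failed: exit status $?";
###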