So, I've got some code that uses Net::FTP.
Now and again, I have to transfer really, really big files. The whole system is automated, but it hangs and dies on a call to $ftp->put($filename).
So the app dies, the sysadmin restarts it, and it starts over again, transmitting the same file, then hangs and dies again. It never gets to the rest of the files, some of which go to FTP sites that actually, you know, work.
I'm thinking of refactoring the code so that it just reports the error instead of dying. Then it can process the transfers that might work, and come back afterward to retry the big file that only succeeds rarely.
To do that, I'm planning to wrap an eval block around the offending code and check $@ to see whether the call died.
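Roughly what I have in mind (a minimal sketch, with made-up host, credentials, and file names; the real code gets its file list elsewhere):

    use strict;
    use warnings;
    use Net::FTP;

    # Hypothetical host, credentials, and file list (illustration only).
    my $ftp = Net::FTP->new('ftp.example.com', Timeout => 120)
        or die "Cannot connect: $@";
    $ftp->login('user', 'password')
        or die "Login failed: ", $ftp->message;
    $ftp->binary;

    my @files = ('huge.dat', 'normal.dat');   # stand-ins for the real list
    my @failed;

    for my $filename (@files) {
        # put() can die outright (e.g. on a timeout), so trap it in eval.
        eval { $ftp->put($filename) or die "put returned failure\n" };
        if ($@) {
            warn "Transfer of $filename failed: $@";
            push @failed, $filename;          # come back to these later
        }
    }

    # Second pass: retry whatever failed the first time around.
    for my $filename (@failed) {
        eval { $ftp->put($filename) or die "put returned failure\n" };
        warn "Retry of $filename also failed: $@" if $@;
    }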
Does this sound like a good approach?
regards,
There's no rule that says you have to stop the program if one file transfer fails, so yeah, that's how you can do it. Remember to include
$ftp->message
in the error message. Does a command line FTP of the same file to the same site work?
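Something like this (a sketch; $filename stands in for whatever you're sending):

    # Report the failure and the server's last reply instead of dying.
    eval { $ftp->put($filename) or die "server said: " . $ftp->message };
    warn "put($filename) failed: $@" if $@;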