I'm running into a problem where I need to make sure a set of servers all have identical Perl modules. Up to now I've been doing things manually (badly), but things break if I forget to install a module or forget that I've upgraded one.
Is there some way to treat one machine as the "master" and make sure all the rest have the same modules and the same versions as it?
Mounting?
clscott on 2003-08-21T13:39:44
Use whatever network file services your OS provides (like NFS or SMB).
Then one server can be the master and the others can mount its module directories.
syncing CPAN modules
jbisbee on 2003-08-21T13:53:55
Right now we migrate code via CVS export and rsync, using
cvs rtag TAG MODULE1 MODULE2
and then psync (a Perl wrapper around rsync with a config file for doing multiple rsyncs).
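That tag-then-push flow can be sketched as a plain shell loop (the module names, export path, and host list below are all made up, and psync itself is replaced by a for loop; the DRYRUN=echo guard keeps the script harmless until you remove it):

```shell
#!/bin/sh
# Sketch of the tag-then-push migration described above.
# DRYRUN=echo just prints each command; remove it to run for real.
DRYRUN=echo
TAG="REL_$(date +%Y%m%d)"

# Tag the modules, then export a pristine (CVS-free) tree of that tag.
$DRYRUN cvs rtag "$TAG" MODULE1 MODULE2
$DRYRUN cvs export -r "$TAG" -d "/tmp/export-$TAG" MODULE1 MODULE2

# Push the exported tree to each server, deleting anything stale.
for host in web1 web2 web3; do
    $DRYRUN rsync -az --delete "/tmp/export-$TAG/" "$host:/usr/local/app/"
done
```

Because the export is driven by a tag, rolling back is just re-exporting the previous tag and rsyncing again.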
Anyway, I was recently battling this exact same problem when upgrading DBI and DBD::Sybase on our servers, and I wanted the changes to go out with the migration (I also wanted a way to roll back in case all HELL broke loose).
My solution was to make a new CVS module called CPAN and then install the CPAN modules into that directory:
perl -ICPAN Makefile.PL LIB=CPAN
make
make test
make install
I then went into the CPAN directory and did a cvs add on all the files that were installed (making sure to use cvs add -kb for binary files, to turn off keyword expansion and tell CVS it's binary).
This solution really rocks because now a migration can include new or upgraded CPAN modules and then I can forget about it (no more having to update 20 servers with the updated modules).
One shortcoming I found: even though I did a perl -ICPAN Makefile.PL, when I ran make test the tests didn't include the CPAN directory. This boggled me a bit when DBD::Sybase compiled against a newer version of DBI but the tests didn't pick up the CPAN directory, so everything failed :( (I solved this by editing the tests to include my new library path, but I'm sure there is an easier way to do this.)
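One standard way around that test-path problem (a sketch, not verified against that particular DBD::Sybase build, and the CPAN path here is hypothetical) is to put the private library directory on PERL5LIB, which perl prepends to @INC, before running the tests:

```shell
# Make the CVS-controlled CPAN directory visible to "make test" so the
# tests pick up the newer DBI installed there instead of the system one.
# DRYRUN=echo keeps this harmless outside a real build directory.
DRYRUN=echo
PERL5LIB="/path/to/CPAN:$PERL5LIB"
export PERL5LIB
$DRYRUN make test
```

This avoids editing individual test files, since every perl the test harness spawns inherits the environment.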
Re:syncing CPAN modules
hfb on 2003-08-21T14:07:44
Have you tried using CPAN::Site?
try reading the CPAN FAQ :)
hfb on 2003-08-21T14:05:49
You could use either a Bundle or an autobundle, which are slightly different: autobundle makes a bundle of everything, including the core modules. Both are covered in reasonable detail in the FAQ.

You can also use ExtUtils::Installed to generate a list of installed modules [ also in the FAQ ] on all the boxes, compare them, and then either generate a Bundle or use CPAN.pm to install. And "perl -MCPAN -e 'CPAN::Shell->install(CPAN::Shell->r)'" will update everything automatically with CPAN.pm, since CPAN::Shell->r lists the modules that have newer versions on CPAN.

And, should you be using Solaris, you can use mkpkg to generate Solaris packages, which are awfully handy in such situations. I wouldn't recommend NFS for installations, as the second your network goes tits up you're screwed, especially if you have perl boot scripts.
:)
Re:try reading the CPAN FAQ :)
gav on 2003-08-21T18:02:31
Thanks for the tip. ExtUtils::Installed works beautifully. I think the trick will be to create bundles and push them to the other servers; the only problem seems to be modules that have interactive installers.
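For the interactive installers, setting PERL_MM_USE_DEFAULT usually helps: CPAN.pm and most Makefile.PL prompts take the default answer when it is set (though a few stubborn installers ignore it). A sketch, with a hypothetical bundle name and a DRYRUN guard so nothing actually installs:

```shell
# Answer every prompt with its default so bundle installs run unattended.
DRYRUN=echo
PERL_MM_USE_DEFAULT=1
export PERL_MM_USE_DEFAULT
$DRYRUN perl -MCPAN -e 'install Bundle::MyServers'
```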
Fotango::Build
james on 2003-08-21T16:52:48
We have the same issue at Fotango. Essentially we've come to the decision that if something is used by our source, then we need to version control it. The net result has been doing vendor imports of huge amounts of source code (i.e. Apache, mod_perl, Perl, all CPAN modules, etc.) into our version control system.
We are also putting together a build system so that our systems all build into one app directory with its own bin, lib and include dirs.
The individual package build instructions live on our intranet (which happens to be a Kwiki) and the build results get reported back to another page on the Kwiki. Basically the plan is to do nightly smokes that we can see relatively easily...
The build instructions themselves need to be multiplatform, as we build to three different OS's. In order to handle that we've come up with a multi-platform build syntax for Kwiki that can be commented on^W^Wranted about alongside the build instructions themselves.
As it turns out we need to manage well over 2 million lines of code, which has surprised all of us here....
It's all very useful and interesting, but bootstrapping it is taking a _long_ time. Once we are done, however, we'll have a Kwiki-based automated build system that allows our development team to be responsible for software quality, rather than the sysadmins. Of course this will all be open-sourced in several weeks...
Another option ...
rob_au on 2003-08-22T04:13:09
Another option which may be worth investigating depending upon the network environment would be the use of PAR archives.
The latest version of PAR (0.74) is fairly stable and incorporates the facility to specify PAR archives by remote URL. For example, based on the one in Autrijus Tang's TPF Grant 2003 Report:
BEGIN {
    use PAR;
    use lib 'http://localhost/DBI-1.37-i386-freebsd-5.8.0.par';
}
use DBI;
print DBI->VERSION;
(Naturally, the filename of the PAR archive could be changed to something more generic for access by remote network hosts).
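For reference, a .par archive is essentially a zip of a module's built blib/ tree, so (sketching from the PAR documentation, with a DRYRUN guard since this assumes a freshly built DBI source tree) the archive named above could be produced like this:

```shell
# From inside an already-built DBI source directory
# (perl Makefile.PL && make && make test), zip blib/ into a PAR archive.
# The name follows the dist-version-arch-perlversion.par convention
# that the "use lib" URL above expects.
DRYRUN=echo
$DRYRUN zip -r DBI-1.37-i386-freebsd-5.8.0.par blib
```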
In addition to this, work which Autrijus performed under the scope of this TPF grant added the ability to sign, verify, install and uninstall PAR components, which would be important for ensuring the validity and integrity of archives retrieved from remote locations. The report of this work has been linked from use.perl.org previously.