PG Build Farm Status

From: Andrew Dunstan <andrew(at)dunslane(dot)net>
To: PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org>
Subject: PG Build Farm Status
Date: 2004-09-24 21:05:54
Message-ID: 41548C32.9000809@dunslane.net


Shown below is a HOWTO for clients of the PostgreSQL build farm system
I'm working on. The HOWTO is also available at
http://pgfoundry.org/docman/view.php/1000040/4/PGBuildFarm-HOWTO.txt

The code is running successfully on several machines, and uploading
results to my test server.

A production server for collecting the results from the distributed
clients is going to be provided by CommandPrompt (thanks, Joshua!), and
should be available in a couple of weeks. Meanwhile, as shown below,
people who want to try before they buy can test out the client side
without any uploading.

Comments are welcome, either to me directly or preferably on the
trackers and forums at http://pgfoundry.org/projects/pgbuildfarm/

cheers

andrew

=====================================================================

This HOWTO is for PostgreSQL Build Farm clients.

0. PostgreSQL Build Farm is a distributed build system designed to detect
build failures on a large collection of platforms and configurations.
This software is written in Perl. If you're not comfortable with Perl
you probably don't want to run this, although the only adjustment
you should ever need is to the config file (which is also Perl).

1. Get the software from:
http://pgfoundry.org/download.php/66/build-farm-1_0.tgz
Unpack it and put it somewhere. You can put the config file in a different
place from the run_build.pl script if you want to (see later), but the
simplest thing is to put it in the same place.

2. Create a directory where builds will run. This should be dedicated to
the use of the build farm. Make sure there's plenty of space - on my
machine each branch can use up to about 700MB during a build.

3. Edit the config file and put the location of the directory you just
created in the config variable "build_root". Adjust the config variables
"make", "config_opts" and (if you don't use ccache) "config_env"
to suit your environment, and to choose which optional postgres modules
you want to build. You should not need to adjust any other variables.
Check that you didn't screw things up by running "perl -cw build-farm.conf".
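For illustration only, those settings might end up looking something like the
fragment below. The values are made-up examples, and the exact layout (for
instance, whether config_opts is a list or a single string) should be copied
from the sample config in the tarball rather than from here:
build_root  => '/home/buildfarm/buildroot',  # the directory created in step 2
make        => 'gmake',                      # whatever GNU make is called on your machine
config_opts => [ qw(--enable-cassert --enable-debug --with-perl) ],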

4. If the path to your perl installation isn't "/usr/bin/perl", edit
the #! line in run_build.pl so it is correct. This is the ONLY line in that
file you should ever need to edit.

5. run "perl -cw run_build.pl". If you get errors about missing perl modules
you will need to install them. Specifically, you will need these modules:
LWP
HTTP::Request::Common
MIME::Base64
Digest::SHA1
Fcntl
Getopt::Long
File::Find
You probably already have many of these; they are all standard CPAN modules.
When you no longer get any errors, you are ready to start testing.
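If you'd like a quick report of which of those modules are missing before you
run the syntax check, a throwaway script along these lines will do (this is
not part of the build farm distribution, just a convenience):
#!/usr/bin/perl
# throwaway helper, not part of the build farm: report which of the
# required modules fail to load on this machine
use strict;
for my $mod (qw(LWP HTTP::Request::Common MIME::Base64
                Digest::SHA1 Fcntl Getopt::Long File::Find)) {
    (my $file = "$mod.pm") =~ s{::}{/}g;
    if (eval { require $file; 1 }) {
        print "ok      $mod\n";
    }
    else {
        print "MISSING $mod\n";
    }
}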

6. With a PATH that matches what you will have when running from cron, run
the script in no-send, no-status, verbose mode. Something like this:
PATH=/usr/bin:/bin ./run_build.pl --nosend --nostatus --verbose
and watch the fun begin. If this results in failures because it can't
find some executables (especially gmake and cvs), you might need to change
the config file again, this time adding a setting to "build_env", something
like:
PATH => "/usr/local/bin:$ENV{PATH}",
Also, if you put the config file somewhere else, you will need to use
the --config=/path/to/build-farm.conf option.
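For clarity, the PATH line above is an entry inside the "build_env" setting
in the config file. A hypothetical fragment (the surrounding layout is an
assumption; follow whatever the sample config actually uses) would be:
build_env => {
    PATH => "/usr/local/bin:$ENV{PATH}",  # so the script can find gmake and cvs
},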

7. When you have that running, it's time to try with cron. Put a line in your
crontab that looks something like this:
43 * * * * cd /location/of/run_build.pl/ && ./run_build.pl --nosend --verbose
Again, add the --config option if needed. Notice that this time we didn't
specify --nostatus. That means that (after the first run) the script won't
do any build work unless the CVS repo has changed. Check that your cron
job runs (it should email you the results, unless you tell it to send them
elsewhere).

8. By default run_build.pl builds the HEAD branch from CVS. If you want to
build other branches, you can do so by specifying the branch name on the command line,
e.g.
run_build.pl REL7_4_STABLE
So, once you have HEAD working, remove the --verbose flag from your crontab,
and add extra cron lines for each branch you want to build regularly.
My crontab (well, one of them) looks something like this:
6 * * * * cd /home/andrew/buildfarm && ./run_build.pl --nosend
30 4 * * * cd /home/andrew/buildfarm && ./run_build.pl --nosend REL7_4_STABLE

9. Once this is all running happily, you can register to upload your
results to the central server. After that you will edit 3 lines in your
config file, remove the --nosend flags, and you are done. We'll cover
registration in detail when the central server is set up.

10. Resource use. Using the 'update' cvs method (see the config file) results
in significantly lower bandwidth use on both your server and the main
postgresql cvs server than the 'export' method. The price is that
cvs update is occasionally less reliable, and disk usage is slightly
higher (about 70MB more for the HEAD branch). Eventually I'd like to
migrate the load entirely off the postgresql cvs server by implementing an
'rsync' method. But that's for another day. When you use the 'update'
method, run_build.pl works on a temporary copy of the repo,
never inside the repo (hence the extra disk usage). Use of ccache is
highly recommended, especially on slow machines - on my two machines
runs that took an hour without ccache take about 15 minutes with
ccache, and with significantly lower processor load. Finally, bandwidth
use from uploading results should be very light, except when there are
failures, in which case the transaction can be quite large. But it's only
one transaction, and we don't expect lots of failures, do we? ;-)
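If you're wondering how ccache gets wired in, one simple approach (purely an
illustration; adapt it to however the sample config expresses the compile
environment) is to point the "config_env" setting from step 3 at a
ccache-wrapped compiler:
config_env => {
    CC => 'ccache gcc',  # repeat builds of unchanged files then come from the cache
},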

11. Please file bug reports on the tracker at:
http://pgfoundry.org/tracker/?atid=238&group_id=1000040&func=browse
