From: Magnus Hagander <magnus(at)hagander(dot)net>
To: Andrew Dunstan <andrew(at)dunslane(dot)net>
Cc: Bruce Momjian <bruce(at)momjian(dot)us>, Peter Eisentraut <peter_e(at)gmx(dot)net>, Dave Page <dpage(at)pgadmin(dot)org>, PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: marking old branches as no longer maintained
Date: 2011-07-12 14:20:59
Message-ID: CABUevExA=-xY0cTcH7tQBbtDKKdZQHs=On3hOqNf771EK6SgRw@mail.gmail.com
Lists: pgsql-hackers

On Tue, Jul 12, 2011 at 01:10, Andrew Dunstan <andrew(at)dunslane(dot)net> wrote:
>
>
> On 07/11/2011 07:59 PM, Bruce Momjian wrote:
>>
>> Andrew Dunstan wrote:
>>>
>>> On 06/28/2011 05:31 PM, Peter Eisentraut wrote:
>>>>
>>>> On tis, 2011-06-28 at 17:05 -0400, Andrew Dunstan wrote:
>>>>>>
>>>>>> Couldn't you just put a text file on the build farm server with
>>>>>> recommended branches?
>>>>>
>>>>> As I told Magnus, that gets ugly because of limitations in MinGW's SDK
>>>>> perl. I suppose I could just not implement the feature for MinGW, but
>>>>> I've tried damn hard not to make those sorts of compromises and I'm not
>>>>> keen to start.
>>>>
>>>> The buildfarm code can upload the build result via HTTP; why can't it
>>>> download a file via HTTP?
>>>
>>> It has to use a separate script to do that. I don't really want to add
>>> another one just for this.
>>>
>>> (thinks a bit) I suppose I can make it do:
>>>
>>> my $url = "http://buildfarm.postgresql.org/branches_of_interest.txt";
>>> my $branches_of_interest = `perl -MLWP::Simple -e "getprint(q{$url})"`;
>>>
>>> Maybe that's the best option. It's certainly going to be less code than
>>> anything else :-)
>>
>> Could you pull the list of active branches from our web site HTML?
>>
>
> I can, but I'm not that keen on having to do web scraping. Currently my test
> machine (crake) is using the above scheme and it's working fine. It's not a
> huge burden to maintain, after all.
You don't actually need to resort to web scraping - it's available as
well-formatted XML (http://www.postgresql.org/versions.rss).

That said, I agree that it's not a huge burden, and that doing it your
current way is probably the better idea.
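
For what it's worth, a rough sketch of pulling the version list out of
that feed with LWP::Simple (untested, and it assumes each released
version shows up as an item <title> in the feed):

use strict;
use warnings;
use LWP::Simple qw(get);

# Assumed feed URL; each item title is expected to carry a version number.
my $url = 'http://www.postgresql.org/versions.rss';
my $feed = get($url) or die "could not fetch $url\n";

# Crude regex extraction of the version numbers from the item titles;
# a proper client would use an RSS/XML parser instead.
my @versions = $feed =~ m{<title>\s*([0-9][0-9.]*)\s*</title>}g;

print "$_\n" for @versions;

The regex is obviously fragile; if the buildfarm client already pulls
in an XML module, parsing the feed properly would be the safer route.
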
--
Magnus Hagander
Me: http://www.hagander.net/
Work: http://www.redpill-linpro.com/