Re: large object regression tests

From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: Jeremy Drake <jeremyd(at)jdrake(dot)com>
Cc: PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: large object regression tests
Date: 2006-09-08 03:19:07
Message-ID: 2190.1157685547@sss.pgh.pa.us
Lists: pgsql-hackers

Jeremy Drake <jeremyd(at)jdrake(dot)com> writes:
> I noticed when I was working on a patch quite a while back that there are
> no regression tests for large object support.

Yeah, this is bad :-(

> I am considering writing some, and I think that in order to get a real
> test of the large objects, I would need to load enough data into a large
> object to span more than one block (large object blocks were 1 or 2K
> IIRC) so that the block-boundary case could be tested. Is there any
> precedent on where to grab such a large chunk of data from?

There's always plain old junk data, e.g., repeat('xyzzy', 100000).
I doubt that Moby Dick would expose any unexpected bugs ...
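
A minimal sketch of one way to load such junk into a large object
(assuming the server-side lo_from_bytea function, a later addition,
with convert_to used only to turn the text into bytea):

    -- ~500 kB of junk, far more than one 2 kB large-object block,
    -- so reads and writes have to cross block boundaries
    SELECT lo_from_bytea(0, convert_to(repeat('xyzzy', 100000), 'UTF8'));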

> ... I find that it is necessary to stash certain values across
> statements (large object ids, large object 'handles'), and so far I am
> using a temporary table to store these. Is this reasonable, or is there a
> cleaner way to do that?

I think it's supposed to be possible to use psql variables for that;
if you can manage to test psql variables as well as large objects,
that'd be a double bonus.
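
A minimal sketch of the psql-variable approach (assuming a psql new
enough to have \gset; an older client would need \set or the temp-table
trick instead):

    -- create a large object and stash its OID in the psql variable :loid
    SELECT lo_creat(-1) AS loid \gset
    -- the variable survives across statements, e.g. for later cleanup
    SELECT lo_unlink(:loid);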

regards, tom lane
