On Thursday, April 10, 2003, at 01:20 PM, Boris Popov wrote:
> We're evaluating postgresql as a backend choice for our next
> generation software and would like to perform some rough measurements
> in-house. Where could I get my hands on some reference data, say, a few
> very large tables with a total size of over 1G that we could run
> queries against? I noticed earlier discussion about Tiger data, but 30G
> is a bit too much for what we need. Any other ideas or suggestions?
Actually, Tiger is broken down into easily digestible chunks; you don't
grab all 30G at once. Pick one moderate-size state to work with and
you've got about the right size data set.
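If you'd rather not download any external data set at all, another option (not mentioned in the thread, and purely a sketch) is to generate a synthetic CSV of whatever size you need and bulk-load it with PostgreSQL's COPY command. The table name `bench` and the three-column layout below are arbitrary choices for illustration:

```python
import csv
import os
import random
import string

def write_synthetic_csv(path, n_rows, seed=0):
    """Write n_rows of synthetic (id, name, value) rows to a CSV file
    suitable for bulk loading into PostgreSQL with COPY.
    Returns the resulting file size in bytes."""
    rng = random.Random(seed)  # fixed seed so runs are reproducible
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for i in range(n_rows):
            # 32 random lowercase letters as a filler text column
            name = "".join(rng.choices(string.ascii_lowercase, k=32))
            writer.writerow([i, name, rng.uniform(0, 1e6)])
    return os.path.getsize(path)
```

Each row is roughly 50 bytes, so on the order of 20 million rows gets you past the 1G mark. In recent PostgreSQL versions you could then load it with something like `COPY bench FROM '/tmp/bench.csv' WITH (FORMAT csv);` (syntax varies by version). Synthetic data won't have the realistic value distributions of Tiger, which matters for some planner benchmarks, so treat this as a rough-sizing tool only.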