* Guillaume Smet (guillaume(dot)smet(at)gmail(dot)com) wrote:
> These servers are available 24/7 to PostgreSQL QA and won't be used
> for other purposes.
> Concerning the second point, I wonder if it's not worth it to have a
> very simple thing already reporting results as the development cycle
> for 8.4 has already started (perhaps several pgbench unit tests
> testing various types of queries with a daily tree). Thoughts?
It didn't occur to me before, but if you've got a decent amount of disk
space and server time...
I'm almost done scripting up everything to load the TIGER/Line
Shapefiles from the US Census into PostgreSQL/PostGIS. Once it's done
and working I would be happy to provide it to whoever asks, and it
might be an interesting data set to load/query and benchmark with.
There's a lot of GiST index creation, as well as other indexes like
soundex(), and I'm planning to use partitioning of some sort for the
geocoder. We could, for example, come up with some set of arbitrary
addresses to geocode and see what the performance of that is.
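Just to give a flavor of the kind of indexing involved, something along
these lines (table and column names here are illustrative, not the
actual loader's schema; soundex() comes from the fuzzystrmatch contrib
module):

```sql
-- Hypothetical table holding a TIGER/Line street-edge layer
CREATE TABLE tiger_edges (
    gid      serial PRIMARY KEY,
    fullname text,          -- street name, e.g. 'Main St'
    the_geom geometry       -- line geometry from the shapefile
);

-- Spatial GiST index for proximity/bounding-box queries
CREATE INDEX tiger_edges_geom_idx
    ON tiger_edges USING GIST (the_geom);

-- Functional index on soundex() for fuzzy street-name matching
CREATE INDEX tiger_edges_soundex_idx
    ON tiger_edges (soundex(fullname));

-- A geocoding-style lookup can then hit the soundex index:
SELECT fullname, the_geom
  FROM tiger_edges
 WHERE soundex(fullname) = soundex('Main Street');
```

Building a pile of indexes like these over ~50G of loaded data is
exactly the sort of workload that would exercise sorting, index build
times, and I/O in a benchmark.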
It's just a thought, and it's a large/"real" data set to play with.
The data set is 22G of compressed shapefiles/dbf files. Based on my
initial numbers I think it'll grow to around 50G loaded into PostgreSQL
(I'll have better numbers later today). You can get the files from
here: http://ftp2.census.gov/geo/tiger/TIGER2007FE/ Or, if you run into
a problem with that, I can provide a pretty fast site to pull them from
as well (15Mb/s).