Backup/restore of pg_statistics

From: Joel Jacobson <joel(at)jacobson(dot)be>
To: pgsql-performance(at)postgresql(dot)org
Subject: Backup/restore of pg_statistics
Date: 2003-11-17 14:51:20
Message-ID: 1069080680.3fb8e068eb941@mail.jacobson.be
Lists: pgsql-performance

Hi,

I understand that it is not possible to occasionally re-plan the queries in a
PL/pgSQL function without dropping and re-creating the function.

I think it would be useful if the queries in a PL/pgSQL function could be
re-planned on-the-fly.

When a lot of data has been added or modified and it is time to run ANALYZE, it
would also be useful to re-plan the queries used in PL/pgSQL functions.
As I understand it, this is not possible; the only options are to DROP/CREATE
the functions or to use EXECUTE.
EXECUTE is not a real option for me, because re-planning the queries on every
call to the function is unnecessary and just a waste of performance.
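
For reference, this is roughly what the EXECUTE variant would look like (a
minimal sketch -- the orders table and count_orders function are made up for
illustration). Because EXECUTE takes a query string, the query is parsed and
planned on every call:

```sql
CREATE FUNCTION count_orders(integer) RETURNS integer AS '
DECLARE
    r RECORD;
    n integer := 0;
BEGIN
    -- EXECUTE bypasses the cached plan: the string is parsed and
    -- planned each time, at a per-call cost.
    FOR r IN EXECUTE ''SELECT count(*) AS c FROM orders
                       WHERE customer_id = '' || $1
    LOOP
        n := r.c;
    END LOOP;
    RETURN n;
END;
' LANGUAGE 'plpgsql';
```

This gives up-to-date plans, but pays the planning overhead on every single
call, which is what I want to avoid.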

As a work-around, I am forced to:
1. populate the database with a lot of test data,
2. run ANALYZE,
3. and finally, create the PL/pgSQL functions.
The prepared queries in the functions will then be sufficiently optimized.
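
Concretely, the workaround looks something like this (table, file and function
names are hypothetical):

```sql
-- 1. Populate with representative test data.
COPY orders FROM '/tmp/testdata.copy';

-- 2. Collect statistics so the planner sees the real data distribution.
ANALYZE orders;

-- 3. Only now create the function; the plans it prepares (on first
--    call in a session) are based on the fresh statistics.
CREATE FUNCTION count_orders(integer) RETURNS integer AS '
DECLARE
    n integer;
BEGIN
    SELECT INTO n count(*) FROM orders WHERE customer_id = $1;
    RETURN n;
END;
' LANGUAGE 'plpgsql';
```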

I don't think this is a nice solution.

I also thought of a slightly better solution, but I don't know if it is
possible.
My idea was to populate the database once and then save the statistics that
ANALYZE generates in pg_statistic to a file. Every time the database needs to
be created, the statistics could then be restored, thus making the planner
produce "future-optimized" plans when the PL/pgSQL functions are created,
even though the database is still empty.
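
The rough idea, sketched with COPY (untested; this requires superuser rights,
and pg_statistic rows reference tables and columns by OID, so a restore would
only line up if the target schema is recreated with matching OIDs -- one
reason this may not work as-is):

```sql
-- In the populated "template" database, after ANALYZE:
COPY pg_statistic TO '/tmp/pg_statistic.dump';

-- In the freshly created, empty database (as superuser):
DELETE FROM pg_statistic;
COPY pg_statistic FROM '/tmp/pg_statistic.dump';

-- The planner also consults the row/page estimates in pg_class,
-- so those would need to be carried over too, e.g.:
UPDATE pg_class SET reltuples = 1000000, relpages = 8000
    WHERE relname = 'orders';
```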

I would greatly appreciate any help/comments.

Thank you.

Joel Jacobson <joel(at)jacobson(dot)be>

