I am trying to find a way to get multithreading into my PostgreSQL stored functions.
The thing is that I have data stored in multiple tables, one table per day, and I want to write
a function that selects data from these tables and writes it to files (or just returns the data in a cursor).
Information about which data is stored in which table is kept in another table, defined like this:
partitions_daily(from_date timestamp, to_date timestamp, table_name)
from_date and to_date tell in which table the requested data can be found.
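For clarity, the metadata table and the lookup I mean look roughly like this (a sketch; the text type for table_name and the sample date are my assumptions):

```sql
-- Sketch of the metadata table (table_name type assumed to be text)
CREATE TABLE partitions_daily (
    from_date  timestamp,
    to_date    timestamp,
    table_name text
);

-- Find which daily table holds the data for a given moment
SELECT table_name
  FROM partitions_daily
 WHERE '2008-03-15 12:00'::timestamp >= from_date
   AND '2008-03-15 12:00'::timestamp <  to_date;
```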
At the moment I have a function through which I can get the data; the problem is that everything is processed
on only one CPU core. I think it should be very easy to make this model multithreaded and use more CPU cores.
I tried to implement this with PL/PerlU, but there is a problem with using SPI from multiple threads. It would also be
possible to do the parallelism with some external scripts that fetch the data, but that is not exactly what I want.
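To show what I mean by the external-script route: roughly this fan-out pattern, one worker process per daily table, where each worker would open its own database connection so each query runs on a separate backend. This is only a sketch; fetch_partition is a stub standing in for a real per-connection query (e.g. via psycopg2), and the table names are made up.

```python
# Sketch: one worker process per daily partition table.
# Each process would open its own PostgreSQL connection, so each
# query runs on its own backend and can use its own CPU core.
from multiprocessing import Pool

def fetch_partition(table_name):
    # Placeholder for: connect, run SELECT ... FROM <table_name>,
    # write the rows to a file. Stubbed here so the pattern is
    # self-contained and runnable without a database.
    return f"processed {table_name}"

def parallel_export(tables, workers=4):
    # Fan the per-table work out over a pool of worker processes.
    with Pool(workers) as pool:
        return pool.map(fetch_partition, tables)

if __name__ == "__main__":
    tables = ["data_2008_03_28", "data_2008_03_29", "data_2008_03_30"]
    print(parallel_export(tables))
```

I would prefer to keep this inside the database rather than coordinate worker scripts from outside like this.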
Can someone please help me with this? How can I get PostgreSQL to use multiple CPU cores to run my function?
If somebody can give me advice or post a simple piece of code, that would be great.
Thank you for all replies,
pgsql-general by date