We, the log4j developers, are working on an appender
that will write log4j events to various RDBMS. Various
fields of an event object will go to three different
tables, namely the logging_event, logging_event_property
and logging_event_exception tables. Each entry in the
logging_event_property and logging_event_exception
tables uses a reference to event_id, a database-generated
primary key in the logging_event table.
For entries in the logging_event_property and
logging_event_exception tables to be meaningful, we
absolutely need the generated event_id each time we
insert a new logging_event row. We are currently able to
do this by following each logging_event insertion with a
query asking for the event_id of the last inserted
event. This works for multiple database systems.
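The per-row approach described above can be sketched roughly as follows in JDBC. This is an illustrative sketch, not our actual appender code: the message column and the sequence name logging_event_event_id_seq are assumptions, the latter following PostgreSQL's default naming for serial columns.

```java
import java.sql.*;

public class SingleInsertSketch {
    // Assumed sequence name (PostgreSQL default for a serial column
    // event_id on table logging_event).
    static final String CURRVAL_SQL =
        "SELECT currval('logging_event_event_id_seq')";

    // Insert one logging_event row, then ask the database for the
    // event_id it just generated. currval() is session-local, so the
    // query must run on the same connection as the insert.
    static long insertEvent(Connection conn, String message)
            throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO logging_event (message) VALUES (?)")) {
            ps.setString(1, message);
            ps.executeUpdate();
        }
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery(CURRVAL_SQL)) {
            rs.next();
            return rs.getLong(1); // event_id for the child-table inserts
        }
    }
}
```

The returned event_id is then used for the matching logging_event_property and logging_event_exception inserts, which is exactly the extra round trip per event that batching is meant to avoid.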
However, we have discovered that batching multiple
insertions gives a very significant boost in
performance. Thus, we would like to insert, say, 50
logging_event rows, then multiple logging_event_property
rows, and then multiple logging_event_exception rows. We
were hoping to use the JDBC getGeneratedKeys method to
obtain the event_ids, but this function is not
implemented in Postgresql.
Looking at the archives, it seems that this requires
support from the database back end which does not
currently exist. Is there another way to insert into
multiple tables in the manner described above without
using the JDBC getGeneratedKeys method?
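One possible workaround, sketched below under stated assumptions (the sequence name logging_event_event_id_seq and the column names are illustrative, not taken from our schema): pre-allocate a block of keys by calling nextval() once per row, then name event_id explicitly in the batched inserts, so that getGeneratedKeys is never needed.

```java
import java.sql.*;

public class BatchInsertSketch {
    // Assumed sequence backing logging_event.event_id.
    static final String NEXTVAL_SQL =
        "SELECT nextval('logging_event_event_id_seq')";

    // Reserve n keys up front. nextval() never hands out the same
    // value twice, so these ids are safe even with concurrent writers.
    static long[] allocateKeys(Connection conn, int n)
            throws SQLException {
        long[] ids = new long[n];
        try (Statement st = conn.createStatement()) {
            for (int i = 0; i < n; i++) {
                try (ResultSet rs = st.executeQuery(NEXTVAL_SQL)) {
                    rs.next();
                    ids[i] = rs.getLong(1);
                }
            }
        }
        return ids;
    }

    // Batch-insert events with explicit event_id values; the child
    // tables can then be batched the same way, referencing ids[i].
    static long[] insertBatch(Connection conn, String[] messages)
            throws SQLException {
        long[] ids = allocateKeys(conn, messages.length);
        try (PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO logging_event (event_id, message) "
                + "VALUES (?, ?)")) {
            for (int i = 0; i < messages.length; i++) {
                ps.setLong(1, ids[i]);
                ps.setString(2, messages[i]);
                ps.addBatch();
            }
            ps.executeBatch();
        }
        return ids;
    }
}
```

The n nextval() round trips are cheap compared with n separate inserts, and the three insert batches themselves stay fully batched.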
Your views on the matter would be highly appreciated.
For log4j documentation consider "The complete log4j manual".