This is slightly off-topic, but the PostgreSQL cognoscenti are likely to be
the best audience for the question.
I am writing an application -- an application-specific (scaled-down) Web
server in Node.js. A question I keep asking myself is whether it is better
to simply log incoming requests, or to write them to a database. Ultimately,
I would like to analyze the data, so moving it into a database makes sense.
However, I can also see the point of logging as a simpler, less CPU- and
I/O-intensive activity. When life goes wrong, capturing data in a log file
may be the easier path. Plus, resources are freed to handle requests, which
is the fundamental goal of a Web server anyway.
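To make that concrete, the logging variant I have in mind is roughly the
following (just a sketch using Node.js core modules; the file name and the
fields recorded are placeholders):

    const http = require('http');
    const fs = require('fs');

    // Append-only stream; one JSON line per request keeps later parsing trivial.
    const log = fs.createWriteStream('requests.log', { flags: 'a' });

    http.createServer((req, res) => {
      log.write(JSON.stringify({
        ts: new Date().toISOString(),
        method: req.method,
        url: req.url,
        ua: req.headers['user-agent']
      }) + '\n');
      res.end('ok');
    }).listen(8080);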
Yet if the database resides on another machine, that weakens some of the
CPU and I/O load argument.
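For comparison, the direct-to-database variant would look something like
this (again only a sketch, using the node-postgres "pg" module; the table
and columns are hypothetical):

    const http = require('http');
    const { Pool } = require('pg');        // npm install pg

    // Connection settings come from the usual PG* environment variables.
    const pool = new Pool();

    http.createServer((req, res) => {
      // Fire-and-forget insert so the response isn't held up by the database.
      pool.query(
        'INSERT INTO request_log (ts, method, url) VALUES (now(), $1, $2)',
        [req.method, req.url]
      ).catch(console.error);
      res.end('ok');
    }).listen(8080);

Each request then costs at least a pooled connection and a network round
trip, which is exactly the trade-off I am weighing.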
I can't believe that the frequency at which data is acquired is the
determining factor, but I may be wrong.
I am also aware that various tools are available for parsing log file
data into databases, and writing such a tool is not altogether complicated.
Nevertheless, that seems like a redundant exercise.
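For what it's worth, if the requests are logged as JSON lines as above,
replaying the log into PostgreSQL later is only a few lines anyway (again
a sketch; the table is hypothetical):

    const fs = require('fs');
    const readline = require('readline');
    const { Client } = require('pg');

    async function load(file) {
      const client = new Client();          // PG* environment variables again
      await client.connect();
      const rl = readline.createInterface({ input: fs.createReadStream(file) });
      for await (const line of rl) {
        const r = JSON.parse(line);
        await client.query(
          'INSERT INTO request_log (ts, method, url) VALUES ($1, $2, $3)',
          [r.ts, r.method, r.url]
        );
      }
      await client.end();
    }

    load('requests.log').catch(console.error);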
So, I come back full circle. Even PostgreSQL itself has its log files.
Not everything is written to database tables proper. Yet at what point
does data take on a new status such that it should be collected in a
database rather than simply written to logs?
Thanks for all candor shared.