Re: Change the csv log to 'key:value' to facilitate the user to understanding and processing of logs

From: Julien Rouhaud <rjuju123(at)gmail(dot)com>
To: lupeng <lpmstsc(at)foxmail(dot)com>
Cc: pgsql-hackers <pgsql-hackers(at)lists(dot)postgresql(dot)org>
Subject: Re: Change the csv log to 'key:value' to facilitate the user to understanding and processing of logs
Date: 2022-03-15 12:38:34
Message-ID: 20220315123834.o2m4ght772fkg7cy@jrouhaud
Lists: pgsql-hackers

Hi,

On Tue, Mar 15, 2022 at 09:31:19AM +0800, lupeng wrote:
>
> When I audited the PostgreSQL database recently, I found that after
> configuring the log type as csv, the output log content is as follows:
>
>   "database ""lp_db1"" does not exist",,,,,"DROP DATABASE lp_db1;",,"dropdb,
>   dbcommands.c:841","","client backend",,0
>
> It is very inconvenient to understand the real meaning of each field.
> Also, in the log content, " is escaped as "", which is not friendly to
> regular expression matching. Therefore, I want to modify the csv log
> function: change its format to key:value, set non-existing fields to NULL,
> and escape " as \" in the log content. After the modification, the above
> log looks as follows:
>
>   Log_time:"2022-03-15 09:17:55.289
>   CST",User_name:"postgres",Database_name:"lp_db",Process_id:"17995", [...]

This would make the logs a lot more verbose, and much harder to process with
tools intended for csv files.
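As an aside, here is a minimal Python sketch of what I mean by "tools intended
for csv files" (the sample line is the one from your mail, everything else is
purely illustrative): any standard CSV parser already undoes the "" doubling,
so no regular expressions are needed to recover the embedded quotes.

    import csv
    import io

    # One csvlog line as shown upthread (shortened).
    raw = ('"database ""lp_db1"" does not exist",,,,,'
           '"DROP DATABASE lp_db1;",,"dropdb, dbcommands.c:841",'
           '"","client backend",,0')

    # csv.reader un-escapes the doubled quotes automatically, so each
    # field comes back as a plain string.
    fields = next(csv.reader(io.StringIO(raw)))
    print(fields[0])   # -> database "lp_db1" does not exist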

You should consider using the newly introduced jsonlog format (as soon as pg15
is released), which seems closer to what you want.
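For example (a sketch, assuming a PostgreSQL 15 server running the logging
collector, i.e. logging_collector = on and log_destination = 'jsonlog'):
jsonlog emits one JSON object per line, so any JSON library can consume it.
The key names used below (timestamp, error_severity, message) follow the PG15
documentation, and the file name is just illustrative; check both against your
server's actual output.

    import json

    # Each line in a jsonlog file is a self-contained JSON object.
    with open("postgresql-Tue.json") as f:   # illustrative file name
        for line in f:
            entry = json.loads(line)
            # Key names follow the PG15 docs; double-check against
            # your server's actual output.
            if entry.get("error_severity") == "ERROR":
                print(entry.get("timestamp"), entry.get("message"))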
