From: Gilles Darold <gilles(at)darold(dot)net>
To: David Fetter <david(at)fetter(dot)org>, PostgreSQL Development <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: [PATCH] Log CSV by default
Date: 2018-12-03 18:18:31
Message-ID: d574f216-eb8a-cd45-96c3-fee0bc1754a1@darold.net
Lists: pgsql-hackers
On 30/11/2018 at 19:53, David Fetter wrote:
> This makes it much simpler for computers to use the logs while not
> making it excessively difficult for humans to use them.
I'm also not very comfortable with this change. From pgBadger's point of
view, I don't know of an existing library that can parse a CSV log in
multi-process mode. The only approaches I know are to split the file or
to load chunks into memory, neither of which is really recommended with
huge log files. I am not saying it is impossible to build a Perl CSV
library that supports multi-processing, but that would require some
additional days of work. Also, I have not received any feature request
for this so far; maybe the csvlog format is not that commonly used.
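To make the splitting problem concrete: CSV quoting allows a field (for example a multi-line query or error message in csvlog) to contain embedded newlines, so one logical record can span several physical lines, and a file cannot safely be cut at arbitrary line boundaries for parallel workers. A minimal sketch in Python, using a hypothetical two-record log fragment (not the exact csvlog column layout):

```python
import csv
import io

# Hypothetical csvlog-like fragment: the second record's message field
# contains an embedded newline, so that record spans two physical lines.
log = (
    '2018-12-03 18:18:31 UTC,"postgres","db1",1234,LOG,"simple message"\n'
    '2018-12-03 18:18:32 UTC,"postgres","db1",1235,ERROR,"multi-line\n'
    'message"\n'
)

# Naive line-based chunking sees 3 "lines" for only 2 logical records,
# so a worker starting at line 3 would begin mid-record.
print(len(log.splitlines()))

# A CSV-aware reader reassembles the quoted newline correctly.
records = list(csv.reader(io.StringIO(log)))
print(len(records))
print(repr(records[1][-1]))
```

This is why a multi-process parser cannot simply hand each worker a byte range: chunk boundaries must be adjusted to fall between logical CSV records, which requires scanning quote state rather than just looking for newlines.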
Regards,
--
Gilles Darold
http://www.darold.net/
| | From | Date | Subject |
|---|---|---|---|
| Next Message | David Fetter | 2018-12-03 18:25:26 | Re: [PATCH] Log CSV by default |
| Previous Message | Pavel Stehule | 2018-12-03 18:01:28 | Re: [proposal] Add an option for returning SQLSTATE in psql error message |