Re: [PATCH] Log CSV by default

From: Michael Paquier <michael(at)paquier(dot)xyz>
To: Gilles Darold <gilles(at)darold(dot)net>
Cc: David Fetter <david(at)fetter(dot)org>, PostgreSQL Development <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: [PATCH] Log CSV by default
Date: 2018-12-04 02:37:59
Message-ID: 20181204023759.GO3423@paquier.xyz
Lists: pgsql-hackers

On Mon, Dec 03, 2018 at 09:20:01PM +0100, Gilles Darold wrote:
> pgBadger is able to parse jsonlog
> (https://github.com/michaelpq/pg_plugins/tree/master/jsonlog) output in
> multi-process mode because it does not use an external library to parse
> the JSON. In this minimalist implementation I assume that each log line
> is a full JSON object. Honestly, I do not have enough feedback on this
> format to be sure that my basic jsonlog parser is sufficient; maybe it
> is not. What is really annoying for a log parser is the preservation of
> newline characters in queries; if the format outputs each atomic log
> message on its own line, that is ideal.

jsonlog generates one JSON object per elog call, one object per line.
Newlines are treated as they would be for any JSON object: they are
replaced with \n within the query strings, the same way the backend
handles JSON strings as text.
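
For illustration, a parser relying on that one-object-per-line property
can stay very simple.  A minimal sketch follows (Python, hypothetical,
not pgBadger's actual code; the "message" key is an assumption about the
output fields, not a verified name):

import json
import sys

def parse_jsonlog(path):
    """Read a jsonlog-style file where each line is one full JSON object."""
    entries = []
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            line = line.strip()
            if not line:
                continue
            try:
                # Newlines inside queries arrive escaped as \n, so a plain
                # line-by-line read never splits an object across lines.
                entries.append(json.loads(line))
            except json.JSONDecodeError:
                # A line that is not a complete JSON object breaks the
                # assumption; report it instead of guessing.
                print(f"skipping malformed line {lineno}", file=sys.stderr)
    return entries

if __name__ == "__main__":
    for entry in parse_jsonlog(sys.argv[1]):
        # "message" is an assumed field name, used only for illustration.
        print(entry.get("message", ""))

Because each line is independent, splitting the file by line ranges and
handing chunks to separate workers is straightforward, which is what
makes the multi-process parsing mentioned above possible.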
--
Michael
