From: PG Bug reporting form <noreply(at)postgresql(dot)org>
To: pgsql-bugs(at)lists(dot)postgresql(dot)org
Cc: dsl0530(at)hotmail(dot)com
Subject: BUG #15730: Using filter function to interpolate data, the length limit of table field is invalid
Date: 2019-04-03 03:58:15
Message-ID: 15730-89c190533b542f2e@postgresql.org
Lists: pgsql-bugs
The following bug has been logged on the website:
Bug reference: 15730
Logged by: ShuLin Du
Email address: dsl0530(at)hotmail(dot)com
PostgreSQL version: 9.6.6
Operating system: Red Hat Enterprise Linux 7.3
Description:
I use pg_bulkload (3.1.14) to load file data into a table. The input data is processed by a filter function (defined in the CTL file) before being inserted into the table. I found that when the input data passes through a filter function on its way into the table, the maximum length limit declared on the table's fields is not enforced.
The filter function is as follows (input data: 2 columns, output data: 3 columns):
create function FVUA001(varchar, varchar) returns record as $BODY$
    select row($1, $2, null)
$BODY$
language sql volatile
cost 100;
Even when a field in the input data is longer than the length declared in the table definition, the data is still inserted into the table successfully. So I wonder whether this is a PostgreSQL bug. Please help me solve this problem. Thanks a lot.
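A possible workaround, sketched below without having verified pg_bulkload's internal write path: a filter declared as `returns record` yields columns of unconstrained type, so the varchar(n) coercion that a plain INSERT would apply may never run. Casting each output column to the table column's declared type inside the filter restores a length bound. The varchar(10) limits and table layout here are hypothetical, stand-ins for the real schema:

```sql
-- Hypothetical target table the filter feeds:
-- create table target_tbl (col1 varchar(10), col2 varchar(10), col3 varchar(10));

-- Cast each output column to the declared column type so the length
-- bound is applied inside the filter itself.
create function FVUA001(varchar, varchar) returns record as $BODY$
    select row($1::varchar(10), $2::varchar(10), null::varchar(10))
$BODY$
language sql volatile
cost 100;
```

Note one caveat: an explicit cast to varchar(10) silently truncates overlong values to 10 characters; PostgreSQL raises "value too long for type character varying(10)" only on assignment casts, i.e. when a value is stored through a normal INSERT or UPDATE. So this sketch prevents overlong data from landing in the table, but by truncation rather than by rejecting the row.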