From: Ayden Gera <aydengera(at)gmail(dot)com>
To: pgsql-novice(at)lists(dot)postgresql(dot)org
Subject: PGDump question/issue
Date: 2025-04-24 21:50:57
Message-ID: CANYJdW+Zq=CUbyMLiVqQ3nipS4S=W_Jn6J_bY=49qw1EDbBwyg@mail.gmail.com
Lists: pgsql-novice
Hi,
Hoping someone may have a solution to this problem.
We get a daily pg_dump file (~3 GB) from our SaaS provider (for BI purposes).
It contains DROP TABLE IF EXISTS commands.
This file has no row-level security settings.
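The relevant part of the dump looks roughly like this (the table and columns are just examples, not the real schema):

    DROP TABLE IF EXISTS public.orders;
    CREATE TABLE public.orders (
        order_id integer,
        region   text,
        amount   numeric
    );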
We want to use the same file to populate Supabase and add row-level
security (RLS), but I believe the DROP TABLE will destroy the RLS each day,
and manually adding it back (unless maybe scripted) isn't an option.
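For reference, the kind of RLS we'd add in Supabase is roughly the following (table, column, and setting names are hypothetical); as I understand it, dropping a table removes its policies along with it:

    -- Enable RLS on the table
    ALTER TABLE public.orders ENABLE ROW LEVEL SECURITY;

    -- Example policy: each user only sees rows for their own region
    CREATE POLICY orders_region_policy ON public.orders
        FOR SELECT
        USING (region = current_setting('app.current_region'));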
We have an in-house PostgreSQL server we could also use to load the dump and
then produce our own data-only pg_dump.
But the other issue we have is that the source tables don't always have
unique keys, as far as we can tell, so to be safe and avoid duplicating
data, we would prefer to delete each table's entire contents before inserting.
Does anyone have suggestions on how best to automate the daily update of
data into the Supabase tables without losing any RLS we configure on those
tables?
Or, what commands should we run on our own PostgreSQL to produce a data
only/INSERT dump, plus commands to delete all data in all tables before loading it?
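Something along these lines is what I was imagining, if it's even the right approach (the connection strings and schema name are placeholders):

    # 1. Data-only dump from our in-house PostgreSQL
    pg_dump --data-only --format=plain \
        --dbname="postgresql://user@inhouse-host/bi_db" \
        --file=data_only.sql

    # 2. Empty every table in the target schema first
    #    (generates one TRUNCATE per table; CASCADE handles FK dependencies)
    psql "$SUPABASE_URL" -At -c \
      "SELECT 'TRUNCATE TABLE ' || quote_ident(schemaname) || '.' || quote_ident(tablename) || ' CASCADE;'
       FROM pg_tables WHERE schemaname = 'public';" \
      | psql "$SUPABASE_URL"

    # 3. Load the fresh data; the tables and their RLS policies stay in place
    psql "$SUPABASE_URL" -f data_only.sql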
I was also wondering if we could load the SaaS pg_dump into Supabase DB1 and
then stream the data to DB2 (prod), but I'm unclear whether we can, and
whether we risk duplicating data if we can't somehow clear the tables in
prod just before streaming.
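By "streaming" I was thinking of something like built-in logical replication (all names and connection details are hypothetical), though I assume the daily DROP TABLE on DB1 would interfere with the subscription:

    -- On Supabase DB1 (after loading the SaaS dump):
    CREATE PUBLICATION daily_pub FOR ALL TABLES;

    -- On DB2 (prod):
    CREATE SUBSCRIPTION daily_sub
        CONNECTION 'host=db1.example.com dbname=bi_db user=replicator'
        PUBLICATION daily_pub;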
Thanks in advance!