From: 高健 <luckyjackgao(at)gmail(dot)com>
To: Jeff Janes <jeff(dot)janes(at)gmail(dot)com>
Cc: pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: Re: My Experiment of PG crash when dealing with huge amount of data
Date: 2013-09-02 01:25:43
Message-ID: CAL454F0wXbvXSmiV7qm0dvGtArbR0jp2yrgjMi1uzqm8AE0eig@mail.gmail.com
Lists: pgsql-general
>To spare memory, you would want to use something like:
>insert into test01 select generate_series,
>repeat(chr(int4(random()*26)+65),1024) from
>generate_series(1,2457600);
Thanks a lot!
What I am worrying about is this: if the data grows rapidly, our customer's server may use too much memory. Is the ulimit command a good way to limit memory for PG?
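For reference, this is roughly what I have in mind (a sketch only; the 4 GiB value and the data directory path are just illustrations, not recommendations):

```shell
# Sketch: ulimit limits are per process and are inherited by child
# processes, so a limit set in the shell that launches the postmaster
# would apply to every backend it forks.
# 4194304 KiB (= 4 GiB) is an arbitrary illustrative value.
ulimit -v 4194304            # cap virtual memory for this shell and its children
ulimit -v                    # prints 4194304

# The server started from this shell would then inherit the cap
# ("$PGDATA" is a placeholder for the real data directory):
# pg_ctl -D "$PGDATA" start
```

My doubt is whether PostgreSQL behaves well when a backend hits such a limit, or whether it is better to rely on work_mem and related settings instead.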
Best Regards
2013/9/1 Jeff Janes <jeff(dot)janes(at)gmail(dot)com>
> On Fri, Aug 30, 2013 at 2:10 AM, 高健 <luckyjackgao(at)gmail(dot)com> wrote:
> >
> >
> > postgres=# insert into test01 values(generate_series(1,2457600),repeat(
> > chr(int4(random()*26)+65),1024));
>
> The construct "values (srf1,srf2)" will generate its entire result set
> in memory up front, it will not "stream" its results to the insert
> statement on the fly.
>
> To spare memory, you would want to use something like:
>
> insert into test01 select generate_series,
> repeat(chr(int4(random()*26)+65),1024) from
> generate_series(1,2457600);
>
> Cheers,
>
> Jeff
>