Re: Million of rows

From: Vinicius Bernardi <vinicius(dot)bernardi(at)gmail(dot)com>
To: pgsql-performance(at)postgresql(dot)org
Subject: Re: Million of rows
Date: 2005-03-29 21:10:44
Lists: pgsql-performance
At the moment I have this system running on MySQL, and all the examples I have
are in MySQL, but the biggest client, which is starting now, will use
PostgreSQL, so I need a way to answer those questions in Postgres...
Ideas like TABLESPACES or other things...

Just looking for start ideas...


Vinicius Marques De Bernardi
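
For the tablespace idea mentioned above, a minimal sketch of putting the big history table on its own disk; the directory path, table name, and columns here are assumptions, not from the thread:

```sql
-- Hypothetical: a tablespace on a separate disk for the large history table
CREATE TABLESPACE history_space LOCATION '/mnt/fastdisk/pgdata';

-- Assumed table name and columns; create it on the new tablespace
CREATE TABLE vehicle_history (
    vehicle_id integer     NOT NULL,
    gps_time   timestamptz NOT NULL
    -- ... other columns ...
) TABLESPACE history_space;
```

Tablespaces only help if the new location is actually on separate, faster storage; they are not a performance feature by themselves.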

On Tue, 29 Mar 2005 12:08:15 -0700, Michael Fuhr <mike(at)fuhr(dot)org> wrote:
> On Tue, Mar 29, 2005 at 03:33:24PM -0300, Vinicius Bernardi wrote:
> >
> > But now the problem starts when I have to select history data about
> > these vehicles ( I store only 2 months ): something like 40 or 50
> > million rows for about 500 vehicles.
> >
> > Using the keys VEHICLE_ID and GPS_TIME, the performance is very low...
> Please post an example query and the EXPLAIN ANALYZE output.  The
> table definition might be useful too.
> > I need some ideas for better performance on this table
> Do you have indexes where you need them?  Do you cluster on any of
> the indexes?  Do you VACUUM and ANALYZE the database regularly?
> Have you investigated whether you need to increase the statistics
> on any columns?  Have you tuned postgresql.conf?  What version of
> PostgreSQL are you using?
> --
> Michael Fuhr
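
Michael's checklist above could look roughly like this in practice; the table name is assumed from the thread, and the CLUSTER ... USING form is modern syntax (older releases wrote CLUSTER indexname ON tablename):

```sql
-- Composite index matching the typical lookup pattern (assumed table name)
CREATE INDEX idx_history_vehicle_time
    ON vehicle_history (vehicle_id, gps_time);

-- Physically reorder the table by that index (takes an exclusive lock)
CLUSTER vehicle_history USING idx_history_vehicle_time;

-- Reclaim dead rows and refresh planner statistics
VACUUM ANALYZE vehicle_history;

-- Inspect the plan for a typical history query
EXPLAIN ANALYZE
SELECT *
FROM vehicle_history
WHERE vehicle_id = 42
  AND gps_time >= now() - interval '2 months';
```

If EXPLAIN ANALYZE still shows a sequential scan, the usual suspects are stale statistics, a too-low default_statistics_target on the filtered columns, or untuned postgresql.conf settings such as shared_buffers.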


Next: Karim Nassar, 2005-03-30 00:52:58, "VACUUM on duplicate DB gives FSM and total pages discrepancies"
Previous: Michael Fuhr, 2005-03-29 19:08:15, "Re: Million of rows"
