Performance issues when the number of records is around 10 million

From: venu madhav <venutaurus539(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Performance issues when the number of records is around 10 million
Date: 2010-05-11 06:18:22
Message-ID: c7bd39d5-6f71-449b-a97b-2b8d7f9a7171@g5g2000pre.googlegroups.com
Lists: pgsql-general

Hi all,
        In my database application, I have a table whose row count can
reach 10 million, with insertions arriving at up to 100 rows per
second at peak times. I have configured Postgres to vacuum the
database on an hourly basis, and a CGI-based GUI front end displays
the data from this table.
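Concretely, the hourly job boils down to something like the statement
below (a sketch only; it assumes the job is driven by an external
scheduler rather than the autovacuum daemon, and that event, the table
queried further down, is the one being maintained):

VACUUM ANALYZE event;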
        When I try to fetch the last twenty records from the table,
the operation takes around 10-15 minutes to complete. This is the
query being used:

SELECT e.cid, timestamp, s.sig_class, s.sig_priority, s.sig_name,
       e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
       e.wifi_addr_2, e.view_status, bssid
FROM event e, signature s
WHERE s.sig_id = e.signature
  AND e.timestamp >= '1270449180'
  AND e.timestamp < '1273473180'
ORDER BY e.cid DESC
LIMIT 21 OFFSET 10539780;
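One alternative I have been considering is keyset pagination, which
would avoid scanning and discarding the roughly 10.5 million rows
implied by the OFFSET. The sketch below assumes cid is unique and
indexed (for example, as the primary key); :last_seen_cid is a
placeholder for the smallest cid returned by the previous page:

SELECT e.cid, timestamp, s.sig_class, s.sig_priority, s.sig_name,
       e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
       e.wifi_addr_2, e.view_status, bssid
FROM event e
JOIN signature s ON s.sig_id = e.signature
WHERE e.timestamp >= '1270449180'
  AND e.timestamp < '1273473180'
  AND e.cid < :last_seen_cid  -- smallest cid from the previous page
ORDER BY e.cid DESC
LIMIT 21;

Each page would then need an index scan of at most 21 rows on cid
instead of a sort plus a 10.5M-row offset, provided an index covers
cid. I do not know whether this is the right direction, hence the
question below.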

Can anyone suggest a better approach to improve this query's
performance? Please let me know if you need any further details.

Thank you,
Venu
