Postgresql 9.6 and Big Data

From: Job <Job(at)colliniconsulting(dot)it>
To: "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org>
Subject: Postgresql 9.6 and Big Data
Date: 2016-12-02 08:19:58
Message-ID: 88EF58F000EC4B4684700C2AA3A73D7A08054EAEBAE5@W2008DC01.ColliniConsulting.lan
Lists: pgsql-general

Hello,

we are planning to store historical data in a PostgreSQL 9.6 table.

PostgreSQL's documented limits suggest that it can handle "big data": the maximum size of a single table is 32 TB.

We need to archive this data to generate reports and analysis views.
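For context, we are considering partitioning the table by date using inheritance, since 9.6 has no declarative partitioning. A rough sketch of what we have in mind (table and column names are just examples):

```sql
-- Parent table for historical data (hypothetical schema).
CREATE TABLE measurements (
    id         bigserial,
    logged_at  timestamptz NOT NULL,
    payload    jsonb
);

-- One child table per month; the CHECK constraint lets the planner
-- skip irrelevant partitions via constraint exclusion.
CREATE TABLE measurements_2016_12 (
    CHECK (logged_at >= '2016-12-01' AND logged_at < '2017-01-01')
) INHERITS (measurements);

CREATE INDEX ON measurements_2016_12 (logged_at);

-- Route inserts to the matching child table with a trigger.
CREATE OR REPLACE FUNCTION measurements_insert() RETURNS trigger AS $$
BEGIN
    IF NEW.logged_at >= '2016-12-01' AND NEW.logged_at < '2017-01-01' THEN
        INSERT INTO measurements_2016_12 VALUES (NEW.*);
    ELSE
        RAISE EXCEPTION 'no partition for %', NEW.logged_at;
    END IF;
    RETURN NULL;  -- row is not stored in the parent table
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER measurements_insert_trg
    BEFORE INSERT ON measurements
    FOR EACH ROW EXECUTE PROCEDURE measurements_insert();
```

Queries against the parent then touch only the partitions whose CHECK constraints match the WHERE clause (with constraint_exclusion enabled), which should matter a lot at multi-terabyte scale.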

Has anyone used PostgreSQL 9.6 with databases and tables larger than four or five terabytes?
What hardware did you use?

Were there any problems or bottlenecks with data that large?

We are also evaluating databases designed for big data, such as Cassandra or MongoDB, but I have personally been a devoted PostgreSQL user for about 10 years and would like to know how it performs with "big data".

Thank you!

Francesco
