sql, performance, postgresql, database-performance

PostgreSQL performance issues


My question is about lookup performance in PostgreSQL. I have a table members that will hold at least 20,000,000 members, possibly more. Among them are some special members (about 10,000). I have logic that works on single members and is guaranteed to query 2-3 of these special members every time.

The question is: might I see a performance win by moving those 10,000 special members out into a table of their own, even though the rows are structurally identical? Would the lookup for them run much faster simply because that table has far fewer rows?

EDIT: The queries are as simple as possible: lookups by primary key and joins over foreign keys.

EDIT2: To simplify the question: is there a significant speed difference between a primary-key lookup on a table with 30,000,000+ rows and one on a table with 10,000 rows? Or is it all about proper indexes?

Thank you in advance.


Solution

  • I wouldn't bother with splitting the table at this stage, not until you have measurements that clearly indicate that the query for VIP members is a bottleneck (a quick EXPLAIN check is sketched below).

    If you later run into problems in this area, you'll have options to improve performance without needing to make code changes in your application.

    One of them is a separate index for VIP members: you can create a partial index that contains only the rows for VIP members. Since the cost of a B-tree lookup grows with the logarithm of the row count, this would cut the depth of each index traversal by a factor of roughly log(30,000,000)/log(10,000) ≈ 2 (see the partial-index sketch below).

    Another option is table inheritance, so that your application still thinks it is working with one table while the data is in fact split into two physical tables (see the inheritance sketch below).
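
    Before restructuring anything, measure what a primary-key lookup actually costs. A minimal check, assuming a members table with an id primary key (the table and column names here are placeholders):

    ```sql
    -- Measure the real cost of a single primary-key lookup.
    -- An "Index Scan using members_pkey" that finishes in well under
    -- a millisecond means B-tree depth is not your bottleneck.
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT *
    FROM members
    WHERE id = 42;
    ```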
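
    A minimal sketch of the partial-index option, assuming the special members are marked by a boolean is_vip column (that flag is an assumption; any condition that identifies the ~10,000 VIP rows works):

    ```sql
    -- Small index covering only the ~10,000 VIP rows.
    CREATE INDEX members_vip_idx
        ON members (id)
        WHERE is_vip;

    -- The planner can use the partial index when the query
    -- repeats the index's WHERE condition:
    SELECT *
    FROM members
    WHERE id = 42
      AND is_vip;
    ```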
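
    And a sketch of the inheritance option, again with a placeholder schema. Note that PostgreSQL child tables do not inherit indexes or primary-key constraints, so the child needs its own:

    ```sql
    -- Parent table; the application keeps querying "members" as before.
    CREATE TABLE members (
        id   bigint PRIMARY KEY,
        name text NOT NULL
    );

    -- Child table holding only the VIP rows.
    CREATE TABLE vip_members () INHERITS (members);

    -- The primary key is not inherited, so add one to the child.
    ALTER TABLE vip_members ADD PRIMARY KEY (id);

    -- Queries against the parent transparently include child rows:
    SELECT * FROM members WHERE id = 42;

    -- ONLY restricts the scan to the parent's own rows:
    SELECT * FROM ONLY members WHERE id = 42;
    ```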