Tags: optimization, premature-optimization

In terms of today's technology, are these meaningful concerns about data size?


We're adding extra login information to an existing database record, on the order of 3.85 KB per login.

There are two concerns about this:

1) Is this too much on-the-wire data added per login?

2) Is this too much extra data we're storing in the database per login?

Given today's technology, are these valid concerns?

Background:

We don't have concrete usage figures, but we average about 5,000 logins per month. We hope to scale to larger customers; however, that would still mean tens of thousands of logins per month, not thousands per second.

In the US (our market), broadband has 60% market adoption.
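To put the on-the-wire concern in perspective, here is a rough back-of-envelope sketch using the figures above; the 56 kbps dial-up rate is an assumed worst case for the non-broadband segment, not a measured value.

```python
# Worst-case transfer time for the extra 3.85 KB on a slow (dial-up) connection,
# plus the total monthly transfer volume at the stated login rate.
EXTRA_KB_PER_LOGIN = 3.85      # extra data sent per login (from the question)
DIALUP_KBITS_PER_SEC = 56      # assumed worst case for non-broadband users
LOGINS_PER_MONTH = 5_000       # current average from above

extra_seconds = EXTRA_KB_PER_LOGIN * 8 / DIALUP_KBITS_PER_SEC
total_mb_per_month = LOGINS_PER_MONTH * EXTRA_KB_PER_LOGIN / 1024

print(f"Extra transfer time per login at 56 kbps: {extra_seconds:.2f} s")  # ~0.55 s
print(f"Total extra transfer per month: {total_mb_per_month:.0f} MB")      # ~19 MB
```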


Solution

  • Assuming you reach ~80,000 logins per month, you would be adding roughly 3.75 GB per YEAR to your database table (a quick check of this arithmetic follows the answer).

    If you are using a decent RDBMS like MySQL, PostgreSQL, SQL Server, Oracle, etc., this is a laughably small amount of data and traffic. After several years, you might want to start looking at archiving some of it. But by then, who knows what the application will look like?

    It's always important to consider how you are going to be querying this data, so that you don't run into performance bottlenecks. Without those details, I cannot comment very usefully on that aspect.

    But to answer your concerns: do not be concerned. Just always keep thinking ahead.
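As a sanity check on the growth estimate, here is a minimal sketch assuming the 3.85 KB per-login overhead from the question and the ~80,000 logins/month figure above; the exact result shifts slightly depending on whether you convert with 1000s or 1024s.

```python
# Rough yearly growth of the login table at the assumed scaled-up volume.
EXTRA_KB_PER_LOGIN = 3.85      # extra data stored per login (from the question)
LOGINS_PER_MONTH = 80_000      # assumed upper end of "tens of thousands"

mb_per_month = LOGINS_PER_MONTH * EXTRA_KB_PER_LOGIN / 1024
gb_per_year = mb_per_month * 12 / 1024

print(f"~{mb_per_month:.0f} MB per month")   # ~301 MB
print(f"~{gb_per_year:.1f} GB per year")     # ~3.5 GB, in line with the ~3.75 GB above
```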