Tags: excel, azure-sql-database, high-volume

Reporting a huge volume of data


We run a SaaS product. One of our customers is an enterprise client: while other customers generate 8,000-10,000 rows of data for a 6-month bill report, this client generates about 300,000 rows, each with 30 columns. When they go to our reporting module and export their 6-month report, CPU jumps to 80%, our servers go unresponsive, and the entire process fails. Fetching the 300K rows from the SQL database (after joins etc.) takes 30 minutes, followed by 3 hours to write the 300K rows with 30 columns to Excel. My worry is that just one client can bring the entire site down: we have 40 reports, and if they start going back 6 months and exporting data, other users will have a terrible experience.

What is the best way to handle such large data exports in a SaaS product?

Tx


Solution

  • Replicate your production database into an "end-user sandbox" database and serve your end-users' reporting from there, where they can slow the DB to a grinding halt if they need to.

    Such large exports shouldn't be run against a production system.
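On top of pointing the export at a replica, it helps to stream the result set to the output file in fixed-size batches instead of materializing all 300K rows (and an Excel workbook) in memory at once. Below is a minimal sketch of that batching pattern; it assumes a DB-API connection and writes CSV rather than Excel, and the in-memory SQLite database, the `bill` table, and `stream_report` are illustrative stand-ins, not the asker's actual schema or stack.

```python
import csv
import sqlite3

BATCH_SIZE = 10_000  # rows fetched per round trip; tune to your memory budget

def stream_report(conn, query, out_path):
    """Stream query results to a CSV file in fixed-size batches,
    so the full result set is never held in memory at once."""
    cur = conn.cursor()
    cur.execute(query)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        # Header row from the cursor's column metadata.
        writer.writerow([col[0] for col in cur.description])
        while True:
            rows = cur.fetchmany(BATCH_SIZE)
            if not rows:
                break
            writer.writerows(rows)

# Demo: an in-memory SQLite database stands in for the reporting replica.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bill (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO bill VALUES (?, ?)",
                 [(i, i * 1.5) for i in range(25_000)])
stream_report(conn, "SELECT id, amount FROM bill", "report.csv")
```

The same pattern works against SQL Server with `pyodbc`, since `fetchmany` is part of the DB-API. CSV also sidesteps the 3-hour Excel write: generate CSV (which opens fine in Excel), and offer a true `.xlsx` only as a background job if customers insist on it.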