I have a scheduled application that pulls data from another database table and dumps it into my main application's database table. The number of records in this table grows daily, and I expect it to keep growing, since the rows are transaction event data. The main application processes this data: it takes each record, performs the needed analysis, and marks the record as processed.
What kind of solution can I provide so that I can keep the size of the database down in the future?
How would you go about it in this situation?
From my observation of a few enterprise applications, one provides an option for the user to archive records "older than 60 days", etc., to a text file. I could provide an option to archive the processed records to a text file and delete them from the database; the text file could later be imported back if necessary. Is this a reasonable solution?
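To make the idea concrete, here is a minimal sketch of the archive-and-delete step I have in mind. The `events` table, its columns, and the cutoff value are all made up for illustration; I'm using Python's `sqlite3` and `csv` modules only as a stand-in for whatever database the application actually uses:

```python
import csv
import sqlite3

# Hypothetical schema: an "events" table with a payload, an event
# timestamp (ISO text), and a processed flag.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE events (
    id INTEGER PRIMARY KEY,
    payload TEXT,
    event_time TEXT,
    processed INTEGER)""")
conn.executemany(
    "INSERT INTO events (payload, event_time, processed) VALUES (?, ?, ?)",
    [("a", "2023-01-01", 1), ("b", "2023-05-01", 1), ("c", "2023-06-01", 0)])

def archive_processed(conn, cutoff, path):
    """Write processed rows older than cutoff to a CSV file,
    then delete them from the table."""
    rows = conn.execute(
        "SELECT id, payload, event_time FROM events "
        "WHERE processed = 1 AND event_time < ?", (cutoff,)).fetchall()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["id", "payload", "event_time"])
        writer.writerows(rows)
    conn.execute(
        "DELETE FROM events WHERE processed = 1 AND event_time < ?", (cutoff,))
    conn.commit()
    return len(rows)

archived = archive_processed(conn, "2023-06-01", "archive.csv")
```

The CSV could later be reloaded with `executemany` over `csv.reader` if the old rows are ever needed again.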
If you need to occasionally access that older data, then building a process to archive it to text and load it back from text is probably not a great solution. Hard drives are cheap.
You could aggregate the older data. For example, if the transaction data is currently at the millisecond grain but reports on older data only need it by the day, consider aggregating the data to "daily" as your archiving process. You may be able to collapse hundreds of thousands of rows into just a few for each day.
Also consider a good partitioning scheme that keeps the most recent transactions on one set of disks and moves archived data to other disks, ideally set up so that you can easily add new disks and create tables on them.