Tags: data-modeling, prediction, data-analysis, data-processing, reliability

Fitting a software reliability model to a large dataset


I am trying to apply an exponential SRGM to a large dataset of about 50,000 failure times. Fitting takes forever to run, and even the online tools crash because there are too many data points. Can anyone suggest how I can solve this problem and fit the exponential (Goel-Okumoto) model to obtain MLEs (maximum likelihood estimates)?


Solution

  • I learnt that one good way to do this is to transform the data into failure-counts format. I performed the failure-counts transformation using equal time intervals (one per year), which reduced my dataset to 28 points. At that length, any failure-count model can be fitted to the data to make predictions; a sketch of both steps is given below. The article based on this study is available at https://books.google.com/books?id=uYiRDgAAQBAJ&pg=PA244&lpg=PA244&dq=An+Open+Source+Tool+to+Support+the+Quantitative+Assessment+of+Cybersecurity.+In+Proc.+International+Conference+on+Cyber+Warfare+and+Security&source=bl&ots=gJX5I0b8eH&sig=fp-EDU0z8AR1ZCVvjgqxrb1WF0c&hl=en&sa=X&ved=0ahUKEwjj1K6N09nUAhXBRCYKHZUWDfAQ6AEIMDAA#v=onepage&q=An%20Open%20Source%20Tool%20to%20Support%20the%20Quantitative%20Assessment%20of%20Cybersecurity.%20In%20Proc.%20International%20Conference%20on%20Cyber%20Warfare%20and%20Security&f=false
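
Below is a minimal sketch of both steps in Python, assuming NumPy and SciPy are available. The `failure_times` array is synthetic placeholder data standing in for the real ~50,000 failure times, and the choice of 28 equal-width intervals, the optimizer (Nelder-Mead via `scipy.optimize.minimize`), and the starting values are all illustrative, not the answerer's exact procedure. The objective is the standard grouped-data NHPP log-likelihood with the Goel-Okumoto mean value function m(t) = a * (1 - exp(-b t)).

```python
import numpy as np
from scipy.optimize import minimize

# --- Step 1: transform raw failure times into failure counts per interval ---
# Placeholder: synthetic failure times standing in for the ~50,000 real ones.
rng = np.random.default_rng(0)
failure_times = np.sort(rng.exponential(scale=8.0, size=50_000))

n_intervals = 28                                   # mirrors the 28 points in the answer
edges = np.linspace(0.0, failure_times.max(), n_intervals + 1)
counts, _ = np.histogram(failure_times, bins=edges)
t = edges[1:]                                      # interval right endpoints

# --- Step 2: fit Goel-Okumoto m(t) = a * (1 - exp(-b t)) by grouped MLE ---
def neg_log_likelihood(params):
    a, b = params
    if a <= 0 or b <= 0:
        return np.inf                              # keep the search in the valid region
    m = a * (1.0 - np.exp(-b * edges))             # mean value function at each edge
    dm = np.diff(m)                                # expected failures in each interval
    if np.any(dm <= 0):
        return np.inf                              # guard against numerical underflow
    # Grouped NHPP log-likelihood, dropping the constant log(n_i!) term:
    #   sum_i n_i * log(dm_i)  -  m(t_k)
    return -(np.sum(counts * np.log(dm)) - m[-1])

# Heuristic starting values: a near the total count, b near 1 / observation span.
x0 = np.array([counts.sum(), 1.0 / t[-1]])
res = minimize(neg_log_likelihood, x0, method="Nelder-Mead")
a_hat, b_hat = res.x
print(f"MLE estimates: a = {a_hat:.1f}, b = {b_hat:.5f}")
```

The point of the transformation is that the likelihood only ever sees the 28 interval counts, so the cost of each optimization step no longer depends on the number of raw failure times. That is what makes the fit tractable where working with all 50,000 individual times was not.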