We are running SonarQube version 6.7.1 with an Oracle database. The PROJECT_MEASURES table has grown to a huge number of records: 130,216,515.
What is the best way to keep this table clean? At the moment it is causing many job failures, with timeouts on the SonarQube side.
For example, between 12:15 and 12:30 today, 430,574 rows were inserted into that table and 1,300,848 were deleted.
As we suspected, the issue came from poor PROJECT_MEASURES performance. The steps we took to improve it:
- Added a new index, ANALYSIS_UUID_CUSTOM_IDX2, to the table
- Rebuilt the table's existing indexes afterward
- The DB cache was only 300 MB; we increased it to 4 GB, with 2 GB as the minimum allocation (the DB server has 16 GB of RAM)
- Redo log files were 300 MB; we increased them to 1 GB
- Increased the sequence cache from the default of 20 to 1000
- Shrank the PROJECT_MEASURES table with the COMPACT option
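For anyone facing the same problem, the steps above might look roughly like this in SQL. This is only a sketch: the indexed column, the index and sequence names, and the redo log group number are assumptions, not copied from our actual schema, so adjust them to your environment before running anything.

```sql
-- 1. New index on the table (column list is an assumption; match it to your slow queries)
CREATE INDEX analysis_uuid_custom_idx2 ON project_measures (analysis_uuid);

-- 2. Rebuild the table's existing indexes (repeat for each index; name is hypothetical)
ALTER INDEX project_measures_pk REBUILD;

-- 3. Grow the DB cache by raising the SGA target (takes effect after restart)
ALTER SYSTEM SET sga_target = 4G SCOPE = SPFILE;

-- 4. Larger redo logs: add new 1 GB groups, then drop the old 300 MB ones
ALTER DATABASE ADD LOGFILE GROUP 4 SIZE 1G;

-- 5. Bigger sequence cache (sequence name is an assumption)
ALTER SEQUENCE project_measures_seq CACHE 1000;

-- 6. Compact the table without yet releasing space to the tablespace;
--    SHRINK SPACE requires row movement to be enabled first
ALTER TABLE project_measures ENABLE ROW MOVEMENT;
ALTER TABLE project_measures SHRINK SPACE COMPACT;
```

The COMPACT option defragments the segment but keeps the high-water mark in place, so it can run during normal activity; a later plain SHRINK SPACE would actually return the space.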
After these changes, scans ran much faster and all builds passed the SonarQube stage.