Users were able to run reports before 10 am. After that, the same reports became very slow; sometimes users just didn't have the patience to wait. After some troubleshooting I found the column that was causing the delay. It was a computed column that uses a function to produce its result.
At approximately the same time I got another complaint about a slow-running report that had always worked fine. After some troubleshooting I found the columns that were causing the delay:
where (Amount - PTD) <> 0
And again, the Amount column is a computed column.
So my questions are:
Why did computed columns that were always part of these reports all of a sudden start to slow down performance significantly, even when nobody is using the database?
What could really have happened at approximately 10 am?
And what are the disadvantages if I make those columns persisted?
Thank you
You don't provide a lot of detail here, so I can only answer in generalities.
So, in general - database performance tends to be determined by bottlenecks. A query might run fine on a table with 1 record, 10 records, 1,000 records, 100,000 records - and then at 100,001 records it suddenly gets slow. This is because you've exceeded some boundary in the system - for instance, the data doesn't fit in memory anymore.
It's really hard to identify those bottlenecks, and even harder to predict them - but keep an eye on perfmon, and see what your CPU, disk I/O and memory stats are doing.
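If you'd rather look from inside the database, and assuming this is SQL Server (your syntax suggests it is), the wait statistics DMV gives a similar picture of what the server spends most of its time waiting on - heavy PAGEIOLATCH_* waits, for example, usually mean the data no longer fits in memory:

    -- Top waits since the last service restart; the dominant wait types
    -- point at the bottleneck (disk, memory, CPU, locking, ...).
    SELECT TOP (10)
        wait_type,
        wait_time_ms,
        waiting_tasks_count
    FROM sys.dm_os_wait_stats
    WHERE wait_type NOT IN ('SLEEP_TASK', 'LAZYWRITER_SLEEP', 'WAITFOR',
                            'BROKER_TASK_STOP', 'SQLTRACE_BUFFER_FLUSH')
    ORDER BY wait_time_ms DESC;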
Computed columns are unlikely to be a problem in their own right - but using them in a WHERE clause (especially inside another calculation) is likely to be slow if you don't have an index on that column. In your example, you might create another computed column for (Amount - PTD) and create an index on that column too.
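As a sketch - the table and index names here are made up, so adjust them to your own schema:

    -- Add a computed column for the whole filter expression.
    -- PERSISTED stores the value in the row instead of recomputing it
    -- on every read; the trade-off is extra storage plus a small cost
    -- on every insert/update that touches Amount or PTD.
    ALTER TABLE dbo.Ledger
        ADD AmountLessPTD AS (Amount - PTD) PERSISTED;

    -- Index the new column so the report's filter can use a seek.
    CREATE NONCLUSTERED INDEX IX_Ledger_AmountLessPTD
        ON dbo.Ledger (AmountLessPTD);

The report's filter then becomes where AmountLessPTD <> 0, which the optimizer can answer from the index instead of computing the expression for every row.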