Tags: algorithm, time-complexity

How to analyze time complexity of an algorithm with different running time


I am confused about how to analyze the time complexity of an algorithm, since the running time of an algorithm is represented by a single function of n, f(n).

For example: f(n) = 3n^2 + 5n + 1 = Θ(n^2)

But what if an algorithm has more than one possible f(n)? For example:

if statement 1 then
    for i = 1 to n do
        statement 2
        statement 3
        ...

If statement 1 is True, then f(n) = n * constant = Θ(n).
If statement 1 is False, then f(n) = constant = Θ(1).

How do I determine the running time of an algorithm with multiple f(n)?

I don't know how to analyze it, since big-O, big-Ω, and big-Θ are all based on a single function f(n).
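For concreteness, here is a small runnable Python version of the pseudocode above; the condition flag, the list of items, and the summing work are only illustrative placeholders for statement 1, 2, 3:

def algorithm(items, condition):
    # "condition" plays the role of statement 1;
    # the loop body stands in for statement 2, statement 3, ...
    total = 0
    if condition:
        for item in items:   # runs len(items) = n times in this branch -> Θ(n)
            total += item    # constant work per iteration
    return total             # if condition is False, only constant work is done -> Θ(1)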


Solution

  • For things like this you can still express the worst-case or best-case time complexity. For a better idea of how the algorithm performs on real-world data, you might also want to find the average-case time complexity (see the sketch below).
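As a rough illustration of this answer, the sketch below counts the constant-time statements the branching pseudocode executes; the input size n, the probability p that the condition holds, and the trial count are assumptions chosen only for the demo:

import random

def operation_count(n, condition):
    # number of constant-time statements executed by the branching pseudocode
    count = 1            # evaluating "statement 1"
    if condition:
        count += n       # the loop body runs n times
    return count

n, p, trials = 1000, 0.5, 10000
avg = sum(operation_count(n, random.random() < p) for _ in range(trials)) / trials

print("best case :", operation_count(n, False))  # Θ(1): condition is False
print("worst case:", operation_count(n, True))   # Θ(n): condition is True
print("average   :", avg)                        # about 1 + p*n under this model

Under this model the expected cost is roughly 1 + p*n, so for any fixed p > 0 the average case is still Θ(n); the best and worst cases only describe the two extremes, Θ(1) and Θ(n).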