I just got this algorithm from another post, but I need to know how I can calculate the time complexity of this algorithm. I am a student and don't know much about how to do it.
public static void getSum(int[] numbersArray, int starting, int sum)
{
    if (numbersArray.length == starting)
    {
        // Now we print the sum here
        System.out.println(sum);
        return;
    }
    int value = sum + numbersArray[starting];
    getSum(numbersArray, starting + 1, value);
    getSum(numbersArray, starting + 1, sum);
}
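For context, this method enumerates every subset sum of the array: at each index it branches once including the current element and once excluding it. A minimal runnable sketch of that behavior (the class name and sample array are my own, for illustration):

```java
public class SubsetSums {
    // Same method as above: prints the sum of every subset of numbersArray
    public static void getSum(int[] numbersArray, int starting, int sum) {
        if (numbersArray.length == starting) {
            System.out.println(sum);
            return;
        }
        int value = sum + numbersArray[starting];
        getSum(numbersArray, starting + 1, value); // include numbersArray[starting]
        getSum(numbersArray, starting + 1, sum);   // exclude numbersArray[starting]
    }

    public static void main(String[] args) {
        // For an array of length 2 there are 2^2 = 4 subsets,
        // so this prints four lines: 3, 1, 2, 0
        getSum(new int[]{1, 2}, 0, 0);
    }
}
```

An array of length n produces 2^n printed lines, which is the first hint that the running time is exponential.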
Suppose numbersArray has length 5. I'm assuming that someone is going to call this with starting = 0, i.e. getSum(numbersArray, 0, 0). Now:
This is going to call getSum(numbersArray, 1, x) twice. Each call of getSum(numbersArray, 1, x) will call getSum(numbersArray, 2, x) twice. Each call of getSum(numbersArray, 2, x) will call getSum(numbersArray, 3, x) twice. And so on. When we get to getSum(numbersArray, 5, x), the time is constant; let's call it baseTime.
Suppose we say T(n) is the time to execute getSum(numbersArray, n, x). Then T(5) is baseTime; T(4) = 2*T(5); T(3) = 2*T(4); and so on. This means that the time for the entire call, T(0) = 2*T(1) = 2*2*T(2) = 2*2*2*T(3) = 2*2*2*2*T(4) = 2*2*2*2*2*T(5) = 2^5 * baseTime. What this tells you is that the time for the algorithm is proportional to 2^m, where m is the length of the array. So the answer would be written as O(2^m).