I just don't get it: why is the time complexity O(n^2) instead of O(n*log n)? The inner loop increments j by 2 each time, so isn't it O(log n)?
#include <stdio.h>
#include <stdlib.h>

void f3(int n){
    int i, j, s = 100;
    int* ar = (int*)malloc(s * sizeof(int));
    for(i = 0; i < n; i++){          /* outer loop: n iterations */
        s = 0;
        for(j = 0; j < n; j += 2){   /* inner loop: n/2 iterations */
            s += j;
            printf("%d\n", s);
        }
    }
    free(ar);  /* freed once, after the loops; freeing inside the loop would double-free */
}
By incrementing by two rather than one, the inner loop runs n/2 times instead of n, so the total work is n * (n/2) = n^2/2 operations. With big-O notation you don't care about the constant factor, so it's still O(n^2). This is because big-O notation describes how the running time of an algorithm grows with the input size, not the exact operation count. A loop is O(log n) only when its counter grows multiplicatively (e.g., j *= 2), so the remaining range shrinks by a constant factor each step; adding a constant to j merely divides the iteration count by a constant.
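For contrast, here is a minimal sketch (the variable names are my own, not from your code) that counts how many iterations each stepping style actually performs for n = 1024:

#include <stdio.h>

int main(void){
    int n = 1024;
    int additive = 0, multiplicative = 0;
    int j;

    /* j += 2 still takes n/2 steps: a linear number of iterations */
    for(j = 0; j < n; j += 2)
        additive++;

    /* j *= 2 doubles j each step, finishing in about log2(n) steps */
    for(j = 1; j < n; j *= 2)
        multiplicative++;

    printf("n = %d: j += 2 ran %d times, j *= 2 ran %d times\n",
           n, additive, multiplicative);
    /* prints: n = 1024: j += 2 ran 512 times, j *= 2 ran 10 times */
    return 0;
}

Doubling the step size only halves the count (512 vs. 1024), while multiplying j collapses it to log2(1024) = 10; that multiplicative shrinking is what O(log n) requires.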