I had this question on an exam:
I have a very slow computer which takes one second for binary search on an array with 1,000 (one thousand) elements. How long should I expect the computer to take for binary search on an array with 1,000,000 (one million) elements?
I have trouble understanding this: would it take longer to search through 1 million elements than through 1,000, or does it depend on the computer?
Binary search has O(log(N)) time complexity, so the completion time is

t = c * log(N, 10)

In the given case, for N = 1000 we know t, and thus we can find c:

1 = c * log(1000, 10)
1 = c * 3
c = 1/3

For N = 1000000 we have

t = 1/3 * log(1000000, 10) = 1/3 * 6 = 2

So we can expect that binary search within 1000000 items will be completed in 2 seconds.
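As a sanity check, here is a minimal Python sketch of the same calculation (the function name and the assumption that the measured second scales purely with log(N) are mine, not part of the question):

```python
import math

def expected_time(known_n, known_t, target_n):
    """Scale a measured binary-search time to a new input size,
    assuming running time is proportional to log(N)."""
    c = known_t / math.log10(known_n)   # constant factor from the known measurement
    return c * math.log10(target_n)     # predicted time for the new size

# 1 second for 1,000 elements -> predicted time for 1,000,000 elements
print(expected_time(1_000, 1.0, 1_000_000))  # 2.0
```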
Please note that O(log(N)) == O(log(N, m)), since log(N, m) == log(N, k) / log(m, k), which means that when working with O(log(N)) we are free to choose the logarithm base. In our case (1000 and 1000000) it is convenient to use base 10 (or 1000).
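To see that base-independence numerically, here is a tiny Python check (it relies only on math.log accepting an optional base argument); the predicted factor of 2 comes out the same whichever base we pick:

```python
import math

# The ratio t(1000000) / t(1000) predicted by the model is the same
# for every logarithm base, so the 2-second answer does not depend
# on which base we compute with.
for base in (2, 10, 1000):
    ratio = math.log(1_000_000, base) / math.log(1_000, base)
    print(base, ratio)  # prints 2.0 for each base
```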