Tags: algorithm, asymptotic-complexity

You will be given a stream of integers


You will be given a stream of integers and an integer k for the window size; you will receive the stream's integers one by one. Whenever you receive an integer, you have to return the maximum among the last k integers, inclusive of the current entry.

The interviewer was expecting an O(N) time + O(1) average-case space solution for N queries. The integers are not given in an array; each time, only one integer is passed as input to your method.

I tried solving it but couldn't come up with an O(N) solution. Can anybody tell me how we can do it?


Solution

  • O(N) time is easy, but O(1) average space is impossible.

    Here's what we absolutely need to store. For any number x we've seen in the last k inputs, if we've seen a bigger number since x, we can forget about x, since we'll never need to return it or compare it to anything again. If we haven't seen a bigger number since x, we need to store x, since we might have to return it at some point. Thus, we need to store the biggest number in the last k items, then the biggest number seen after it, and so on, up to the current input. For example, if k = 5 and the window holds 5, 3, 4, 1, 2, we must remember 5, 4, and 2. In the worst case, the input is descending and we always need to store all of the last k inputs. In the average case (random input), we'll need to track O(log k) items at any time, since that's the expected number of such running maxima in a random window of k items; the peak memory usage, however, will be greater than this.

    The algorithm is to keep a deque of exactly the numbers we just said we need to store, in their natural descending order, along with when we saw them* (a code sketch follows the footnote below). When we receive an input, we pop everything lower than it from the right of the deque, then push the input on the right. We peek left, and if the item there has fallen out of the window, we pop it. Finally, we peek left again; the number we see is the sliding-window maximum.

    This algorithm processes each input in amortized constant time. The only non-constant step of processing an input is popping potentially the whole deque from the right, but since each input is pushed and popped at most once over the course of the algorithm, that's still amortized constant. Thus, the algorithm takes O(N) time to process all N inputs.

    *If N is ridiculously huge, we can keep track of the indices at which we saw things mod k to avoid overflow problems.
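
    Here's a minimal sketch of this approach in Python. The class name SlidingWindowMax and the method name next are illustrative assumptions, not any standard API:

        from collections import deque

        class SlidingWindowMax:
            """Sliding-window maximum over a stream, one integer at a time.

            A sketch of the monotonic-deque approach described above.
            """

            def __init__(self, k):
                self.k = k
                self.i = 0         # index of the next incoming integer
                self.dq = deque()  # (index, value) pairs, values descending

            def next(self, x):
                # Pop everything lower than (or equal to) x from the right:
                # those values can never be a window maximum again.
                while self.dq and self.dq[-1][1] <= x:
                    self.dq.pop()
                self.dq.append((self.i, x))
                # If the leftmost entry has aged out of the window, drop it.
                # At most one entry can expire per input, so one check suffices.
                # (Python ints never overflow; in a fixed-width language the
                # stored indices could be taken mod k, per the footnote above.)
                if self.dq[0][0] <= self.i - self.k:
                    self.dq.popleft()
                self.i += 1
                # The leftmost value is the maximum of the last k inputs.
                return self.dq[0][1]

        # Example: window size 3 over the stream 1, 3, -1, -3, 5, 3, 6, 7.
        swm = SlidingWindowMax(3)
        print([swm.next(v) for v in [1, 3, -1, -3, 5, 3, 6, 7]])
        # -> [1, 3, 3, 3, 5, 5, 6, 7]

    Storing (index, value) pairs makes the expiry check O(1), and popping on <= rather than < keeps only the newest of any equal values, which shrinks the deque without changing the returned maximums.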