Consider the following pseudocode:
linked_list_node = ... // We have some linked list
while linked_list_node is not NULL // Iterate through it
    node_copy = CopyNode(linked_list_node) // Allocate a new node and copy into it, which is O(1) space
    ... // Do something with node_copy
    DeleteAndFree(node_copy) // Free the memory allocated at the start of this iteration
    linked_list_node = Next(linked_list_node) // Advance once
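To make the pattern concrete, here is a minimal C sketch of what I mean (the struct and helper names are just placeholders, not my real code):

#include <stdlib.h>
#include <string.h>

struct node {
    int value;
    struct node *next;
};

/* Allocate a fresh copy of one node: O(1) space per call. */
struct node *copy_node(const struct node *src) {
    struct node *copy = malloc(sizeof *copy);
    if (copy != NULL)
        memcpy(copy, src, sizeof *copy);
    return copy;
}

void traverse(struct node *head) {
    for (struct node *cur = head; cur != NULL; cur = cur->next) {
        struct node *node_copy = copy_node(cur); /* allocate one node */
        if (node_copy == NULL)
            return;
        /* ... do something with node_copy ... */
        free(node_copy);                         /* free it before the next iteration */
    }
}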
Let N be the size of our linked list.
At each iteration of the loop we use O(1) space, and the loop runs N times, so in total we allocate O(N) space over the course of the algorithm.
However, we never actually hold N nodes at the same time: each iteration allocates exactly one node, so, in that sense, we only use O(1) space. In other words, if our machine only had 1 byte of free memory, it could allocate and free that same byte over and over again and never run into a memory limit.
I found this question on Stack Overflow: What is O(1) space complexity?
From the accepted answer:
However, if let's say for some reason the algorithm needs to allocate 'N' pointers when traversing a list of size N, ..., then the algorithm is considered to have a space complexity of O(N)
It seems like my algorithm doesn't satisfy this condition, since I never actually use N different pointers at the same time, so it should be O(1) space. However, it does require N allocation operations, which could be why it is really O(N) space.
So, what is the space complexity of this algorithm, and why?
This would be O(1) space complexity. What matters is the peak amount of memory in use at any one time, not the total amount allocated over the run of the algorithm. At each step your usage increases by one node and then decreases by one node, a net gain of zero per element, so the peak never exceeds a constant. That said, this is kind of an odd solution: you could presumably replace it with an implementation that allocates a single node once and just copies each element into it. The two would be equivalent, and the latter is clearly O(1) space complexity.
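For illustration, here is a rough C sketch of that single-buffer variant (the struct and field names are assumptions, not taken from the question):

#include <stdlib.h>
#include <string.h>

struct node {
    int value;
    struct node *next;
};

void traverse(struct node *head) {
    /* One scratch node, allocated once up front and reused for every element. */
    struct node *scratch = malloc(sizeof *scratch);
    if (scratch == NULL)
        return;
    for (struct node *cur = head; cur != NULL; cur = cur->next) {
        memcpy(scratch, cur, sizeof *scratch);
        /* ... do something with scratch ... */
    }
    free(scratch);
}

Peak usage is one extra node at any point in time, exactly as in the allocate-and-free-per-iteration version from the question.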