Why has the capacity reduced to 10 instead of 8?
# Do not remove the below import statement
import sys

def list_details(lst):
    '''This function provides the capacity, size and space left in the list.
    You can invoke it to get the details of the list.'''
    # Number of elements that can be stored in the list
    print("Capacity:", (sys.getsizeof(lst) - 36) // 4)
    # Number of elements in the list
    print("Size:", len(lst))
    # Number of elements that can be accommodated in the space left
    print("Space Left:", ((sys.getsizeof(lst) - 36) - len(lst) * 4) // 4)
    # The formula changes based on the system architecture:
    #   (size - 36) / 4 for 32-bit machines and
    #   (size - 64) / 8 for 64-bit machines
    # 36, 64 - size of an empty list on that machine
    # 4, 8   - size of a single element in the list on that machine
marias_lst = []
print("Empty list created!!!")
print("List details:")
list_details(marias_lst)

for i in range(10):
    marias_lst.append(1)
print("List details after adding 10 elements:")
list_details(marias_lst)

for i in range(3):
    marias_lst.remove(1)
print("List details after removing 3 elements:")
list_details(marias_lst)
I am using the above program to understand how lists grow in Python. My doubt is this: when I add 1 element, the capacity rises to 4; when I add 5 elements, the capacity rises to 8; when I add 10 elements, the capacity rises to 16.
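For reference, a minimal sketch that reproduces that growth pattern (assuming the same 32-bit build as in the program above, i.e. a 36-byte empty list and 4-byte slots):

import sys

for n in (1, 5, 10):
    lst = []
    for _ in range(n):
        lst.append(1)
    # capacity = (total size - empty-list overhead) // per-slot size
    print(n, "element(s) -> capacity", (sys.getsizeof(lst) - 36) // 4)

On my interpreter this prints capacities 4, 8 and 16, matching the pattern described above.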
Now, when I remove 3 elements after adding 10 elements, I get the following output:
Empty list created!!!
List details:
Capacity: 0
Size: 0
Space Left: 0
List details after adding 10 elements:
Capacity: 16
Size: 10
Space Left: 6
List details after removing 3 elements:
Capacity: 10
Size: 7
Space Left: 3
Why isn't the capacity 8 and the space left 1?
**EDIT 1:** On a 32-bit Python interpreter, our list growth is as demonstrated below:
>>> import sys
>>> sys.getsizeof([])
36
>>> sys.getsizeof([1])
40
>>> lst = []
>>> lst.append(1)
>>> sys.getsizeof(lst)
52
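Working the capacities out from those sizes (36-byte empty list, 4-byte slots on this build):

>>> (40 - 36) // 4    # [1] built as a literal: exactly one slot, no overallocation
1
>>> (52 - 36) // 4    # append() over-allocated to 4 slots
4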
There's no reason to expect the capacity to be 8. There's also no reason to expect the capacity to be 10 again if you run this on a newer Python version or a different implementation (like PyPy). The fact that it happened to be 10 is an implementation detail you should not rely on and should not expect to remain unchanged.
The capacity happened to be 10 because remove-ing down to fewer elements than half the capacity triggered a shrinkage, and (for now, on modern CPython) the resize routine calculates the overallocation as
new_allocated = (newsize >> 3) + (newsize < 9 ? 3 : 6);
When newsize is 7, this produces a 3-element overallocation, for a new capacity of 10.
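If you want to trace the arithmetic, here is a small Python sketch of that shrink path. It mirrors the quoted C expression and the "less than half full" shrink trigger; it is a model of the calculation, not the interpreter's actual code:

def overallocation(newsize):
    # mirrors: (newsize >> 3) + (newsize < 9 ? 3 : 6)
    return (newsize >> 3) + (3 if newsize < 9 else 6)

allocated, size = 16, 10      # state right after the 10 appends
for _ in range(3):            # the three remove(1) calls
    size -= 1
    # the list is only reallocated once it drops below half full
    if not (allocated >= size >= allocated // 2):
        allocated = size + overallocation(size)

print("Capacity:", allocated, "Size:", size)   # Capacity: 10 Size: 7

Removing the first two elements keeps the list at least half full, so no reallocation happens; the third remove drops newsize to 7, which is below half of 16, so the list is reallocated to 7 + 3 = 10 slots.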