I have a list of URIs of images from what is essentially a WordPress site. I want a script that gets their file sizes (KB, MB, GB) using just the URIs. I don't have server access, and I need to add the sizes to a Google Sheet. This seems like the fastest way to do it, as there are over 5k images and attachments.
However, when I do this in Python:
>>> import requests
>>> response = requests.get("https://xxx.xxxxx.com/wp-content/uploads/2017/05/resources.png")
>>> len(response.content)
3232
I get 3232 bytes, but when I check in Chrome DevTools, it's 3.4 KB.
What is being added? Or is the image actually 3.4 KB and my script is only checking the content length?
Also, I don't want to rely on the Content-Length header, since some of the images may be large and served chunked; I want to be sure I'm getting the actual file size of each image.
What is a good way to go about this? I feel like there should be some minimal code or script I could run.
The value you are seeing (3.4 KB) includes network overhead such as the response headers.
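To convince yourself that len(response.content) really is the file size, you can compare it with the Content-Length header for the same request. This is just a rough sketch using the placeholder URL from your question; requests undoes chunked transfer encoding and gzip/deflate compression for you, so len(response.content) is the size of the actual image bytes you downloaded:

import requests

# Placeholder URL copied from the question; substitute one of your real image URIs.
url = "https://xxx.xxxxx.com/wp-content/uploads/2017/05/resources.png"

response = requests.get(url)

# Number of bytes in the response body after requests has undone any
# chunked transfer encoding or gzip/deflate compression.
body_size = len(response.content)

# What the server reported on the wire (may be missing, or smaller if the
# response was compressed).
reported = response.headers.get("Content-Length")

print("Downloaded body:", body_size, "bytes")
print("Content-Length header:", reported)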
As a side note, I am not sure which version of Chrome you are using, but the transfer size (including response headers) and the resource size (i.e. the file size) are displayed separately for me in the Network panel.
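If it helps, here is a rough sketch of the kind of batch script you could run over all 5k+ URIs and then import into your Google Sheet as a CSV. The urls.txt input file (one URI per line) and the image_sizes.csv output file are just placeholder names:

import csv
import requests

def human_size(num_bytes):
    # Format a byte count as B, KB, MB or GB (1 KB = 1024 bytes).
    size = float(num_bytes)
    for unit in ("B", "KB", "MB", "GB"):
        if size < 1024 or unit == "GB":
            return f"{size:.1f} {unit}"
        size /= 1024

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("image_sizes.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["url", "bytes", "size"])
    for url in urls:
        try:
            response = requests.get(url, timeout=30)
            response.raise_for_status()
            size = len(response.content)  # actual downloaded size in bytes
            writer.writerow([url, size, human_size(size)])
        except requests.RequestException as exc:
            writer.writerow([url, "", f"error: {exc}"])

Note that this downloads every image in full, which is the only way to be sure of the real size without trusting Content-Length, but for 5k+ files it will take a while and use a fair amount of bandwidth.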