I was hoping that the program below would output 40,000 bytes (a 200 * 200 array of 8-bit values, i.e. 1 byte each), but it outputs 159800 when inserting 200. Why?
It becomes even more confusing when it outputs 119800 when inserting 10 (subArray[j][0]=10) and 79800 when inserting 1 (subArray[j][0]=1), even though it's always an 8-bit array.
If I change Uint8Array(1) to Uint16Array(1), the output values are exactly the same for the same numbers.
How do arrays work?
let x = 200;
let y = 200;
const changes = [];
for (let i = 0; i < y; i++) {
  const subArray = [];
  for (let j = 0; j < x; j++) {
    subArray[j] = new Uint8Array(1);
    subArray[j][0] = 200;
  }
  changes[i] = subArray;
}
console.log(new Blob(changes).size);
Your subArray is a plain array which happens to be filled with Uint8Array instances. If you change your code so that each subArray is instead a 200-byte typed array, you get the answer you expect:
let x = 200;
let y = 200;
const changes = [];
for (let i = 0; i < y; i++) {
  const subArray = new Uint8Array(200);
  for (let j = 0; j < x; j++) {
    subArray[j] = 200;
  }
  changes[i] = subArray;
}
console.log(new Blob(changes).size); // 40000
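As a side note, if the end goal is simply a 200 × 200 block of bytes, a single flat typed array gives the same size (a minimal sketch reusing x and y from above; flat is just an illustrative name, not anything Blob requires):
// One contiguous buffer of x * y bytes, filled with the value 200
const flat = new Uint8Array(x * y).fill(200);
console.log(new Blob([flat]).size); // 40000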
The spec doesn't say what happens if one of the "blob parts" is a plain array of typed arrays. It might be that the array gets converted to a string, but exactly how that would happen internally I can't say. (Edit: actually I can, I think; see below.)
What I'm going by is the spec, which spells out three possibilities for the "blob parts" parameter passed to the Blob constructor (changes in your code). Each part can be a BufferSource (an ArrayBuffer or a typed-array view onto one, such as your Uint8Array), a Blob, or a string (a USVString).
So no mention of "plain array filled with whatever".
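To see that per-part rule in action, you can hand the constructor the Uint8Array instances themselves instead of the plain arrays containing them, and each one is treated as a BufferSource of one byte. A small sketch against your original changes structure (the flat() call is just for the illustration, not something the spec does for you):
// changes here is the original structure: 200 plain arrays, each holding 200 Uint8Array(1) instances
// Flattening one level yields 40000 Uint8Array parts, each a valid BufferSource
console.log(new Blob(changes.flat()).size); // 40000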
Edit: if you do a little experiment, it's clear (well, pretty clear) that it's converting the plain array to a string. That entails converting each element in the array to a string and joining the results with commas. When you convert a Uint8Array holding a single byte whose value is 200, you get the string "200", 3 characters; initializing with 10 instead gives "10", and 1 gives "1". That explains (probably) why the different initialization values in your code give you different size answers: it's the difference in the size of the values as strings, not bytes. The numbers even line up: a row of 200 values of "200" joined by 199 commas is 799 characters, and 200 such rows make 159800 bytes; with "10" it's 599 * 200 = 119800, and with "1" it's 399 * 200 = 79800.
let z = new Uint8Array([1, 2, 3]);
console.log(z[1]); // 2, as a number
let y = [z];
console.log(y.toString()); // '1,2,3' as a string
y = [z, z];
console.log(y.toString()); // '1,2,3,1,2,3'
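If you want to double-check that arithmetic in code, here's a small sketch (it just rebuilds one row of the original structure per fill value and measures its string length; nothing here comes from the Blob spec itself):
for (const v of [200, 10, 1]) {
  // One row of the original code, as the string it gets converted to: "v,v,...,v" (200 values)
  const rowString = new Array(200).fill(new Uint8Array([v])).toString();
  // 200 rows, one byte per character (all ASCII)
  console.log(v, rowString.length, rowString.length * 200);
}
// Logs: 200 799 159800, then 10 599 119800, then 1 399 79800 (the sizes from the question)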