I'm attempting to produce a number from a set of bytes in an ArrayBuffer, using JavaScript in Google Chrome, to get at MP3 tag information. The ID3v2 specification states that to get the tag size you must take 4 bytes at a certain location and compute the integer value from them, except:
The ID3v2 tag size is encoded with four bytes where the most significant bit (bit 7) is set to zero in every byte, making a total of 28 bits. The zeroed bits are ignored, so a 257 bytes long tag is represented as $00 00 02 01.
The naive way to do this seems to be to go through each byte, read each bit, and produce a new 4-byte value from the low 7 bits of each of the original 4 bytes. Say, for example, we have these 4 original bytes:
0111 1111 0111 1111 0111 1111 0111 1111
I create a new ArrayBuffer and loop through each bit to produce:
0000 1111 1111 1111 1111 1111 1111 1111
And then I calculate the integer value from this 32-bit sequence using a Uint32Array.
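In code, that bit-copying loop might look something like this (the function name `naiveTagSize` and the loop structure are just my own illustration of the approach):

```javascript
// A sketch of the bit-by-bit approach described above.
// `bytes` is assumed to be a Uint8Array holding the 4 size bytes.
function naiveTagSize(bytes) {
  const packed = new ArrayBuffer(4);
  const view = new Uint32Array(packed);
  for (let i = 0; i < 4; i++) {
    for (let bit = 0; bit < 7; bit++) {
      const sourceBit = (bytes[i] >> bit) & 1; // bit from the original byte
      const targetPos = (3 - i) * 7 + bit;     // position in the 28-bit result
      view[0] |= sourceBit << targetPos;
    }
  }
  return view[0];
}
```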
Is there an easier way to do this?
If you think about it, what you've got is a 4-digit base-128 number. Each of the bytes holds a single "digit", and each "digit" is a value between 0 and 127 (inclusive). Thus, to turn them into a usable number, you just multiply and add like you'd do with any other base: the least-significant "digit" is the "one's place" digit, the next one is the "128s", the next is the "16384s", and the most-significant digit is the "2097152s" place.
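For instance, the spec's own sample value of $00 00 02 01 works out like this:

```javascript
// The spec's example: $00 00 02 01 should decode to 257.
const size = 0x00 * 2097152 + 0x00 * 16384 + 0x02 * 128 + 0x01;
console.log(size); // 257
```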
I'm not sure exactly how to show this in code because I'm not really familiar with the new "ArrayBuffer" APIs; you use an "ArrayBufferView" or something to get access to the values, right? Well, assuming it's easy to get the individual bytes, it should be a very simple function to do the multiplies and additions.
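Something like this should work, assuming a Uint8Array view gives you byte-level access to the buffer (the names `buffer` and `offset` are just for illustration):

```javascript
// A minimal sketch: multiply-and-add the four 7-bit "digits",
// most significant first, exactly as described above.
function synchsafeToInt(buffer, offset) {
  const bytes = new Uint8Array(buffer, offset, 4); // one byte per "digit"
  return bytes[0] * 2097152   // 128 ** 3
       + bytes[1] * 16384     // 128 ** 2
       + bytes[2] * 128       // 128 ** 1
       + bytes[3];            // 128 ** 0
}
```

You could also write it with shifts, `(bytes[0] << 21) | (bytes[1] << 14) | (bytes[2] << 7) | bytes[3]`, which is the same multiply-and-add in disguise.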