Node's Buffer reads signed values, but how does it know which encoding to use? There are at least three ways to store a negative number in binary: two's complement, one's complement, and sign-magnitude.
For example, the 4-bit pattern 1111 could be -1 (two's complement), -0 (one's complement), or -7 (sign-magnitude).
How can Node know how to "unpack" the bits and produce the correct negative value? All three encodings LOOK the same in memory, and there's no way to configure this when the buffer is created.
Never mind. I just read the docs: Node interprets all signed integers as two's complement.
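
A quick sketch in the Node REPL that shows this, using the documented `readInt8`/`readUInt8` methods (the comments noting what the other encodings *would* give are my own arithmetic, not anything Node exposes):

```js
// Node's signed read methods assume two's complement.
const buf = Buffer.from([0xff]); // bit pattern 1111 1111

console.log(buf.readUInt8(0)); // 255  (raw unsigned value)
console.log(buf.readInt8(0));  // -1   (two's complement interpretation)

// For comparison, the same bits would mean:
//   one's complement: -0
//   sign-magnitude:   -127
```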
Hope this helps someone out in any case.