Suppose an analog audio signal is sampled 16,000 times per second, and each sample is quantized into one of 1024 levels. What would be the resulting bit rate of the PCM digital audio signal?
This is a question from the Top-Down Approach book. I answered it, but I just want to make sure my answer is correct.
My answer is:
1024 = 2^10
so PCM bit rate = 10 * 16,000 = 160,000 bps
Is that correct?
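As a quick sanity check of the arithmetic, here is a minimal sketch (variable names are just illustrative):

```python
import math

sample_rate = 16000   # samples per second (given in the question)
levels = 1024         # quantization levels (given)

bits_per_sample = int(math.log2(levels))   # 1024 = 2**10 -> 10 bits
bit_rate = bits_per_sample * sample_rate   # PCM bit rate in bps

print(bit_rate)  # 160000
```

This confirms 10 bits per sample and a 160,000 bps PCM bit rate.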
Software often makes a trade-off between time and space. Your answer is correct; however, software typically reads and writes data in storage units of bytes (8 bits). Since your answer works out to 10 bits per sample, your code would use two bytes (16 bits) per sample. The file consumption rate would then be 16 bits/sample * 16,000 samples/s = 256,000 bits per second (32,000 bytes per second). That is for mono; stereo would double it. Writing software that actually stores 10 bits per sample instead of 16 would shift this time/space trade-off in the direction of increased computational time (and code complexity) to save storage space.
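To illustrate the trade-off, here is a hypothetical sketch of packing 10-bit samples tightly instead of storing them byte-aligned (the function name and layout are my own invention, not from any standard codec):

```python
def pack_10bit(samples):
    """Pack 10-bit sample values tightly into bytes (no byte alignment)."""
    acc, nbits = 0, 0
    out = bytearray()
    for s in samples:
        acc = (acc << 10) | (s & 0x3FF)   # append 10 bits to the accumulator
        nbits += 10
        while nbits >= 8:                 # emit every full byte available
            nbits -= 8
            out.append((acc >> nbits) & 0xFF)
    if nbits:                             # zero-pad the final partial byte
        out.append((acc << (8 - nbits)) & 0xFF)
    return bytes(out)

one_second = [0] * 16000                  # one second of mono samples
packed = len(pack_10bit(one_second))      # 10 bits/sample -> 20,000 bytes
aligned = len(one_second) * 2             # 16 bits/sample -> 32,000 bytes
print(packed, aligned)
```

The bit-shifting loop is the extra computational cost (and complexity) you pay to get the file back down from 32,000 to 20,000 bytes per second.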