
RGB Video ADC Conversion Color Palettes


I'm trying to better understand analog-to-digital video conversion and was hoping for some direction. The way I understand it, a dedicated 10-bit ADC chip will read the voltage of the R, G, and B input pins, translate this to 10-bit RGB, and output these values in parallel across 30 pins (ignoring sync/clock pins, etc.). My question, however, is this: if you know the source only has 5 bits per color, (2^5)^3 = 32,768 colors, dumps this to analog RGB, and you are using a 10-bit ADC, will the ADC interpolate colors due to voltage variances and the increase from 5 to 10 bits, thus introducing unoriginal/unintended colors, or is the sampling from analog to digital truly so precise that the original source color palette will be preserved correctly?
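
For concreteness, here is a rough sketch of the ideal, noise-free case I have in mind. The helper names and the assumption that the 5-bit source's full-scale voltage exactly matches the ADC's reference are mine, not from any particular chip:

```python
# Ideal case: a 5-bit level is driven onto the analog line and read back
# by a perfect 10-bit ADC sharing the same full-scale voltage (assumed).

def dac_5bit(code, vref=1.0):
    """Nominal analog voltage for a 5-bit code (0..31)."""
    return code / 31 * vref

def adc_10bit(voltage, vref=1.0):
    """Ideal 10-bit conversion of a voltage in [0, vref]."""
    return round(voltage / vref * 1023)

# Each 5-bit level lands on a distinct 10-bit code (k * 33, since 1023/31 = 33),
# so in this idealized case no new colors appear -- the palette is re-labeled.
for k in range(32):
    print(k, adc_10bit(dac_5bit(k)))
```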


Solution

  • Most ADCs have a precision of about ±1 LSB, so the lowest bit will toggle randomly anyway. If you need it to be stable, either use oversampling at an increased sample rate, or use a 12-bit ADC; that one will have a toggling LSB as well, but bit 2 will probably be stable.

    Why only probably, you ask? Well, if your transmission line is noisy or badly coupled, it can introduce additional toggling in the LSB range, or even higher. In bad cases the noise can even corrupt your upper 5 bits of data.

    There might also be analog filters / ferrite beads / something else smoothing your signal, so you won't even see the actual "steps" on the analog side.

    So, you never know until you test it. Try looking at your signal with a scope; that might resolve some of your doubts. The sketch below plays out the LSB-toggling and oversampling points.
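
Here is a minimal simulation of that behavior. The noise level (roughly 1 LSB of a 10-bit converter) and the 4x oversampling factor are illustrative assumptions, not measurements of any real setup:

```python
import random

VREF = 1.0  # assumed full-scale voltage shared by the source and the ADC

def adc_10bit(voltage):
    """Ideal 10-bit quantizer; clamps to the 0..1023 code range."""
    code = round(voltage / VREF * 1023)
    return max(0, min(1023, code))

def sample(v_nominal, noise_rms=0.001, oversample=1):
    """Average `oversample` noisy conversions of one nominal voltage."""
    codes = [adc_10bit(v_nominal + random.gauss(0, noise_rms))
             for _ in range(oversample)]
    return sum(codes) / len(codes)

# A mid-scale 5-bit level (code 16 of 31) driven onto the analog line.
v = 16 / 31 * VREF

single = [sample(v) for _ in range(10)]                   # raw conversions: LSB toggles
averaged = [sample(v, oversample=4) for _ in range(10)]   # 4x oversampling steadies it

print("single conversions :", single)
print("4x oversampled     :", averaged)
```

With noise on the order of 1 LSB the raw codes jump between neighboring values, while averaging several conversions per pixel pulls the result back toward the nominal level; whether the toggling matters for your palette depends on whether you only keep the upper bits downstream.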