I am working on some WebRTC stuff on Android and trying to figure out how VideoRendererGui.java works. Unfortunately, I have some trouble understanding how the following OpenGL code works:
private final String VERTEX_SHADER_STRING =
"varying vec2 interp_tc;\n" +
"attribute vec4 in_pos;\n" +
"attribute vec2 in_tc;\n" +
"\n" +
"void main() {\n" +
" gl_Position = in_pos;\n" +
" interp_tc = in_tc;\n" +
"}\n";
private final String YUV_FRAGMENT_SHADER_STRING =
"precision mediump float;\n" +
"varying vec2 interp_tc;\n" +
"\n" +
"uniform sampler2D y_tex;\n" +
"uniform sampler2D u_tex;\n" +
"uniform sampler2D v_tex;\n" +
"\n" +
"void main() {\n" +
// CSC according to http://www.fourcc.org/fccyvrgb.php
" float y = texture2D(y_tex, interp_tc).r - 0.0625;\n" +
" float u = texture2D(u_tex, interp_tc).r - 0.5;\n" +
" float v = texture2D(v_tex, interp_tc).r - 0.5;\n" +
" gl_FragColor = vec4(y + 1.403 * v, " +
" y - 0.344 * u - 0.714 * v, " +
" y + 1.77 * u, 1);\n" +
"}\n";
I was wondering whether the above code converts YUV video to RGB. If it does, does it work for all video resolutions? Here is the link to VideoRendererGui.java
It's using a fragment shader to perform the YUV-to-RGB conversion. The Y, U, and V planes are passed into the shader as three separate textures, sampled, and combined into the RGB values for the fragment color. You can see the underlying math on Wikipedia.
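For reference, here is a minimal sketch of how the Java side typically feeds those three sampler uniforms (the class and method names here are mine, not from VideoRendererGui.java): each I420 plane is uploaded as a single-channel GL_LUMINANCE texture on its own texture unit, which is why the shader reads each sample back from the .r channel.

import android.opengl.GLES20;
import java.nio.ByteBuffer;

public class YuvPlaneUploader {
    // One texture per plane: Y, U, V.
    private final int[] textureIds = new int[3];

    public YuvPlaneUploader() {
        GLES20.glGenTextures(3, textureIds, 0);
    }

    // Uploads one I420 frame as three GL_LUMINANCE textures and binds them
    // to the y_tex/u_tex/v_tex uniforms of the given shader program.
    // Assumes the program is already active via glUseProgram.
    // In I420, the U and V planes are half the width and height of Y.
    public void upload(int program, int width, int height,
                       ByteBuffer yPlane, ByteBuffer uPlane, ByteBuffer vPlane) {
        ByteBuffer[] planes = { yPlane, uPlane, vPlane };
        String[] samplerNames = { "y_tex", "u_tex", "v_tex" };
        for (int i = 0; i < 3; i++) {
            int w = (i == 0) ? width : width / 2;
            int h = (i == 0) ? height : height / 2;
            GLES20.glActiveTexture(GLES20.GL_TEXTURE0 + i);
            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureIds[i]);
            // One byte per pixel; the shader reads it from the .r channel.
            GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_LUMINANCE,
                w, h, 0, GLES20.GL_LUMINANCE, GLES20.GL_UNSIGNED_BYTE, planes[i]);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
            // Point the sampler uniform at texture unit i.
            GLES20.glUniform1i(
                GLES20.glGetUniformLocation(program, samplerNames[i]), i);
        }
    }
}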
The shader is sampling the textures, not performing a 1:1 pixel conversion, so any differences in resolution between input and output are handled automatically. The VideoRendererGui code doesn't seem to have any fixed expectations of frame size, so I'd expect it to work for arbitrary resolutions.
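To make the resolution independence concrete, here is a sketch of the usual full-screen quad setup (the array and method names are mine, not from the file): the positions fed to in_pos cover clip space, and the texture coordinates fed to in_tc run from 0 to 1 rather than over pixel indices, so the interpolated interp_tc samples the planes the same way whether the frame is 320x240 or 1920x1080.

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

public class FullScreenQuad {
    // Clip-space corners for a triangle strip covering the whole viewport
    // (fed to in_pos). Independent of the video frame size.
    public static final float[] POSITIONS = {
        -1.0f, -1.0f,   1.0f, -1.0f,   -1.0f, 1.0f,   1.0f, 1.0f,
    };
    // Normalized texture coordinates (fed to in_tc). The GPU interpolates
    // these across the quad to produce interp_tc for each fragment.
    public static final float[] TEX_COORDS = {
        0.0f, 1.0f,   1.0f, 1.0f,   0.0f, 0.0f,   1.0f, 0.0f,
    };

    // Wraps a float array in a direct buffer suitable for glVertexAttribPointer.
    public static FloatBuffer toFloatBuffer(float[] values) {
        FloatBuffer buffer = ByteBuffer.allocateDirect(values.length * 4)
            .order(ByteOrder.nativeOrder())
            .asFloatBuffer();
        buffer.put(values);
        buffer.position(0);
        return buffer;
    }
}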