java, android, gpu, rgb, yuv

Android Camera Preview YUV format into RGB on the GPU


I copy-pasted some code I found on Stack Overflow to convert the default camera preview from YUV into RGB format and then uploaded it to OpenGL for processing. That worked fine; the issue is that most of the CPU time was spent converting the YUV images into RGB, and it became the bottleneck.

I want to upload the YUV image to the GPU and then convert it to RGB in a fragment shader. I took the same Java YUV-to-RGB function, which worked on the CPU, and tried to make it work on the GPU.

It turned out to be quite a nightmare, since there are several differences between doing calculations in Java and doing them on the GPU. First, the preview image arrives as a byte[] in Java, but bytes are signed, so there may be negative values.

In addition, the fragment shader normally deals with floating-point values in [0..1] instead of bytes.
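
To make those two mismatches concrete, here is a minimal sketch (the variable names are mine):

    // Java: a byte holding the sample value 200 reads back as -56;
    // masking with 0xff recovers the unsigned 0..255 value.
    byte raw = (byte) 200;
    int sample = raw & 0xff; // 200

    // GLSL: a GL_UNSIGNED_BYTE texel arrives normalized to [0..1], so
    // multiplying by 255.0 maps it back to the byte range, e.g.
    //     float y = texture2D(uTexture, vTexCoord).r * 255.0;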

I am sure this is solvable, and I almost solved it, but I spent a few hours trying to figure out what I was doing wrong and couldn't make it work.

Bottom line, I'm asking for someone to write this shader function, and preferably test it. For me it would be tedious monkey work, since I don't really understand why this conversion works the way it does, and I'm just trying to mimic the same function on the GPU.

This is very similar to the function I used in Java: Displaying YUV Image in Android

I did some of the work on the CPU, such as turning the 1.5*w*h bytes of the YUV frame into a w*h array of packed YUV pixels, as follows:

public static void decodeYUV420SP(int[] rgba, byte[] yuv420sp, int width,
        int height) {
    final int frameSize = width * height;

    for (int j = 0, yp = 0; j < height; j++) {
        // NV21: every pair of rows shares one row of interleaved V/U bytes.
        int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
        for (int i = 0; i < width; i++, yp++) {
            // Mask with 0xff: Java bytes are signed, the samples are not.
            int y = yuv420sp[yp] & 0xff;
            if ((i & 1) == 0) {
                v = yuv420sp[uvp++] & 0xff;
                u = yuv420sp[uvp++] & 0xff;
            }
            // Pack raw Y, U, V into one opaque RGBA int; no color
            // conversion happens here.
            rgba[yp] = 0xff000000 | (y << 16) | (u << 8) | v;
        }
    }
}
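
For context, here is a minimal sketch of how this repacking might be driven from the camera callback (the class name is mine; it assumes the default NV21 preview format and that decodeYUV420SP above is in scope):

    import android.hardware.Camera;

    public class PreviewRepacker implements Camera.PreviewCallback {
        private int[] rgba;

        @Override
        public void onPreviewFrame(byte[] data, Camera camera) {
            Camera.Size size = camera.getParameters().getPreviewSize();
            if (rgba == null || rgba.length != size.width * size.height) {
                rgba = new int[size.width * size.height];
            }
            // One packed YUV pixel per int after this call.
            decodeYUV420SP(rgba, data, size.width, size.height);
        }
    }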

The 0xff masks in decodeYUV420SP are there because Java bytes are signed while the camera samples are unsigned. I then loaded the rgba array into an OpenGL texture and tried to do the rest of the calculation on the GPU.
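
For completeness, a sketch of that upload (the method name is mine; it requires a current GLES 2.0 context on the calling thread):

    import java.nio.IntBuffer;

    import android.opengl.GLES20;

    static int uploadAsRgbaTexture(int[] rgba, int width, int height) {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
        // NEAREST + CLAMP_TO_EDGE keeps non-power-of-two preview sizes
        // legal under GLES 2.0.
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        // IntBuffer.wrap() hands the ints to GL in native byte order, so
        // on a little-endian device the low byte of each int (V, with the
        // packing above) lands in the texel's red channel.
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                width, height, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE,
                IntBuffer.wrap(rgba));
        return tex[0];
    }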

Any help would be appreciated...


Solution

  • I used this code, based on the formulas from Wikipedia, to calculate the conversion from YUV to RGB on the GPU:

    private static int convertYUVtoRGB(int y, int u, int v) {
        int r, g, b;

        // Integer form of the Wikipedia YUV->RGB formulas; u and v are
        // expected to already be centered around zero (value - 128).
        r = y + (int) (1.402f * v);
        g = y - (int) (0.344f * u + 0.714f * v);
        b = y + (int) (1.772f * u);
        // Saturate each channel to the 0..255 byte range.
        r = r > 255 ? 255 : r < 0 ? 0 : r;
        g = g > 255 ? 255 : g < 0 ? 0 : g;
        b = b > 255 ? 255 : b < 0 ? 0 : b;
        return 0xff000000 | (b << 16) | (g << 8) | r;
    }
    

    I converted the floats to the 0.0..255.0 range and then used the code above. The part done on the CPU was rearranging the original YUV pixels into a YUV matrix (also shown on Wikipedia). Basically I used the Wikipedia code and did the simplest float<->byte conversions to make it work out. Small mistakes, like adding 16 to Y or not adding 128 to U and V, give undesirable results, so you need to take care of them. But it wasn't a lot of work once I used the Wikipedia code as the base.
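
    For reference, here is a sketch of what the matching fragment shader can look like, written as a GLES 2.0 source string (the uniform and varying names are mine). It assumes the texture was packed and uploaded as in the question, so that on a little-endian device V arrives in .r, U in .g and Y in .b:

        private static final String YUV_TO_RGB_FRAGMENT_SHADER =
                "precision mediump float;\n" +
                "uniform sampler2D uTexture;\n" +
                "varying vec2 vTexCoord;\n" +
                "void main() {\n" +
                // Scale the [0..1] texel values back to the byte range and
                // center U and V around zero.
                "    vec4 texel = texture2D(uTexture, vTexCoord);\n" +
                "    float y = texel.b * 255.0;\n" +
                "    float u = texel.g * 255.0 - 128.0;\n" +
                "    float v = texel.r * 255.0 - 128.0;\n" +
                // Same arithmetic as convertYUVtoRGB(), with clamp() taking
                // the place of the ternary saturation.
                "    float r = y + 1.402 * v;\n" +
                "    float g = y - 0.344 * u - 0.714 * v;\n" +
                "    float b = y + 1.772 * u;\n" +
                "    gl_FragColor = clamp(vec4(r, g, b, 255.0) / 255.0, 0.0, 1.0);\n" +
                "}\n";

    With this in place, the only per-frame CPU work left is the repacking; all of the per-pixel color math runs in the shader.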